
NCES Cognitive, Pilot, and Field Test Studies System

Justification

OMB: 1850-0803



National Assessment of Educational Progress



Volume I

Supporting Statement



Request for Clearance for Cognitive Interview Study of Background Questions for Students, Teachers, and School Administrators

(Reading, Mathematics, Civics, Geography, U.S. History, and Technology and Engineering Literacy)


OMB# 1850-0803 v.57










September 1, 2011



Volume I: Supporting Statement




  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803). This generic clearance provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.



  2. Background and Study Rationale

As required by the National Assessment Governing Board (Governing Board), the National Center for Education Statistics (NCES) of the U.S. Department of Education will conduct nationwide assessments at grades 4, 8, and 12 for the National Assessment of Educational Progress (NAEP). In addition to assessing subject-area achievement, NAEP collects questionnaire data to provide context for the reporting and interpretation of assessment results. These questionnaire data come from three respondent types: students, teachers, and school administrators. NAEP questionnaires fulfill reporting requirements of federal legislation and provide context for reporting student performance.


Periodically, NCES adds, revises, or deletes questions from existing subject-specific background questionnaires. These modifications aim to improve questionnaire quality, replace or drop outdated questions, and collect data on new contextual factors that are expected to be associated with academic achievement. The questionnaires for 2014 Civics, 2014 Geography, 2014 U.S. History, 2015 Reading, and 2015 Mathematics have undergone a systematic review process that resulted in the addition and revision of several questionnaire items. These items have been reviewed by expert panels and NCES. In 2013, NCES will pilot test the new and revised subject-specific background questions with a large, national sample of students, teachers, and school administrators. Prior to the 2013 pilot, the new and revised items will undergo cognitive interview testing. Because these questionnaires will be administered on paper in 2014 or 2015 (as appropriate per subject), respondents in the cognitive interviews will also be presented the questionnaire items on paper.


In addition to updating the existing subject-specific background questionnaires, NCES is developing a new Technology and Engineering Literacy (TEL) assessment and questionnaire for an eighth-grade-only administration in 2014. The new TEL questionnaire items have been reviewed by expert panels and NCES. In 2013, NCES will pilot test these new items along with the aforementioned subject-specific background questionnaires. However, unlike the other questionnaires in this study, the student TEL assessment and questionnaire will be administered on computer in 2014. Consequently, student respondents in this study will be presented the TEL items on a computer. The items will be presented in an HTML preview, demonstrating the look and feel of the items as they will appear in the actual pilot administration.1 The school administrator questionnaire will be administered on paper, but, as with all NAEP school questionnaires, administrators will have the option to complete the questionnaire online.


This cognitive interview study investigates the cognitive processes that respondents use to answer survey questions. We are particularly interested in respondents’ ability to comprehend the question and provide a valid response. Our goal is to identify problems and limitations with the questions prior to the 2013 pilot. Early identification of such concerns, prior to administration to a large number of respondents, will increase the quality of the questionnaire by reducing potentially confusing language or improving response categories.


In cognitive interviews, an interviewer uses a structured protocol in a one-on-one interview combining two methods: think-aloud interviewing and verbal probing. With think-aloud interviewing, respondents are explicitly instructed to "think aloud" (i.e., describe what they are thinking) as they figure out their answers to the survey questions. The interviewer reads each question to the respondent and then records the cognitive processes that the respondent uses in arriving at an answer. With verbal probing, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the think-aloud process. These probes might include, for example, asking the respondent to rephrase the question in their own words or to assess whether the response categories are relevant. Refer to Volume II for the specific protocols that will be used in this study.


Cognitive interview studies are largely observational. Although the sample will include a mix of student characteristics, it does not explicitly measure differences by those characteristics. The largely qualitative data collected will be mainly verbal reports in response to probes or from think-aloud tasks, in addition to volunteered comments. The objective is to identify and correct problems of ambiguity or misunderstanding, or other difficulties respondents have answering questions. The result should be a set of questionnaires that are easier to understand and therefore less burdensome for respondents while also yielding more accurate information.



  3. Study Design and Context

Sampling Plan

Existing research and practice have failed to offer a methodological or practical consensus regarding the minimum or optimal sample size necessary to provide valid results for background question development.2 Nonetheless, a sample size of five to fifteen individuals has become the standard. Several researchers have confirmed the standard of five as the minimum number of participants per subgroup for analysis (i.e., cell) for the purposes of exploratory cognitive interviewing for question development.3 Other researchers have indicated that although a sample size of five per cell will likely identify major problems with a question, more is better, up to approximately fifteen per cell.4


The recommended sample sizes for this study are based on an assessment of the number of items and the complexity of each item. The new and revised items on existing subject-specific questionnaires are very similar in content and style to existing NAEP questionnaire items within the subjects being assessed; complexity is therefore minimal. Consequently, we will interview 10 students per cell and 5 teachers and 5 school administrators per cell. The cell (i.e., subgroup of analysis) for these existing subject-specific questionnaire items is grade level. Note that respondents will answer questions across the different subject areas being revised (i.e., Civics, Geography, U.S. History, Reading, and Mathematics). Therefore, to test these items adequately, there will be 55 cognitive interviews total across three respondent types (students, teachers, and school administrators) and three grade levels (4, 8, and 12), as summarized in Table 1 below.


Table 1. Sample Size: New and Revised Existing Subject-Specific Questionnaire Items

|                       | Grade 4 | Grade 8 | Grade 12 |
|-----------------------|---------|---------|----------|
| Students              | 10      | 10      | 10       |
| Teachers              | 5       | 5       | N/A      |
| School Administrators | 5       | 5       | 5        |

Note: There is no teacher questionnaire being administered at Grade 12 among the 2013 pilot questionnaires.
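As a quick arithmetic check, the cell counts in Table 1 can be tallied to confirm the 55-interview total stated above (a minimal sketch, not part of the study's actual tooling):

```python
# Per-cell interview counts mirroring Table 1; None marks a cell with
# no questionnaire (there is no Grade 12 teacher questionnaire).
sample = {
    "Students": {"Grade 4": 10, "Grade 8": 10, "Grade 12": 10},
    "Teachers": {"Grade 4": 5, "Grade 8": 5, "Grade 12": None},
    "School Administrators": {"Grade 4": 5, "Grade 8": 5, "Grade 12": 5},
}

total = sum(n for cells in sample.values() for n in cells.values() if n is not None)
print(total)  # 55
```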


TEL is a new questionnaire in an emerging field of inquiry. Therefore, we want to maximize identification of major problems. Consequently, we will interview 15 respondents per cell. Among schools, we do not anticipate any differences between potential school administrator subgroups (e.g., principals at large versus small schools) in being able to comprehend or answer the questions that will be developed. Therefore, we propose one cell for all school administrators. However, research suggests a socioeconomic status (SES) digital divide in access to technology, which might lead to differences between low- and mid-SES students in comprehending terminology or questions due to differences in familiarity with technology and engineering. Therefore, we propose two cells (i.e., low-SES and mid/high-SES) for eighth-grade students.


Because a large number of new items are being tested for the TEL student questionnaire (see Table 3), the items will be split into two halves, each administered to a separate set of students, so that every item is tested with 15 low-SES and 15 mid/high-SES eighth-grade students. Therefore, to test the TEL items adequately, there will be 75 cognitive interviews total across two respondent types (students and school administrators), as summarized in Table 2.


Table 2. Sample Size: New TEL Questionnaire Items

|                       | Per-Cell Sample Size | Cells | Interview Sets | Total Respondents |
|-----------------------|----------------------|-------|----------------|-------------------|
| Students              | 15                   | 2     | 2              | 60                |
| School Administrators | 15                   | 1     | 1              | 15                |

Note: There is no teacher questionnaire being administered in TEL.
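The TEL totals in Table 2 follow the same arithmetic: per-cell sample size times number of cells times interview sets (again a minimal sketch for checking the totals):

```python
# TEL interview totals from Table 2: per-cell size x cells x interview sets.
per_cell = 15

student_total = per_cell * 2 * 2  # two SES cells, two half-questionnaire sets
admin_total = per_cell * 1 * 1    # one cell, one interview set

print(student_total, admin_total, student_total + admin_total)  # 60 15 75
```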


Recruitment and Sample Characteristics: Existing Subject-Specific Questionnaire Items

Educational Testing Service (ETS), the NAEP contractor for question development, will initially contact (through its Data Collection Services group) schools across five Mid-Atlantic states to participate in this study (see Appendix G for the sample recruitment script). School administrators who agree to participate will provide interview space as well as recruit and identify individual students and teachers for interviews. ETS will recruit no more than two students or educators per grade per school.


Although direct recruiting of parents of student participants, teachers, or school administrators is not planned or expected for existing subject-specific cognitive interviews, we have included scripts and processes in the event that additional recruitment is necessary to achieve the required sample characteristics (see Appendices J, L, and N for the sample recruitment scripts).


ETS will confirm the interview date, time, and location with participating schools, guardians of the participating students (for students not recruited through schools), as well as individually recruited teachers and school administrators.


Teachers, school administrators, and the guardians of the participating students (or students themselves, if over 18 years old) will be sent consent forms in advance of the cognitive interviews (see Appendices A through E for consent forms). Participants will submit signed consent forms to the interviewer prior to the interview.


The recruiting process is designed to yield a sample of participants that meets the following criteria:

  • Recruit a study sample of 10 fourth-grade, 10 eighth-grade, and 10 twelfth-grade students with the following characteristics:

    • Mix of gender

    • Mix of race/ethnicity (Black, Asian, White, Hispanic)

    • Mix of socioeconomic background (low-SES, mid/high-SES)

  • Recruit a study sample of 5 fourth-grade and 5 eighth-grade teachers; 5 fourth-grade, 5 eighth-grade, and 5 twelfth-grade school administrators with the following characteristics:

    • School population that includes fourth-grade students, or eighth-grade students, or twelfth-grade students, as appropriate.

    • Mix of school sizes

    • Mix of school socioeconomic demographics



Recruitment and Sample Characteristics: TEL

EurekaFacts, a subcontractor to ETS, will recruit student participants for this study through its Washington, D.C. metropolitan area participant database and through contacts within organizations and groups that can serve as recruitment partners (e.g., parent-teacher organizations, community organizations) (see Appendices I and K for the sample recruitment scripts). EurekaFacts also uses on-site recruiting by distributing flyers/bookmarks, canvassing, and having representatives available to talk to parents, educators, and community organizers at appropriate local events, school fairs, etc. (see Appendix P for the sample flyers and the sample conversation script). Parents of students expressing interest in participation will be contacted for parental permission. At that time, parents will be screened using a pre-approved screener script (see Appendix K).


EurekaFacts will recruit school administrators for this study through its Washington, D.C. metropolitan area participant database and contacts within organizations and groups that can serve as recruitment partners (see Appendices M and O for the sample recruitment scripts). In addition, EurekaFacts will use referrals and in-person visits to schools (if necessary) to ensure a sample of schools with a diverse range of technology and engineering curriculum offerings. If needed, EurekaFacts will identify school administrators from targeted contact lists that may be purchased from reputable third-party vendors, such as Market Data Retrieval or the American Association of School Administrators.


EurekaFacts will confirm the interview date, time, and location with parents and school administrators (see Appendices K and O). Student interviews will take place at the EurekaFacts cognitive interview laboratory in Rockville, Maryland. School administrator interviews will also take place at the EurekaFacts cognitive interview laboratory or via phone.


School administrators and the guardians of the participating students will be sent consent forms in advance of the cognitive interviews (see Appendices B and E for consent forms) and will submit the signed forms to the interviewer prior to the interview being conducted.


The recruiting process is designed to yield a sample of participants that meets the following criteria:

  • Recruit a study sample of 60 eighth-grade students with the following characteristics:

    • Mix of gender

    • Mix of race/ethnicity (Black, Asian, White, Hispanic)

    • Mix of socioeconomic background (30 low-SES, 30 mid/high-SES)

  • Recruit a study sample of 15 school administrators with the following characteristics:

    • School population includes eighth-grade students

    • Mix of school sizes

    • Mix of school socioeconomic demographics

    • Mix of schools with regard to student participation rates in information technology and design/engineering courses



Question Overview and Estimated Interview Length

Interviewers will use questionnaires and interview protocols related to new and revised items across the aforementioned assessment areas (see Volume II). Questions to be tested in the study are similar in both content and question type to existing NAEP background questions. Student questionnaires collect information on students' demographic characteristics, classroom experiences, and educational support. Teacher questionnaires gather data on teacher training and instructional practices. School questionnaires gather information on school policies and characteristics.


Table 3 displays the number of background questions being evaluated. There are discrete items (multiple choice, single selection) and matrix items that include a single item stem and multiple sub-items. In the table, the number of items is presented, followed in parentheses by the total number of discrete items and sub-items (that make up parts of a matrix). Based on the number of items, associated item-specific probes, and age of participant, Table 3 also displays an estimate of interview length (in minutes) consistent with field experience from past studies, including the NAEP background cognitive interview studies conducted in February 2011. All existing subject-specific questionnaire interviews will be scheduled for no more than 60 minutes, and TEL questionnaire interviews will be scheduled for no more than 90 minutes.


Table 3. Description of Interviews by Respondent Type

| Interview                                  | Grade 4 # of Items | Grade 4 Length (min) | Grade 8 # of Items | Grade 8 Length (min) | Grade 12 # of Items | Grade 12 Length (min) |
|--------------------------------------------|--------------------|----------------------|--------------------|----------------------|---------------------|-----------------------|
| Student Existing-Subject                   | 8 (8)              | 45                   | 7 (31)             | 45                   | 2 (9)               | 20                    |
| Teacher Existing-Subject                   | 9 (48)             | 40                   | 9 (55)             | 45                   | —                   | —                     |
| Principal Existing-Subject                 | 16 (79)            | 55                   | 16 (79)            | 55                   | 7 (47)              | 35                    |
| TEL Student (each half of 31 (149) total)  | —                  | —                    | 15 (75)            | 85                   | —                   | —                     |
| TEL School                                 | —                  | —                    | 21 (102)           | 90                   | —                   | —                     |
Following the first few interviews, identified problems with the questions will be addressed by modifying the questions and then testing the modified questions with the remaining sample. This process, known as iterative testing, is a major strength of and standard practice in the field of cognitive interview testing.5 To respect respondent burden, we will not proceed with modification and re-testing of any question where doing so would increase respondent burden.


  4. Data Collection Process

ETS will conduct the existing subject-specific questionnaire interviews, while EurekaFacts (see section 5) will conduct the TEL interviews. Both ETS and EurekaFacts will ensure that qualified interviewers are available to conduct the interviews. Interviewers will be trained on the cognitive interviewing techniques of the protocol. Interview protocols are based on the generic protocol structure described in Volume II. The interviews will focus on how students, teachers, and school administrators answer questions about themselves, their classroom experiences, and their schools.


Cognitive Interview Process

Participants will first be welcomed, introduced to the interviewer and the observer (if an in-room observer is present), and told that they are there to help researchers understand how people answer survey questions.


Participants will be reassured that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573]. Interviewers will explain the think-aloud process and conduct a practice question, after which participants will answer the questions verbally.


Interviewers will use several different cognitive interviewing techniques, including general think-aloud and question-specific probes, observation, and debriefing questions. Volume II of this submission includes the cognitive interview protocols to be used in the study.


Units of Analysis

The key unit of analysis is the question. Questions will be analyzed across participants within grade and across grades, where applicable.


The types of data collected about the questions will include:

  • think-aloud verbal reports;

  • behavioral coding (e.g., errors in reading the question);

  • responses to generic probes;

  • responses to question-specific probes;

  • additional volunteered participant comments; and

  • debriefing questions.


A coding frame will be developed for the responses to think-aloud questions and other verbal reports. The frame will be designed to code the verbal responses and to identify problems associated with question comprehension, memory retrieval, judgment/estimation processes, and response processes. The draft coding frame will be modified and supplemented based on review of the initial cognitive interviews.


Analysis Plan

The general analytical approach will be to record interviewer notes and observable behaviors into spreadsheets and computer software to facilitate identification of questions with problems and the nature of those problems. Each type of data for a question will be examined both independently and in conjunction with question-specific features (e.g., sentence complexity, question type, and number of response choices) in order to determine whether a feature or an effect of a question is observed across multiple measures and/or across administrations of the question.
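To illustrate the kind of spreadsheet-style tally described above (a hypothetical sketch, not the contractors' actual software; the question IDs, problem codes, and flagging threshold are invented for illustration):

```python
from collections import Counter

# Hypothetical coded observations: (question ID, problem code) pairs
# recorded across participants during the interviews.
observations = [
    ("Q1", "comprehension"), ("Q1", "comprehension"), ("Q1", "retrieval"),
    ("Q2", "response_mapping"), ("Q3", "comprehension"),
]

def flag_questions(obs, threshold=2):
    """Return question IDs whose total coded problems meet the threshold."""
    counts = Counter(q for q, _ in obs)
    return sorted(q for q, n in counts.items() if n >= threshold)

print(flag_questions(observations))  # ['Q1']
```

A tally of this kind helps separate questions flagged repeatedly, across multiple measures, from the background noise of one-off difficulties.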


It is easy for problems that merit attention to be overwhelmed by the “background noise” of minor or idiosyncratic difficulties. Our analytical approach is designed to identify the questions that pose difficulties to respondents in ways that (1) appear to reduce the validity of the response, (2) can be predicted to occur at a significant rate in a larger population, or (3) have a logical basis for why the problem occurs. This approach will ensure that the data are analyzed thoroughly, enhancing the identification of problems with questions and supporting recommendations for fixing those problems.


A more detailed analysis plan will be developed as part of the project. Final reports of the study findings will be submitted to OMB along with the proposed questionnaires for pilot testing in 2013.


  5. Consultations Outside the Agency

Educational Testing Service (ETS)

ETS serves as the Item Development contractor on the NAEP project, developing cognitive and background items for NAEP assessments. ETS staff (both Research & Development and Assessment Development) will be involved in the recruitment, management, and conduct of the cognitive interviews for the existing subject-specific background questionnaire items, as well as the management of the cognitive interviews for the TEL questionnaire items.



EurekaFacts

EurekaFacts, LLC, is a well-regarded survey and business research firm located in Rockville, MD. EurekaFacts has significant experience in testing and improving data collection tools to ensure their reliability and validity for a wide range of federal government agencies. Additionally, EurekaFacts has a well-established track record of working as a subcontractor for ETS and successfully fulfilling all project specifications associated with testing of the NAEP TEL items currently in development. EurekaFacts has an in-house cognitive testing facility; access to and proficiency with specialized software used for cognitive and usability testing; and a staff of highly experienced cognitive interviewers, survey research methodologists, participant recruitment/field personnel, cognitive scientists, and usability experts who will conduct and manage all activities associated with the TEL background cognitive interview process.


  6. Assurance of Confidentiality

NCES has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the Education Sciences Reform Act of 2002 (20 U.S.C. §9573). This legislation ensures that security and confidentiality policies and procedures of all NCES studies, including the NAEP project, are in compliance with the Privacy Act of 1974 and its amendments, NCES confidentiality procedures, and the Department of Education ADP Security Manual.


Participation is voluntary. Written consent will be obtained from legal guardians of minor students, students age 18 or older, teachers, and school administrators before interviews are conducted (see Appendices A through E for legal guardian, adult student, and teacher/school personnel consent forms, respectively).


Participants will be assigned a unique identifier (ID), created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way. The consent forms, which include the participant name, will be separated from the participant interview files, kept in a locked cabinet for the duration of the study, and destroyed after the final report is submitted.


The interviews will be recorded. The only identification included on the files will be the participant ID. The recorded files will be secured for the duration of the study and will be destroyed after the final report is submitted.


  7. Justification for Sensitive Questions

Throughout the interview protocol development process, effort has been made to avoid asking for information that might be considered sensitive or offensive. Reviewers have identified and eliminated potential bias in questions. In addition, the background question development process includes sensitivity reviews before use in assessments.


  8. Estimate of Hourly Burden

The estimated burden for recruitment assumes attrition throughout the process. Initial contact and response are estimated at 3 minutes (0.05 hours). The follow-up phone call to screen participants and/or answer any questions the participants (or their parents) have is estimated at 9 minutes (0.15 hours) per participant. The follow-up to confirm participation is estimated at 3 minutes (0.05 hours). Completion of the consent form by parents of participating students and by school administrators is estimated at 5 minutes (0.083 hours). The distribution of consent forms by school administrators to parents of students and to teachers is estimated at 5 minutes (0.083 hours). The time for teachers and school administrators to complete their own interview consent form for existing subject-specific questionnaire items has been incorporated into their scheduled interview time. All existing subject-specific questionnaire interviews will be scheduled for no more than 60 minutes, and TEL questionnaire interviews will be scheduled for no more than 90 minutes.

Table 4. Estimate of Hourly Burden

| Respondent                                                      | Hours per respondent | Number of respondents | Total hours |
|-----------------------------------------------------------------|----------------------|-----------------------|-------------|
| Parent and Student Recruitment                                  |                      |                       |             |
| Initial contact                                                 | 0.05                 | 600                   | 30          |
| Follow-up via phone                                             | 0.15                 | 120                   | 18          |
| Confirmation                                                    | 0.05                 | 60                    | 3           |
| Complete consent form                                           | 0.083                | 90                    | 8           |
| School Administrator Recruitment (including Teacher Recruitment) |                     |                       |             |
| Initial contact                                                 | 0.05                 | 200                   | 10          |
| Follow-up via phone or email                                    | 0.15                 | 100                   | 15          |
| Confirmation                                                    | 0.05                 | 50                    | 3           |
| Distribute consent forms                                        | 0.083                | 70                    | 6           |
| Interviews                                                      |                      |                       |             |
| Grade 4 Students                                                | 1                    | 10                    | 10          |
| Grade 8 Students (ESS*)                                         | 1                    | 10                    | 10          |
| Grade 8 Students (TEL)                                          | 1.5                  | 60                    | 90          |
| Grade 12 Students                                               | 0.5                  | 10                    | 5           |
| Teachers                                                        | 1                    | 10                    | 10          |
| School Administrators (ESS)                                     | 1                    | 15                    | 15          |
| School Administrators (TEL)                                     | 1.5                  | 15                    | 23          |
| Total Burden Hours                                              |                      |                       | 256         |

* ESS = existing subject-specific
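The total in Table 4 follows from multiplying hours per respondent by the number of respondents for each row and rounding fractional products up (a sketch of the arithmetic; the grouping comments mirror the table):

```python
import math

# (hours per respondent, number of respondents) for each row of Table 4.
rows = [
    (0.05, 600), (0.15, 120), (0.05, 60), (0.083, 90),  # parent/student recruitment
    (0.05, 200), (0.15, 100), (0.05, 50), (0.083, 70),  # administrator recruitment
    (1, 10), (1, 10), (1.5, 60), (0.5, 10),             # student interviews
    (1, 10), (1, 15), (1.5, 15),                        # teacher/administrator interviews
]

# Round each product up, first rounding to 3 decimals to guard against
# floating-point noise (e.g., 0.05 * 600 is not exactly 30 in binary).
total = sum(math.ceil(round(h * n, 3)) for h, n in rows)
print(total)  # 256
```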


  9. Estimate of Costs for Recruiting and Paying Respondents

For student interviews taking place within schools, schools will receive a $25 gift card for each student interviewed and each school observer (e.g., teacher’s aide). It is up to the schools to determine what to do with the gift cards.


For students who are directly recruited (for the TEL portion of the cog lab study), each participating student will receive a $25 gift card in compensation for time and effort. In addition, we are offering a $25 gift card per parent to remunerate them for their time and to help offset the travel/transportation costs of bringing the participating student to and from the EurekaFacts or ETS cognitive laboratory site. Generic gift cards that can be used anywhere credit cards are accepted are the recommended incentive because low-income participants may not have bank accounts, and check-cashing outlets can be burdensome in terms of fees.


Each participating teacher and school administrator respondent will receive a $40 gift card to thank them for the additional efforts they provide with regard to making interview space available, identifying and recruiting individual students, coordinating outreach to parents to obtain signed consent forms, etc.



  10. Cost to Federal Government

The following table provides the overall project cost estimates:



Table 5. Estimate of Costs

| Activity | Provider | Estimated Cost |
|----------|----------|----------------|
| Design, preparation, and conduct of cognitive interviews (including recruitment, incentive costs, data collection, analysis, and reporting) for existing subject-specific background questions | ETS | $121,528 |
| Design, preparation, and conduct of student and school administrator cognitive interviews (including recruitment, incentive costs, data collection, analysis, and reporting) for TEL background questions | EurekaFacts | $176,735 |
| Total |  | $298,263 |



  11. Schedule

The following table provides the schedule of milestones and deliverables:



Table 6. Project Schedule

| Activity | Dates |
|----------|-------|
| Recruit participants | October–November 2011 |
| Data collection, preparation, and coding | October–December 2011 |
| Preliminary data analysis | December 2011–January 2012 |
| Preliminary study report | January 2012 |
| Final study report | May 2012 |




1. The system interface to be used during the pilot administration will not be ready for the cognitive interviews, but the HTML previews demonstrate item functionality, look, and feel.

2. See Almond, P. J., Cameto, R., Johnstone, C. J., Laitusis, C., Lazarus, S., Nagle, K., Parker, C. E., Roach, A. T., & Sato, E. (2009). White paper: Cognitive interview methods in reading test design and development for alternate assessments based on modified academic achievement standards (AA-MAS). Dover, NH: Measured Progress; Menlo Park, CA: SRI International.

3. See Van Someren, M. W., Barnard, Y. F., & Sandberg, J. A. C. (1994). The think-aloud method: A practical guide to modeling cognitive processes. San Diego, CA: Academic Press.

4. See Willis, G. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage.

5. See Willis, G. (2005), Cognitive Interviewing: A Tool for Improving Questionnaire Design, p. 7.

File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
File Title: Background Cog Lab OMB Submission V.1
Subject: NAEP BQ
Author: Donnell Butler