
National Center for Education Statistics

National Assessment of Educational Progress



Volume I

Supporting Statement



National Assessment of Educational Progress (NAEP)

2021 eNAEP Pretesting and Usability Study



OMB# 1850-0803 v.266







April 2020

Volume I Table of Contents

  1. Submittal-Related Information
  2. Background and Study Rationale
  3. Recruitment and Data Collection
  4. Consultations outside the agency
  5. Justification for Sensitive Questions
  6. Paying Respondents
  7. Assurance of Confidentiality
  8. Estimate of Hourly Burden
  9. Cost to federal government
  10. Schedule


Attachments


Volume II – Protocols and Questionnaires

Appendix A – Materials for Simulated Classroom Pretesting and Usability Testing

Appendix B – Materials for Field Trial Pretesting

Appendix C – Spanish Materials for Simulated Classroom Pretesting in Puerto Rico





  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (such as pilot tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments and procedures.

  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts, federally authorized by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622). NAEP is conducted by NCES, which is part of the Institute of Education Sciences within the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the different subject areas and to collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.

Over the last few years, NAEP has been transitioning to digitally based assessments (DBA) that are administered on tablets using a test-delivery system developed for NAEP, known as eNAEP. The eNAEP system was successfully used in the 2015 and 2016 pilot assessments.1 The first operational use of the eNAEP system was in conjunction with the 2017 NAEP assessments. Enhancements have been made after each administration of eNAEP to address issues identified in the field, to make the system more user-friendly, and to allow for the assessment of additional content and scenario-based task item types.

The traditional NAEP design assesses each student for 60 minutes of cognitive items in one subject. In 2021, NAEP will begin to transition to a design in which students take additional set(s) of cognitive items, sometimes from another subject area. This design will allow more information to be collected from each individual student, possibly reducing the overall number of students (and, thus, schools) required. In 2021, approximately 78% of students taking reading and mathematics will be assessed using the standard NAEP design of 60 cognitive minutes in one subject, and approximately 22% of students will be assessed using a new design of 90 cognitive minutes. The two designs will be administered in different schools. Given that 2021 is the initial transition year, the new design will be implemented only in reading and mathematics; U.S. history and civics will be assessed via the traditional design of 60 minutes of cognitive items in one subject. Based on this new design for 2021, the plans below describe how the pretesting studies will test one-subject or two-subject sessions.

eNAEP Pretesting and Usability Testing Rationale

This request is to conduct a real-world test of the eNAEP system with students, allowing the system to be tested in the manner that will be used in the national study to help identify platform system issues at this stage of the software development process. The rationale for this study is based on lessons learned and issues encountered by students in the field during the 2019 assessments. It is believed that students use and interact with the platform system differently than adult quality control (QC) testers. Therefore, including students as part of the platform pretesting and QC process should allow for issues to be identified and addressed prior to the operational use of the system.

Three series of eNAEP platform pretesting will be conducted with students from grades 4 and 8.2 The first series of pretesting, referred to in this submission as Simulated Classroom Pretesting, will be held in a simulated classroom, while the second series of pretesting, referred to in this submission as the Field Trial, will be held in real schools, in a live classroom environment. The third series, referred to in this submission as Usability Testing, will be held in a simulated classroom, similar to the Simulated Classroom Pretesting sessions, but featuring one-on-one interviews with individual student participants.

Simulated Classroom Pretesting

During the Simulated Classroom Pretesting, up to 5 pretesting events will take place at the end of eNAEP development stages, known as “builds,” with sessions lasting up to 90 minutes for a one-subject session and up to 135 minutes for a two-subject session. This portion of the study is designed to test the updated platform and administrative procedures, to identify any logistical challenges, and to address them prior to the 2021 NAEP administration (see Table 1 for the item content assignment matrix by grade). In Puerto Rico, students will participate in up to 2 pretesting events for mathematics in grades 4 and 8 only. Students will be recruited by ASPIRA, LLC’s coordinator through schools that ASPIRA, LLC already serves through other programs, or by reaching out to new schools in Carolina. The coordinator will arrange meetings with school personnel and distribute flyers at all schools visited. ASPIRA, LLC will be responsible for coordinating all logistical matters for each session, conducting the pretesting, requesting parental consent, and providing gift cards to participants.

Build 3 Event (tentative date: June 29-July 20, 2020; a possible alternative date is TBD). Two sessions:

  • 25 students in gr 4 2-block session

  • 25 students in gr 8 2-block session

Build 4 Event (tentative date: August 13-26, 2020). Five sessions:

  • 25 students in gr 4 2-block session

  • 25 students in gr 8 2-block session

  • 25 students in gr 8 2-block social studies session (US mainland only)

  • 40 students in gr 4 3-block session

  • 40 students in gr 8 3-block session

Field Trials

The Field Trial will be conducted in schools by NAEP field administration staff, replicating the actual testing conditions to the fullest extent possible. For the Field Trial study, 9 schools (four grade 4 schools; five grade 8 schools) will be recruited by NAEP State Coordinators. The trial will cover the cognitive blocks and questionnaires of the NAEP 2021 digitally based subjects, delivered on tablets to up to 170 grade 4 and up to 220 grade 8 students (see Table 1 for the Field Trial content assignment matrix). Teacher and school survey questionnaires will not be included as part of this study.


The procedures to be tested include:

  • Conducting pre-assessment activities such as contacting the school to confirm logistics;

  • Arriving on the day of the assessment with all necessary equipment;

  • Setting up, administering, and breaking down the assessment following standard NAEP procedures; and

  • Using the e-File and MyNAEP systems (see Appendix B) as would be done in an operational assessment.

Results from this study will not be publicly released and will be used to identify issues and generate solutions or workarounds in advance of the main NAEP 2021 administration.

Table 1. NAEP 2021 Simulated Classroom and Field Trial Testing Content Assignment Matrix by grade

Subject | Grade 4 | Grade 8
Mathematics Operational | X | X
Reading Operational | X | X
Social Studies Operational | - | X

Usability Testing

NCES is interested in studying the intuitiveness and ease of use of tutorials, interface tools, and item types being developed for future NAEP assessments. Examples of recently developed tools include an in-system protractor and an in-system bar graph creator. Tools like these provide support to the student during the assessment and allow greater flexibility in testing cognitive constructs. As part of the process of designing new tools and item types, it is important to study the user experience they provide. User testing with students is crucial for making design decisions that result in tools that are truly useful and intuitive to use. User testing can reveal usability problems that were unanticipated by developers and designers. In comparison to previous usability studies,3 this study will include a greater emphasis on new ways of interacting with items and new item types, as well as a tutorial.

This study will present students with prototypes of item types, tools, and a tutorial on touch-screen tablets. Students will be given assessment-related tasks to complete using the prototypes. Results will consist of task completion success rates using each of the control and information elements studied. These task completion success rates will be combined with qualitative information from ease-of-use surveys, exit questions, and facilitator observations to compile recommendations for modifications to items and tools. This is done to ensure that their implementation in upcoming assessments does not present barriers to construct validity. In addition, some of the information will be used to assess the usability of the hardware being used in the study. Screen size, keyboard, and trackpad responsiveness are just some of the hardware properties that affect usability. Information on the ease of use of the hardware will be used to inform future decisions regarding selection of appropriate systems for NAEP testing.

Volume I of this submittal describes the purpose, design, sampling, burden, cost, and schedule information for the study. Volume II provides examples of the types of tasks that will be included in the protocol as well as the types of survey questions that will be administered in the study. The appendices contain recruitment materials, notifications, usability testing scripts, and thank you documents.

  3. Recruitment and Data Collection

Recruitment and Sample Characteristics for the Simulated Classroom Pretesting in the mainland United States

For each round of mainland US pretesting, an NCES subcontractor for NAEP, EurekaFacts, will recruit a maximum of 50 students (25 grade 4 students and 25 grade 8 students for 2-block testing in math and reading) for session 1 and a maximum of 155 students per round (25 grade 4 and 25 grade 8 students for 2-block math and reading; 25 grade 8 students for 2-block social studies; 40 grade 4 and 40 grade 8 students for 3-block testing in math and reading), for up to 3 rounds of pretesting (up to 465 students in total). National and State assessment content will be tested in grades 4 and 8 (NAEP 2021 operational and pilot). The location for each event will be either the EurekaFacts facility in Rockville, Maryland or another facility in the DC area that EurekaFacts obtains.

EurekaFacts will recruit participants for the pretesting study from the District of Columbia, Maryland, Virginia, and West Virginia. Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics. Students will be recruited to obtain the following criteria:

  • A mix of race/ethnicity (Black, Asian, White, Hispanic, etc.),

  • A mix of socioeconomic background,

  • A mix of urban/suburban/rural areas,

  • A mix of students requiring accommodations, and

  • A mix of Spanish-speaking students.

While EurekaFacts will use various outreach methods (see Appendices A1-A6) to recruit students to participate, the bulk of the recruitment will be conducted by telephone and will be based on acquisition of targeted mailing lists containing residential addresses and landline telephone listings. EurekaFacts will also use a participant recruitment strategy that integrates multiple outreach methods and resources such as newspaper and internet ads, community organizations (e.g., Boys and Girls Clubs, Parent-Teacher Associations), and mass media recruitment (e.g., postings on the EurekaFacts website).

Parents of students under 18 years of age (see Appendices A5 and A6) will be screened to ensure that the recruited students meet the criteria for participation in the study (i.e., that the students are from the targeted demographic groups outlined above). When recruiting participants, EurekaFacts staff will speak to the parent/legal guardian of the interested minor before starting the screening process. During this communication, the parent/legal guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort as well as the activities that it entails. After confirming that a participant is qualified, willing, and available to participate in this study, he or she will receive a confirmation email/letter. Written, informed parental consent (see Appendix A9) will be obtained for all respondents who are interested in participating in the study. Shortly after each of the eNAEP builds is released, a pretesting event with students will be held in a simulated or actual classroom.

Each Simulated Classroom Pretesting session will last up to 90 minutes for a one-subject session and up to 135 minutes for a two-subject session, and will be structured as follows:

  • Each student will be asked to take the assessment under standard NAEP assessment conditions (up to 90 or 135 minutes). Westat is a NAEP contractor and will administer the session using standard procedures. Students will take the full assessment, including the tutorial, cognitive items and tasks, and the survey questionnaires.4

  • A group debrief (up to 10 minutes) will be conducted to solicit feedback from the students (see Volume II for the debriefing script).

As part of the assessment administration in all event sessions, students will take a set of survey questionnaires. The maximum time for the survey questionnaire component is 15 minutes (included in the 90- or 135-minute time estimated for each session). Students will take a “core” section regarding general student and contextual information and a subject-specific section. Volume II includes the library of possible student survey items to be administered.5 Not all of the items presented in Volume II will be administered in this Simulated Classroom Pretesting study. The number of items selected for each student will be appropriate to the time allocated. As the items for the 2021 NAEP administration are finalized throughout the development process, the final subset will be included in the eNAEP system for pretesting. As such, the earlier builds may include different items selected from the library in Volume II than the final build.



Recruitment and Sample Characteristics for the Simulated Classroom Pretesting in Puerto Rico

For each round of pretesting, an NCES subcontractor for NAEP, ASPIRA, LLC, will recruit a maximum of 50 students (25 grade 4 students and 25 grade 8 students for 2-block testing in math) for session 1 and a maximum of 130 students (25 grade 4 and 25 grade 8 students for 2-block math; 40 grade 4 and 40 grade 8 students for 3-block testing in math). The location for each event will be either ASPIRA, LLC’s facility in Carolina, Puerto Rico or another facility in the Carolina area that ASPIRA, LLC obtains.

ASPIRA, LLC will recruit participants for the pretesting study from the Carolina municipality. Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics. Students will be recruited to obtain the following criteria:

  • A mix of race/ethnicity,

  • A mix of socioeconomic background,

  • A mix of urban/suburban/rural areas, and

  • A mix of students requiring accommodations.



While ASPIRA, LLC will use various outreach methods (see Appendices C1-C6) to recruit students to participate, the bulk of the recruitment will be conducted by telephone and will be based on acquisition of targeted mailing lists containing residential addresses and landline telephone listings. ASPIRA, LLC will also use a participant recruitment strategy that integrates multiple outreach methods and resources such as newspaper and internet ads, other community organizations, and mass media recruitment (e.g., postings on ASPIRA, LLC’s website).

Parents of students under 18 years of age (see Appendix C6) will be screened to ensure that the recruited students meet the criteria for participation in the study (i.e., that the students are from the targeted demographic groups outlined above). When recruiting participants, ASPIRA, LLC’s staff will speak to the parent/legal guardian of the interested minor before starting the screening process. During this communication, the parent/legal guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort as well as the activities that it entails. After confirming that a participant is qualified, willing, and available to participate in this study, he or she will receive a confirmation email/letter and phone call. Written, informed parental consent (see Appendix C8) will be obtained for all respondents who are interested in participating in the study. Shortly after each of the eNAEP builds is released, a pretesting event with students will be held in a simulated or actual classroom.

Each Simulated Classroom Pretesting session will last up to 90 minutes for a one-subject (2-block) session and up to 135 minutes for a one-subject (3-block) session, and will be structured as follows:

  • Each student will be asked to take the assessment under standard NAEP assessment conditions (up to 90 or 135 minutes). Westat is a NAEP contractor who will administer the session using standard procedures. Students will take the full assessment, including the tutorial, cognitive items and tasks, and the survey questionnaires.6

  • A group debrief (up to 10 minutes) will be conducted to solicit feedback from the students (see Volume II for the debriefing script).

As part of the assessment administration in all event sessions, students will take a set of survey questionnaires. The maximum time for the survey questionnaire component is 15 minutes (included in the 90- or 135-minute time estimated for each session). Students will take a “core” section regarding general student and contextual information and a subject-specific section. Volume II includes the library of possible student survey items to be administered.7 Not all of the items presented in Volume II will be administered in this Simulated Classroom Pretesting study. The number of items selected for each student will be appropriate to the time allocated. As the items for the 2021 NAEP administration are finalized throughout the development process, the final subset will be included in the eNAEP system for pretesting. As such, the earlier builds may include different items selected from the library in Volume II than the final build.

Recruitment and Sample Characteristics for the Field Trial

States will be asked to participate in the study on a voluntary basis. NAEP State Coordinators (“Coordinators”) in volunteer states will recruit schools in their state that are not part of the main NAEP 2021 sample (see Appendix B2 for a sample letter from a Coordinator to a school principal). Coordinators will leverage relationships within the state, including the Principal and Teacher Panels, to contact schools and identify those willing to participate in the study. The Coordinator will forward the contact information for participating schools to Westat.

A total of nine schools will participate in the study. While the study will seek participation from schools in southern states with various demographic characteristics, including a mix of urban/suburban/rural locations and students with a mix of race/ethnicity and socioeconomic backgrounds, detailed sampling requirements will not be targeted because of the small number of schools that will be asked to participate in the study. Additionally, schools that are in relatively close proximity to each other within a state will be selected, to ensure that two field administration staff members are able to conduct the study for all schools in the state. Private schools will not be recruited for this study.

Students will be selected to participate in the study via standard NAEP procedures, using the e-filing system for sampling (see Appendix B5 for a step-by-step overview of the tasks completed as part of the e-filing process). Either the sampled State Education Agency (SEA) or school will submit a list of all potential student participants, from which a random sample will be drawn, and each school will be notified of the list of selected student participants; a minimal illustrative sketch of such a random draw appears after the table below. A summary of the targeted student sample is provided below:

Grade | Assessment Content | Number of Schools | Number of Subject Matters | Number of Sessions | Students Per Session | Average Number of Students Per School | Total Students Per Grade
4 | Reading/Mathematics 2-block | 1 | 2 | 1 | 50 | 50 | 50
4 | Reading/Mathematics 3-block | 3 | 2 | 1 | 40 | 120 | 120
8 | Reading/Mathematics 2-block | 1 | 2 | 1 | 50 | 50 | 50
8 | Reading/Mathematics 3-block | 3 | 2 | 1 | 40 | 120 | 120
8 | Social Studies | 1 | 2 | 1 | 50 | 50 | 50
Total | - | 9 | - | 5 | - | - | 390
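The random draw itself is carried out by NAEP’s sampling systems as part of the e-filing process described above. Purely as a minimal, hypothetical sketch (the roster contents, seed, and sample size below are invented for illustration), a simple random draw from a submitted student list could look like the following:

```python
# Hypothetical sketch of a simple random draw from a submitted roster.
# NAEP's actual e-filing and sampling procedures are more involved; the
# roster, seed, and sample size here are invented for illustration only.
import random

roster = [f"student_{i:03d}" for i in range(1, 201)]  # a school's list of 200 students
random.seed(2021)  # fixed seed so this illustrative draw is reproducible
sampled = random.sample(roster, k=50)  # e.g., one 50-student session (see table above)

print(f"Selected {len(sampled)} of {len(roster)} listed students")
```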


After the sample is drawn and schools are notified of selected students, the subsequent tasks to prepare for the assessment are completed by the school coordinator in the MyNAEP system, including:

  • Registering and Providing School Information

  • Submitting Student List/Sample

  • Reviewing and Verifying the List of Students Selected for NAEP

  • Completing SD/ELL Student Information

  • Notifying Parents

  • Updating Student List

  • Planning for Assessment Day and Encouraging Participation

  • Supporting Assessment Day Activities

See Appendix B6 for the full content of the MyNAEP system that will guide school coordinators through the preassessment tasks they will need to complete.

Each Field Trial session will last up to 170 minutes and will be structured as follows:

  • During each session, each student will be asked to take the assessment under standard NAEP assessment conditions (up to 160 minutes, including transition time and instructions). Westat will administer the session using standard procedures. Students will take the full assessment, including the tutorial, cognitive items and tasks, and the survey questionnaires.8

  • A group debrief (up to 10 minutes) will be conducted to solicit feedback from the students (see Volume II for the debriefing script).

Recruitment and Sample Characteristics for Usability Testing

EurekaFacts will recruit up to 74 participants from the greater Washington, DC/Baltimore metropolitan area using various outreach methods. These methods will include over-the-phone recruitment based on targeted mailing lists containing residential addresses and landline telephone listings, newspaper/internet ads, outreach to community organizations (e.g., Boys and Girls Clubs, Parent‐Teacher Associations), and mass media (e.g., postings on the EurekaFacts website). By using EurekaFacts for recruitment, ETS will be able to administer usability testing during school holidays, evenings, and weekends allowing for the total number of participants to be spread out over the course of the calendar year, rather than being confined to the school calendar.

When recruiting participants, EurekaFacts staff will first communicate with the parent/guardian of the interested minor. The parent/guardian will be informed of the objectives, purpose, and participation requirements of the study and the activities it entails. After confirming that a student is qualified, willing, and available to participate, a confirmation e‐mail/letter will be sent and informed parental/guardian consent for the minor’s participation will be obtained. Appendix A provides sample recruitment materials that will be used by EurekaFacts.9

While no demographic variables have been shown to affect outcomes in past usability studies, recruiters will make an effort to recruit a diverse sample of students in order to minimize systematic variance in the study sample. Though we strive to ensure that participants are as diverse as is practical, students chosen for the study will not be included or excluded based on demographic criteria. Given the potentially large number of interactions to be tested, up to 74 students will participate in user testing, spread across grades 4 and 8.

Pretesting Data Collection Process

Normal data collection will be enabled by the eNAEP systems, and any errors generated will be collected automatically by the system. Note that student responses will not be scored. In addition to the eNAEP systems recording information, administrators and observers from NCES, Westat, ETS, EurekaFacts (mainland US), and ASPIRA, LLC (Puerto Rico) will monitor the assessments and record notes detailing any issues encountered by the students, as well as what the students were doing at the time each issue occurred. In addition, observers may ask individual students for clarification of the actions they took prior to an issue or error occurring. Please see the protocols in Volume II for the specific questions that will be asked. Understanding and documenting what caused a system error is necessary in order to have enough information for staff to replicate the error and develop a fix for it.

The Simulated Classroom Pretesting sessions will be audio and/or video recorded to capture information regarding any student actions that resulted in system errors or issues.

Usability Study Data Collection Process

User testing will be conducted in several sessions over the course of the year, as part of an iterative process of design and testing of new and revised DBA tools, tutorial, and items developed over that period. DBA developers and designers will submit prototypes designed to test specific interactions, and these prototypes will be used in subsequent user testing groups. User testing data will be reported back to the developers and NCES as they are collected so that decisions regarding design modifications can be made in a timely manner. Modified features or items may then be included in a later user testing session to validate the usability of the changes.

A variety of subject areas will be included, not to test the subject content, but to test interactions that may be unique to items for that subject. For example, math items may be used to test an on-screen calculator or equation editor, as that subject area uses those two particular interactions. Reading items may be used to test different passage layouts and panel controls that are unique to reading items.

In addition to the multiple item types tested using prototypes, different participant groups may be tested using different touch-screen tablets, in order to test the impact of different hardware or operating systems on the usability and the interactions.

Each student will perform the study tasks during a one-on-one session with a facilitator. For some of the tasks, the facilitator will give instructions, such as “Imagine that you want to change the color of the tool bar up there [point] from black to white. Please show me how you would do that.” For other tasks, students will be instructed to follow the written instructions on the screen or to attend a tutorial. For most tasks, participants will be asked to explain what they are doing and why, as they perform the tasks.

User testing will take no more than 150 minutes per student. Students will be allowed to take breaks as needed. Screen capture software for user testing, such as Morae, may be used as appropriate to document on-screen activity for later analysis. Depending on the needs of the analysis, keyboard logging and clickstream recording may also be performed. Screen activity may be recorded but the participants themselves will not be recorded.

Students’ success or difficulty in completing assigned tasks will be analyzed to determine which information or control elements are missing or insufficient to allow successful completion of anticipated user tasks. While successful completion of tasks will be recorded, it is the tools and item types that are being evaluated rather than the students. All results will be used only to make recommendations regarding the design and development of tools and item types.

Results will be analyzed chiefly in terms of descriptive statistics detailing the distribution of success rates and subjective user ratings. An example finding would be: “40 percent of participants found the volume control without assistance.” This finding would then be used to determine that the volume control needs to be made more visible to users in order to be used successfully by 100 percent of the students. Other statistical comparisons may be performed as appropriate to the variables and populations.
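To make this tabulation concrete, the sketch below computes per-task success rates from one-on-one session records. The task name and outcome data are invented for illustration; only the method reflects the analysis described above.

```python
# Hypothetical illustration of the descriptive tabulation described above.
# Task names and outcomes are invented; only the method mirrors the text.
from collections import defaultdict

# (participant_id, task, completed_without_assistance)
observations = [
    (1, "find volume control", True),
    (2, "find volume control", False),
    (3, "find volume control", False),
    (4, "find volume control", False),
    (5, "find volume control", True),
]

totals = defaultdict(int)
successes = defaultdict(int)
for _, task, succeeded in observations:
    totals[task] += 1
    successes[task] += succeeded  # True counts as 1, False as 0

for task, n in totals.items():
    rate = 100 * successes[task] / n
    print(f"{rate:.0f} percent of participants completed: {task}")  # 40 percent here
```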

  4. Consultations outside the agency

Westat is the Sampling and Data Collection contractor for NAEP. Westat will provide the tablets for the students’ use and carry out the pretesting study.

ETS serves as the NAEP Platform Development, Planning and Coordination, Item Development, and Design, Analysis, and Reporting contractor. ETS is responsible for the development and ongoing support of NAEP DBAs for NCES, including the system to be used for the Simulated Classroom Pretesting study. ETS will be onsite to assist Westat in the administration of the study, and ETS staff may assist in administering and/or observing some sessions.

EurekaFacts is located in Rockville, Maryland. It is an established for-profit research and consulting firm, offering facilities, tools, and staff to collect and analyze both qualitative and quantitative data. EurekaFacts is working as a subcontractor for ETS to recruit participants and provide the facilities to be used for the study. In addition, EurekaFacts staff may assist in administering and/or observing some sessions.

ASPIRA, LLC is located in Carolina, Puerto Rico. It is an established not-for-profit education and consulting organization, offering facilities, tools, and staff to collect data. ASPIRA, LLC is working as a subcontractor for ETS to recruit participants and provide the facilities to be used for the study. In addition, ASPIRA, LLC staff may assist in administering and/or observing some sessions.

Pearson is the Materials Preparation, Distribution, Processing and Scoring (MDPS) contractor. Pearson will print student login cards.

  5. Justification for Sensitive Questions

Throughout the item and debriefing question development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.

  6. Paying Respondents

To encourage participation in the Simulated Classroom Pretesting or Usability Testing events and to thank students for their time and effort, gift cards will be obtained from a major credit card company. Each student will receive a $35 gift card for participating in the Simulated Classroom Pretesting or a $25 gift card for participating in the Usability Testing. If a parent or legal guardian brings their child to and from the testing site, he or she will also receive a $25 gift card to thank him/her for the time and effort spent transporting the child.

The schools that participate in the Field Trial will each receive a $200 gift card to an office supply store (e.g., Staples or Office Depot) to encourage their participation and to thank them for their time and effort. The study will take place during regular school hours, and thus there will not be any monetary incentive for the student participants, although students will get to keep the NAEP earbuds they will use to participate in eNAEP.

  7. Assurance of Confidentiality

Authorization and Confidentiality Assurance Language

National Center for Education Statistics (NCES) is authorized to conduct NAEP by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622) and to collect students’ education records from education agencies or institutions for the purposes of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35). The information [you/your child/each student - as applicable] provide[s] will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of the Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection and other applicable Federal laws, [your/your child’s/each student’s - as applicable] responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every NCES agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of $250,000, or both if he or she willfully discloses ANY identifiable information about [you/your child/any student - as applicable]. Electronic submission of [your/your child’s/each student’s - as applicable] information will be monitored for viruses, malware, and other threats by Federal employees and contractors in accordance with the Cybersecurity Enhancement Act of 2015. The collected information will be combined across respondents to produce statistical reports.

Paperwork Reduction Act (PRA) Statement

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0928. The time required to complete this information collection is estimated to average [xx] minutes, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this collection, or any comments or concerns regarding the status of your individual submission, please write to: National Assessment of Educational Progress (NAEP), National Center for Education Statistics (NCES), Potomac Center Plaza, 550 12th St., SW, 4th floor, Washington, DC 20202.


Simulated Classroom and Usability Testing

Before students can participate in the study, written consent will be obtained from their parents or legal guardians. Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way. The consent forms, which include the participant name, will be separated from the participant interview files, secured for the duration of the study, and destroyed after the final report is released. Screen actions with audio may be captured from each session; the only identification included on these files will be the participant ID. The screen captures will be used for analysis after the session, and small portions of the screen captures for select sessions may be used in NCES briefings to demonstrate the methodology used for this study. No student PII will be included in data analyses or study briefings. The recorded files will be secured for the duration of the study and will be destroyed after the final report is completed.

Field Trial

As in the national NAEP studies, schools participating in the Field Trials will submit a current roster of students in the sampled grade for student sampling. The student rosters will be created by NAEP Coordinators, NAEP Trial Urban District Assessment (TUDA) Coordinators, or NAEP School Coordinators, and will be submitted through the secure MyNAEP for Schools website. Student PII will be stored in the Sampling and Data Collection (SDC) contractor’s secure data environments. At no point in time will any individual contractor have simultaneous access to both student names and student assessment and questionnaire responses. The Materials Preparation, Distribution, Processing and Scoring (MDPS) contractor will have access to both, but never at the same time: the MDPS contractor will use student PII to print Student Login Cards in advance of the Field Trials window and will destroy the student PII file before it begins to receive student assessment and questionnaire responses for processing during the Field Trials window. The SDC contractor will never have access to student responses, and no other contractor will have access to student PII. The SDC contractor will destroy student names from the data collected for the Field Trial at the same time as the operational data (approximately 175 days after the conclusion of the operational assessment), thereby making it impossible to link the responses to any directly identifiable PII. The parents or legal guardians of students selected to participate in the Field Trials will receive a notification letter that cites the authorization and confidentiality assurance language listed earlier in this section.

  8. Estimate of Hourly Burden

Estimated Burden for the Simulated Classroom Pretesting

The estimated burden for recruitment assumes attrition throughout the process: approximately 75 percent of initial contacts are assumed to proceed to the follow-up stage, and 20 percent of follow-up contacts are assumed to be lost between follow-up and consent/confirmation.
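As an illustrative check (not itself part of the burden computation), the US mainland recruitment rows of Table 2 follow from applying these assumed rates to the initial contact count and multiplying each stage by its hours per response:

```python
# Illustrative sketch of how the US mainland recruitment rows of Table 2
# are derived. The rates are the stated assumptions; the initial contact
# count and the hours per response come from the table itself.

initial_contacts = 1043
follow_up = round(initial_contacts * 0.75)  # 75% proceed to follow-up -> 782
confirmed = round(follow_up * 0.80)         # 20% attrition -> 626

stages = [
    ("Initial contact", initial_contacts, 0.05),  # hours per response
    ("Follow-up via phone", follow_up, 0.15),
    ("Consent and confirmation", confirmed, 0.05),
]

total_hours = sum(round(n * hours) for _, n, hours in stages)
print(follow_up, confirmed, total_hours)  # 782 626 200
```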

Table 2. Estimate of Hourly Burden for the Simulated Classroom Pretesting

Respondent | Number of Respondents | Number of Responses | Hours per Response | Total Hours

US Mainland

Parent or Legal Guardian for Student Recruitment
Initial contact | 1,043 | 1,043 | 0.05 | 52
Follow-up via phone | 782* | 782 | 0.15 | 117
Consent and confirmation | 626* | 626 | 0.05 | 31
Recruitment Total | 1,043 | 2,451 | - | 200

Student Participation
Students in Grade 4 – 90 min | 100 | 100 | 1.5 | 150
Students in Grade 8 – 90 min | 175 | 175 | 1.5 | 263
Students in Grade 4 – 135 min | 120 | 120 | 2.25 | 270
Students in Grade 8 – 135 min | 120 | 120 | 2.25 | 270
Participation Total | 515 | 515 | - | 953

Puerto Rico

Parent or Legal Guardian for Student Recruitment
Initial contact | 426 | 426 | 0.05 | 21
Follow-up via phone | 319* | 319 | 0.15 | 48
Consent and confirmation | 255* | 255 | 0.05 | 13
Recruitment Total | 426 | 1,000 | - | 82

Student Participation
Students in Grade 4 – 90 min | 50 | 50 | 1.5 | 75
Students in Grade 8 – 90 min | 50 | 50 | 1.5 | 75
Students in Grade 4 – 135 min | 40 | 40 | 2.25 | 90
Students in Grade 8 – 135 min | 40 | 40 | 2.25 | 90
Participation Total | 180 | 180 | - | 330

Simulated Classroom Pretesting Total Burden (US mainland and PR) | 2,164 | 4,146 | - | 1,565

* Subset of initial contact group.
Note: numbers have been rounded and therefore may affect totals.




Estimated Burden for the Field Trial

The school principal burden is estimated at 20 minutes for initial contact communications. The school coordinator burden is estimated at four hours and 30 minutes for school personnel to complete the coordinator activities in MyNAEP, including looking up information to enter into the system. Additionally, school coordinators may incur up to 100 minutes to enter information on each Student with Disabilities (SD) and English Language Learner (ELL) sampled for the study. Furthermore, if e-filing is completed at the school level (estimated to be completed by all 9 schools), the school coordinator will incur an estimated 120 additional minutes. Parents/legal guardians of participating students will receive a letter explaining the study (Appendix B5), for which the parent/legal guardian’s burden is estimated at three minutes. An additional burden of 15 minutes is estimated for a small portion of parents/legal guardians (up to 2%) who may write to refuse approval for their child or may research information related to the study. Approximately 390 students from 9 schools will participate in the study. Student burden is calculated based on 15 minutes for setup and reviewing the tutorial, 15 minutes to respond to the survey questionnaire, and up to 10 minutes for a group debrief (40 minutes, or approximately 0.67 hours, per student), within a total study session time of 170 minutes. Table 3 details the estimated burden.
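As a minimal worked check of the student row in Table 3 (all figures are taken from the text above):

```python
# Illustrative check of the student burden row in Table 3.
setup_and_tutorial_min = 15
questionnaire_min = 15
debrief_min = 10

hours_per_student = round((setup_and_tutorial_min + questionnaire_min + debrief_min) / 60, 2)
students = 390
print(hours_per_student, round(students * hours_per_student))  # 0.67 261
```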

Table 3. Estimate of Hourly Burden for the Field Trial

Respondent | Task | Number of Respondents | Number of Responses | Hours per Respondent | Total Hours
School principal | Initial contact | 9 | 9 | 0.33 | 3
School coordinator | Scheduling and logistics | 9 | 9 | 4.5 | 41
School coordinator | e-filing | 9* | 9 | 2 | 18
School coordinator | SD/ELL information | 9* | 9 | 1.7 | 15
Parents/Legal guardians | Initial notification | 390 | 390 | 0.05 | 20
Parents/Legal guardians | Refusals or additional research | 8* | 8 | 0.25 | 2
Students | NAEP 2021 Field Trial (all subjects) | 390 | 390 | 0.67 | 261
Field Trial Total Burden | - | 798 | 824 | - | 360

* These respondents are duplicative counts and do not represent unique respondents.

Note: numbers have been rounded and therefore may affect totals



Estimated Burden for Usability Testing

The estimated burden for recruitment assumes attrition throughout the process: approximately 75 percent attrition from initial contact to follow-up, and 20 percent attrition from follow-up to confirmation. All usability testing sessions will be scheduled for no more than 60 minutes.
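The same kind of illustrative check applies to the recruitment rows of Table 4 (the initial contact count comes from the table; the rates are the stated assumptions):

```python
# Illustrative sketch of the Table 4 recruitment funnel.
initial_contacts = 448
follow_up = round(initial_contacts * (1 - 0.75))  # 75% attrition -> 112
confirmed = round(follow_up * (1 - 0.20))         # 20% attrition -> 90 (89.6 rounds up)
print(follow_up, confirmed)  # 112 90
```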

Table 4. Estimate of Hourly Burden for the Usability Testing

Respondent | Number of Respondents | Number of Responses | Hours per Respondent | Total Hours

Parent or Legal Guardian for Student Recruitment
Initial contact | 448 | 448 | 0.05 | 23
Follow-up contact | 112* | 112 | 0.15 | 17
Consent form completion and return | 90* | 90 | 0.13 | 12
Confirmation | 90* | 90 | 0.05 | 5
Sub-Total | 448 | 740 | - | 57

Participation (User Testing)
Students in Grade 4 | 37 a | 37 | 1 | 37
Students in Grade 8 | 37 a | 37 | 1 | 37
Sub-Total | 74 | 74 | - | 74

Total Burden | 522 | 814 | - | 131

* These respondents are duplicative counts and do not represent unique respondents.

a Based on previous research, estimated number of actual participants will be less than confirmation numbers.

Note: numbers have been rounded and therefore may affect totals


Table 5 details the total estimated burden for the Simulated Classroom Pretesting, the Field Trial, and the Usability Testing sessions.

Table 5. Estimate of Total Hourly Burden

Respondent | Number of Respondents | Number of Responses | Total Hours
Simulated Classroom Pretesting Sub-Total | 2,164 | 4,146 | 1,565
Field Trial Sub-Total | 798 | 824 | 360
Usability Testing Sub-Total | 522 | 814 | 131
Total Burden | 3,484 | 5,784 | 2,056

Note: numbers have been rounded and therefore may affect totals

  9. Cost to federal government

The total cost of the study is $593,070, as detailed in Table 6.

Table 6. Estimate of Costs

Activity | Provider | Estimated Cost (rounded up)
Recruiting students and providing facilities for the Usability study | EurekaFacts | $135,270
Recruiting students and providing facilities for the Simulated Classroom Pretesting in the US | EurekaFacts | $361,137
Administering the Simulated Classroom Pretesting in the US | Westat | $26,600
Recruiting students and providing facilities for the Simulated Classroom Pretesting in Puerto Rico | ASPIRA, LLC | $40,563
Administering the Field Trial study | Westat | $29,500
Total | - | $593,070

  10. Schedule

Table 7. Schedule of activities for NAEP

Activity | Dates
Simulated Classroom Pretesting: recruitment, pretesting, data collection, analysis, and final report | April-December 2020
Field Trials: recruitment, pretesting, data collection, analysis, and final report | April-December 2020
Usability Testing: recruitment, testing, data collection, analysis, and final report | April-December 2020





1 More information about NAEP DBA can be found at http://nces.ed.gov/nationsreportcard/dba/default.aspx.

2 For sessions occurring during the summer, students entering grades 4 and 8, or students who have completed those grades, will be recruited.

3 NCES has conducted usability studies for DBA development over the past few years, as described in OMB# 1850-0803 v.112 and 1850-0803 v.142. This submission is based on the last approved DBA usability test submission (OMB# 1850-0803 v.142).

4 Draft content may be used in the earlier builds.

5 The final items will consist of those selected for NAEP 2019 administration (OMB #1850-0928 v.17). The questionnaire components in Volume II are a subset of the questionnaires provided in these submittals.

6 Draft content may be used in the earlier builds.

7 The final items will consist of those selected for NAEP 2019 administration (OMB #1850-0928 v.10). The questionnaire components in Volume II are a subset of the questionnaires provided in these submittals.

8 Draft content may be used in the earlier builds.

9 Note: If appropriate, relevant appendices (e.g., parental screening calls) may be translated to another language to facilitate communication.


