
National Assessment of Educational Progress (NAEP) 2014-2016 System Clearance

Response to OMB passback

OMB: 1850-0790


MEMORANDUM OMB # 1850-0790 v.36


DATE: March 8, 2013


TO: Shelly Martinez

Office of Information and Regulatory Affairs, Office of Management and Budget


FROM: Patricia Etienne

National Center for Education Statistics

THROUGH: Kashka Kubzdela

National Center for Education Statistics

SUBJECT: Response to OMB Passback for 2014-2016 NAEP System Clearance Submittal



1. SS A2 says that some special studies are released to the public. Please provide the criteria by which NCES makes that determination.

The NAEP special studies vary in scope, size, purpose, and nature. Studies that focus on a special population (such as the National Indian Education Study) and studies that provide information supplemental to the Main NAEP results (such as the High School Transcript Study and the 2011 NAEP-TIMSS Linking Study) are released to the public. NCES also performs investigatory research and studies to try out new item types, assessment timing, administration modes, tools, etc. While results from these investigatory studies are not released to the public as NAEP reports, the lessons learned from them may be disseminated in research reports, conference presentations, or other similar publications.


2. SS A10 – why do schools need to keep materials until the end of the year? Why not destroy them sooner and have the NAEP coordinators oversee the process, for example?

After the assessment, the school storage envelope is the only place where the names of the sampled students are kept. During the data editing and weighting phase, which lasts into the late spring, questions regarding student characteristics and demographic information sometimes arise. In such cases, we may ask the school to look in the school storage envelope to clarify the characteristics of the students in question. Without this link back to the student names, we would be unable to resolve questions that may affect the data editing and weighting.


And how does NCES confirm that materials are destroyed? 

NAEP staff leave a postage-paid postcard attached to the school storage envelope, instructing the school coordinator to destroy the contents of the envelope and then mail the postcard. Upon receipt, the postcard is filed in the school's NAEP folder. Currently, NAEP field staff do not perform ongoing follow-up to confirm the destruction of these materials, but this practice is under review.

3. The estimated burden for 2015 is about 40% higher than that in the 2013 sampling plan provided in appendix C. Please explain this burden increase.

The burden estimate for 2015 is approximately 40% higher than that of 2013 because 2015 includes additional subjects and additional populations. In addition to reading and math, 2015 includes science at all three grades (an additional 294K students), 20K additional students for the National Indian Education Study (which was not conducted in 2013), and a writing pilot for an additional 12K students. The total additional student population for these components is approximately 326K, which accounts for the 40% increase over 2013.
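The component counts cited above can be tallied as a quick sanity check (a minimal sketch; the figures, in thousands of students, are taken directly from this response, and the grouping labels are illustrative only):

```python
# Additional 2015 student counts cited in the response above
# (figures in thousands of students).
additional_students_k = {
    "science at grades 4, 8, and 12": 294,
    "National Indian Education Study": 20,
    "writing pilot": 12,
}

total_k = sum(additional_students_k.values())
print(total_k)  # 326
```

The sum of the components matches the approximately 326K additional students stated above.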



4. Please provide the precision requirements that undergird the sampling plan. They are not explicit in Appendix C.

For the national-only assessments, the sample size requirements are driven by the minimum sample required for item response theory (IRT) estimation. Specifically, 2,000 student responses are needed per item. The sample size of 2,000 was determined early in the assessment program based on experience with other IRT-based assessments and validated in the mid-2000s by Moran (2007). That study showed that, while smaller samples could yield robust item parameters for multiple-choice items, a sample size of 2,000 was necessary for longer constructed-response items to ensure that valid and reliable location parameters could be obtained for partial-credit items. The total number of items and blocks of items, in conjunction with the overall booklet design, determines the resulting overall sample size for the assessment. Unless the number of blocks assessed changes from year to year, overall sample sizes are relatively stable over time.


For the combined state and national assessment subjects where state-level results are reported, approximately 2,700 students are sampled from within each state (exclusive of the urban district samples). This target sample size for states was established based on the desire to provide precision for state-level results comparable to the regional results from the previous national NAEP assessments. While fewer students could be sampled without affecting the precision of the overall state or national estimates, we have maintained these sample sizes over time so that the trend results have similar effect sizes and standard errors. As such, the interpretations of the data have not changed. For example, the results that support comparison statements (e.g., group X significantly increased between year Y1 and year Y2) are based on similar findings in terms of effect size. In addition, largely the same student groups can be compared across years. If, on the other hand, the size of a particular subgroup (for example, the group "ELL students in Urban Locations in PA") decreased significantly from one assessment year to the next, that group might not reach the sample-size reporting threshold in the second year and, as such, could not be compared or even reported on.



5. Is feedback provided to NCES managers of CCD and PSS when NAEP state coordinators identify new schools to add to the sampling frame?


The vast majority of schools added to the NAEP sampling frame reflect changes that have occurred since the period to which the most recent CCD and PSS files refer. Such changes would already have been reported to CCD and PSS, and processing for incorporation into the next generation of CCD and PSS files would therefore be underway. Thus, in the overwhelming majority of cases, the CCD and PSS will already have been informed of the changes. Nevertheless, beginning with the 2013 NAEP new school procedure, the Sampling and Data Collection contractor will routinely forward to the CCD Program Director a file of all public school information gathered during the new school procedure. This is an Excel file containing information not only about schools not found on the most current CCD file, but also about name, address, and grade-span changes for schools that are on the file. We will also provide such a file for PSS, relating to Catholic schools, should that be requested.


In addition, in certain cases, we learn of ongoing issues that are evidently not simply a matter of recent changes, but indicate something that should have been represented on the version of CCD or PSS that was used as the NAEP sampling frame. In those cases, we do notify NCES CCD and PSS Program Directors of what we have noted, and then forward additional information to them as requested, in whatever format they prefer. For example, in one instance several years ago, we noted that the grade span values for a large proportion of the schools in one state were in error. We informed the CCD Program Manager and provided summary information showing the many changes from the prior CCD file. As another example, some years ago, we pointed out to the PSS Program Manager that the PSS file would be enhanced by indicating for Catholic schools which diocese the school is associated with, and the PSS file now routinely includes those data.



6. How often do schools opt to administer NAEP to all students in a grade versus only those sampled? Why? Are those cases included in the burden estimate?

Occasionally, schools ask that NAEP be administered to all students in the school rather than just those sampled. Most such requests reflect the school's desire to minimize disruption to the school day and students. When a school requests that we assess more students than were sampled, the individual request is reviewed, along with an estimate of the additional NAEP resources required to fill it. We then determine whether to allow the extra administration, based on considerations of burden and cost.


Please note that if the desired sample size is more than 90 percent of the enrollment, NCES automatically selects all students, as we realize that omitting such a small proportion of students is not only burdensome to the school, but may make the omitted students feel stigmatized. Our burden estimates do take into account the small number of schools that include all students.

