



National Assessment of Educational Progress





Supporting Statement




NAEP Study of First-Year Texas Postsecondary Students 2010 Pilot Test









7/14/10

OMB# 1850-0803 v.33



Supporting Statement



Appendix A: Recommended Studies for 2009 NAEP 12th Grade Preparedness Reporting

Appendix B: Prospective Expert Panel

Appendix C: NAEP Study of First-Year Texas Postsecondary Students 2010 Pilot Test Questionnaire

Appendix D: Disclosure Notice

Appendix E: Template of Recruitment Letter to Selected Students

Appendix F: Script for Scheduling Call to Sampled Students

  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement for cognitive labs and pilot tests (OMB #1850-0803 v.33). This generic clearance provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.



  2. Background and Study Rationale

The congressionally authorized National Assessment of Educational Progress (NAEP) is the only continuing source of comparable national and state data available to the public on the achievement of students at grades 4, 8, and 12 in core subjects. The National Assessment Governing Board (Governing Board) oversees and sets policy for NAEP. NAEP and the Governing Board are authorized under the National Assessment of Educational Progress Authorization Act (P.L.107-279).


Among the Governing Board’s responsibilities is “to improve the form, content, use, and reporting of [NAEP results].” Toward this end, the Governing Board established a national commission to make recommendations to improve the assessment and reporting of NAEP at the 12th grade. In its March 2004 report1, the commission noted the importance of maintaining NAEP at the 12th grade as a measure of the “output” of K-12 education in the United States and as an indicator of the nation’s human capital potential. The commission recommended that 12th grade NAEP be redesigned to report on the academic preparedness of 12th grade students in reading and mathematics for entry-level college credit coursework. The commission concluded that having this information is essential for the economic well-being and security of the United States and that NAEP is uniquely positioned to provide such information.


As the Governing Board has been developing ways to implement the commission’s recommendations, there has been a wider recognition—among federal and state policymakers, educators, and the business community—of the importance of a rigorous high school program that results in meaningful high school diplomas and prepares students for college and for job training. The Administration has set the goal of ensuring that every high school graduate is college and career ready. Enabling NAEP to report on 12th grade preparedness will provide an indicator that can be used to monitor this goal.


As part of implementing the commission’s recommendations, the Governing Board has planned a program of research studies to support the validity of statements about 12th grade student preparedness in reading and mathematics. An overview of these research studies is provided in Appendix A2. Among the studies planned is a proposed study of first-year postsecondary student performance on the NAEP mathematics and reading assessments. The data resulting from this study will be used, along with the results of the other planned studies, to help develop valid statements that can be made about the preparedness of 12th grade students in NAEP reports. While other studies, such as the NCES high school longitudinal studies (e.g., NELS, ELS, HSLS), relate achievement on assessments (assessments developed specifically for the study as well as AP Exam, ACT, and SAT scores) and high school grades to college placement and success, the study in this request would provide valuable empirical linkages between NAEP achievement and college placement, which have not previously been examined.


To enable more rapid organization and execution of the study, the Commissioner of the Texas Higher Education Coordinating Board (THECB) has offered to assist in conducting this study at public colleges and universities in Texas. Overall, the project involves three progressive phases of research, as follows:


Phase 1: Exploratory telephone discussions with designated contact persons at nine Texas colleges and universities identified by the Commissioner of the THECB. These institutions were purposively selected by the Commissioner to represent a diverse mix of colleges based on two-year and four-year degree status, size (total enrollment), region of the state, race/ethnicity composition of students, and selectivity. Results from the telephone interviews are presented below, in section 3 on the study design and context.


Phase 2: A small-scale pilot study of the procedures for and feasibility of administering the NAEP assessments on campus (described in this submission).


Phase 3: If the phase 2 pilot is deemed successful by the Governing Board and NCES, data collection could be expanded to a full-scale, more representative sample of postsecondary institutions randomly selected statewide in Texas for the fall 2011 semester. The larger, more representative sample would allow the full set of NAEP analyses to be completed to obtain performance results on the mathematics and reading scales, linking NAEP performance in Texas to college placement.


This submission describes the 2010 pilot test (phase 2) for the study of first-year postsecondary student performance on NAEP. The pilot study involves administering assessments to a sample of 600 incoming first-year students at nine volunteer Texas colleges and universities. The main focus for the study will be on the operational experience of conducting NAEP assessments in the postsecondary setting, and it will evaluate aspects such as success with sampling, data collection, and response rates. The planned pilot is expected to provide important insights about the operational feasibility of conducting NAEP assessments with first-year postsecondary students, including the willingness of students to participate and the availability of records for creating sampling frames and collecting academic data on sampled students. Thus, the pilot will inform whether standard NAEP data collection methods can support sufficient student response rates and whether student placement records can be obtained reliably.



  3. Study Design and Context

Research Questions

The main research questions being investigated for the pilot study are the following:

  • Based on the student participation in the pilot study, is a full-scale study feasible?

  • What were the practical aspects of working with participating postsecondary institutions?

  • How did the sampling and data collection activities vary across institutions?


Prior to performing a full-scale study of the relationship between NAEP performance and placement in Texas postsecondary institutions, it is imperative that a pilot study be conducted to examine practical aspects of the planning and administration process, operational and logistical issues involved in the data collection, student response rates to the assessment, and the ability of the participating institutions to provide requested academic and demographic data for the study.


For this pilot study, students will be given two blocks of reading or mathematics cognitive questions followed by a short student survey consisting of four questions. The blocks of reading and mathematics cognitive questions are intact and unchanged from those that were administered to students across the nation at grade 12 in the winter of 2009, as part of the main NAEP assessment. The four student survey questions, about students’ race/ethnicity and parents’ highest education level (listed in Appendix C), are a subset of the survey questions that were administered at grade 12 as part of the main NAEP assessment in 2009.


Given that a primary goal of the larger study is to determine how first-year postsecondary students perform on the 12th-grade NAEP reading and mathematics assessments, the standard NAEP 12th grade administration procedures will be followed as closely as possible in the postsecondary administration sessions. This means using the same assessment instruments with the fewest possible changes, trained NAEP supervisors (SVs) and Assessment Administrators (AAs), the same administration procedures and instructions for students, the same time limits for the assessment sessions, similar physical settings for the assessments, and so on. For the assessment results to be comparable across the 12th grade and first-year postsecondary student samples, the administration methods will be kept as similar as possible.


However, the same approach cannot be applied to the tasks of sampling and recruiting students. There are many important differences between the high school and postsecondary settings that affect how the data collection effort will be conducted at the postsecondary level, such as the following:

  • Attendance Schedules: For high school students, daily attendance is generally mandatory and most students are at school at the same times Monday through Friday. First-year postsecondary students are not legally required to attend classes, have highly variable schedules, and may be on campus fewer than five days per week.

  • Proximity to Assessment Site: Colleges and universities typically cover larger areas than high schools, with buildings and classrooms often far away from parking lots and public transportation. This may pose a greater challenge for postsecondary students in terms of getting to the assessment session, as compared to high school students.

  • Living Arrangements: The vast majority of high school seniors live at home with parents, while first-year postsecondary students may live on campus, at home with parents, or in private apartments and houses off-campus.


While some aspects of data collection will differ from the standard NAEP high school activities, findings from phase 1 of this study indicate that participating institutions are willing to provide information and assistance during data collection. Westat project staff completed phase 1 exploratory telephone interviews with senior administration representatives at all nine participating Texas colleges and universities during the week of June 21–25, 2010. All of the contacts indicated top-level support for the pilot from senior management and a strong interest in research projects addressing academic preparedness among first-year students. The interview results were consistently positive and notably uniform with respect to the feasibility of conducting the NAEP assessments on campus at these institutions.


When interviewed about the NAEP pilot sampling and data collection plans, contact persons at all nine participating postsecondary institutions confirmed the following:


  • Lists of first-year students covering virtually all of the eligible students3 will be available for selecting a random, representative sample prior to the planned start of data collection in September.

  • Sampling lists can identify students enrolled in any developmental/remedial classes.

  • The planned data collection window of September through October, 2010, was approved by all contacts.

  • Westat field staff may review student lists for sampling purposes, help conduct recruitment of sampled students, and administer the NAEP assessments on campus.

  • Colleges and universities can assist in student recruitment and will provide access to appropriate locations on campus to conduct the assessments.

  • All of the contacts confirmed that student academic records and demographic data can be provided consistent with the requirements of the Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. 1232g). The required FERPA disclosure notice regarding release of academic records by the participating institutions will be provided through the Disclosure Notice template shown in Appendix D. (Note that the collection of records-based information in this study is authorized by the Education Sciences Reform Act of 2002, P.L. 107-279.) We will provide the Disclosure Notice to students at the time of the NAEP assessment sessions as well as to the registrar’s office and other appropriate contacts providing data on individual students.

  • All of the institutions agreed to assist Westat in understanding campus IRB requirements and the timeline for submitting all necessary research application forms.


The interviews did not uncover any specific problems or concerns regarding pilot data collection this fall. The issues likely to require the most attention going forward are the IRB procedures and requirements on each campus and the efforts to maximize student response rates.


Westat project staff will conduct periodic telephone planning discussions with all of the contacts throughout July and August to focus on these issues, and will also visit at least two of the campuses to verify information collected during the interviews. This will include in-person inspections of proposed assessment sites, campus layout and conditions, student enrollment lists and telephone directories, campus e-mail systems and bulletin boards, and other aspects of the campus infrastructure that could impact pilot data collection outcomes.



  4. Data Collection Process

Qualified staff will administer the assessment using procedures similar to those used for regular NAEP administrations at grade 12. Information on recruitment, participant characteristics, sample design, and data analysis is provided below.


Recruitment

NAEP staff will draw a random sample of students from student lists provided by the institutions. The sampled students will then be sent a letter from their institution asking them to participate in the study. The letter will explain that student participation is voluntary. Appendix E presents a template of the letter that participating institutions will send to students informing them about the study and the confidentiality of the data collection, and Appendix F presents a template of the script to be used for the follow-up scheduling call. Westat will work closely with the participating institutions to ensure that the letters they send to students include the appropriate wording stating the voluntary nature of the study and describing the extent to which the answers and all personally identifying information are kept confidential.


Participant Characteristics

The sampling target is 600 completed assessments from eligible students. To be included in the sample frame, students must

  • be first-year postsecondary students 18 years old or older, and

  • have completed high school anywhere in the U.S. in the spring of 2010.

In addition, the sample will be drawn to achieve an approximately 50:50 male-to-female ratio.


A key objective of phase 2 is to discover what response rate can be achieved within the study protocol. To determine the starting sample size needed to achieve the completed sample target, it is necessary to estimate a response rate in advance. Based on a review of the literature and prior experience in conducting surveys with this age group, we estimate that approximately 45 percent of sampled students will participate in the study.4 Thus, approximately 1,350 students will be recruited to obtain a sample size of 600 students. The actual response rate may be higher or lower. We will work closely with the participating colleges and universities to maximize response rates through the use of advance recruitment letters, flexible scheduling of assessment sessions, convenient assessment locations on campus, make-up sessions, involvement and support of postsecondary institution faculty and staff, and other strategies designed to increase student cooperation. Drawing a random starting sample of first-year students at each institution will ensure that a representative sample is selected. In addition, background data provided by all of the sampled students will allow examination of nonresponse across as many student characteristics as possible.
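The recruitment figure above follows from simple arithmetic. As a point of reference only, the short Python sketch below reproduces that calculation; the function name and the rounding step are illustrative assumptions and are not part of the study materials.

    # Illustrative only: derive the starting recruitment sample from the
    # target number of completed assessments and the assumed response rate.
    import math

    def starting_sample(target_completes: int, assumed_response_rate: float) -> int:
        """Students to recruit so the expected number of completions meets the target."""
        return math.ceil(target_completes / assumed_response_rate)

    print(starting_sample(600, 0.45))  # 1334, rounded up to roughly 1,350 for planning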


Sample Design

The targeted final sample size of approximately 300 students per assessment subject (reading or mathematics) will allow for approximately 100 students to answer each of the cognitive questions in the reading or mathematics booklets. The number of students for each of the six assessment booklets, and the composition of those booklets, is shown in the table below.



Booklet   | Number of Students5 | Cognitive Block 1   | Cognitive Block 2   | Survey Questions
Booklet 1 | 100                 | Reading Block A     | Reading Block B     | 4 questions
Booklet 2 | 100                 | Reading Block C     | Reading Block D     | 4 questions
Booklet 3 | 100                 | Reading Block E     | Reading Block F     | 4 questions
Booklet 4 | 100                 | Mathematics Block A | Mathematics Block B | 4 questions
Booklet 5 | 100                 | Mathematics Block C | Mathematics Block D | 4 questions
Booklet 6 | 100                 | Mathematics Block E | Mathematics Block F | 4 questions
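As an informal illustration of the booklet design above, the sketch below assigns a set of sampled students to the six booklets in roughly equal numbers by cycling the booklets over a shuffled student list. This is a simplified stand-in, not the operational NAEP booklet spiraling procedure, and the function name and seed value are hypothetical.

    # Illustrative only: round-robin assignment of students to the six booklets.
    import random

    BOOKLETS = [
        ("Booklet 1", "Reading Block A", "Reading Block B"),
        ("Booklet 2", "Reading Block C", "Reading Block D"),
        ("Booklet 3", "Reading Block E", "Reading Block F"),
        ("Booklet 4", "Mathematics Block A", "Mathematics Block B"),
        ("Booklet 5", "Mathematics Block C", "Mathematics Block D"),
        ("Booklet 6", "Mathematics Block E", "Mathematics Block F"),
    ]

    def assign_booklets(student_ids, seed=2010):
        """Shuffle the student list, then cycle the six booklets through it."""
        rng = random.Random(seed)
        ids = list(student_ids)
        rng.shuffle(ids)
        return {sid: BOOKLETS[i % len(BOOKLETS)] for i, sid in enumerate(ids)}

    assignments = assign_booklets(range(600))
    # With 600 completed assessments, each booklet is taken by exactly 100 students.
    print(sum(1 for b in assignments.values() if b[0] == "Booklet 1"))  # 100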


Data Analysis

After data collection, the test booklets will be processed and scored. Once the student data are processed, the files will be sent for data analysis. Data analyses will consist of background item frequency distributions, including response rates and selected cross-tabulations of key variables. In addition, classical item analyses of the cognitive items, by block and overall, including response rates, will be performed. The small samples will not permit estimation of plausible values or evaluation of item-level models. A high-level summary memo of the analysis results will be developed.


Although the purpose of the phase 2 pilot is not to evaluate first-year postsecondary students’ performance on the 12th grade NAEP assessments, analyzing the data may provide information that can be brought to bear on decisions regarding the feasibility or design of the phase 3 study. For instance, item response rates will provide a preliminary sense of first-year postsecondary students’ level of engagement on the NAEP assessments. Performing item analyses on the data will also provide some indication, albeit limited, of first-year postsecondary students’ performance on the items in the 12th grade NAEP assessments.



  5. Consultants Outside the Agency

NCES staff, Westat, Educational Testing Service (ETS), and Pearson Educational Measurement have all contributed to the design of the study. In addition, the NAEP Education Statistics Services Institute (NAEP ESSI) created a literature review detailing how other studies have been conducted at the postsecondary level. Representatives from the nine volunteer institutions will also be involved in gathering student sample lists and in recruiting students.


An expert panel will be convened to evaluate the findings of the phase 2 pilot study and provide recommendations for the phase 3 full-scale study. Prospective panel members have been selected on the basis of expertise in the field of postsecondary research and assessment. The names and affiliations of the prospective panel members are provided in Appendix B. As indicated in Appendix B, some of the prospective panel members also participated in an April 2010 planning meeting about the overall purpose, goals, and general plan for the study.



  6. Assurance of Confidentiality

NCES has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the governing legislation (the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and the Education Sciences Reform Act of 2002 (Public Law 107-279, 20 U.S.C. §9622)). Specifically for the NAEP project, this ensures that privacy, security, and confidentiality policies and procedures comply with the Privacy Act of 1974 and its amendments, NCES Confidentiality Procedures, and the Department of Education ADP Security Manual. The federal authority mandating NAEP, 20 U.S.C. §9622, requires the confidentiality of personally identifiable information, as follows:


(A) IN GENERAL.-- The Commissioner for Education Statistics shall ensure that all personally identifiable information about students, their academic achievement, and their families, and that information with respect to individual schools, remains confidential, in accordance with section 552a of title 5.


(B) PROHIBITION.-- The Assessment Board, the Commissioner for Education Statistics, and any contractor or subcontractor shall not maintain any system of records containing a student's name, birth information, Social Security number, or parents' name or names, or any other personally identifiable information.


Participation is voluntary and personally identifiable information will not be maintained for the student participants. Participants will be provided with the following confidentiality pledge:


The information you provide will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107–347 and other applicable Federal laws, your responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you.


  7. Justification for Sensitive Questions

No sensitive questions will be asked.


  8. Estimate of Hour Burden

The total assessment time for each student will not exceed 75 minutes, and we anticipate that each student will require only about 60 minutes. In some instances, assigning the student a test booklet and acquiring adequate space for testing may require additional time, but this would not exceed 15 minutes. The actual assessment time is 50 minutes for answering the cognitive items and approximately 5–10 minutes for answering the 4 survey questions. Only the burden associated with answering the 4 survey questions is subject to clearance. The burden for student recruitment provided below includes all students who are recruited to participate in the study; the burden for the survey includes only those students who elect to participate.


Respondent                 | Hours per Respondent | Number of Respondents | Approximate Total Hours
Student - Recruitment      | 0.083                | 1350                  | 112
Student - Survey Response  | 0.166                | 600                   | 100
Total Student Burden Hours |                      |                       | 212



The school burden time is estimated to be up to 40 hours per school, which may be distributed across several staff members at each school. The school burden estimate includes staff time for assisting with recruitment.



                          | Hours per School | Number of Schools | Total Hours
Total School Burden Hours | 40               | 9                 | 360
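The burden totals in the two tables above are simple products of hours per respondent and number of respondents. The brief sketch below reproduces that arithmetic for reference; the dictionary names and the rounding to whole hours are illustrative assumptions, not part of the study materials.

    # Illustrative only: recompute the burden-hour totals shown in the tables.
    STUDENT_BURDEN = {
        "Student - Recruitment": (0.083, 1350),     # (hours per respondent, respondents)
        "Student - Survey Response": (0.166, 600),
    }
    SCHOOL_BURDEN = {"School": (40, 9)}             # (hours per school, schools)

    def total_hours(burden):
        """Hours per respondent times number of respondents, rounded to whole hours."""
        return {row: round(hours * n) for row, (hours, n) in burden.items()}

    student_totals = total_hours(STUDENT_BURDEN)
    print(student_totals, sum(student_totals.values()))  # {...: 112, ...: 100} 212
    print(total_hours(SCHOOL_BURDEN))                    # {'School': 360}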



  9. Estimate of Costs for Recruiting Respondents

Students will be actively recruited to take part in this study but will not be offered monetary incentives. This will help to keep the pilot administration at the postsecondary level as close as possible to the grade 12 NAEP administrations, because no monetary incentives are provided to students currently sampled for NAEP.


  10. Cost to the Federal Government

The following table provides the overall project cost estimates:

Activity | Provider | Estimated Cost
Coordination, project management, expert panel meeting; data analysis and assessment design; test booklet creation, scoring support | Educational Testing Service (ETS) | $172,000
Sampling, data collection, weighting | Westat | $317,000
Printing, shipping, processing, scoring | Pearson Educational Measurement | $300,000
Totals | | $789,000



  11. Study Schedule

The following table provides the overall schedule for the study:

Activity              | Tasks                         | Date Ranges
Prepare study design  | Review by NCES                | April–July 2010
OMB clearance         | OMB submission                | July 2010
Prepare test booklets | Begin printing                | August 4, 2010
                      | Materials shipped             | August 27, 2010
Data collection       | Recruit participants          | August–September 2010
                      | Data collection               | September–October 2010
Data processing       | Scan books                    | October 2010
                      | Score student responses       | November 2010
Data analysis         | Analysis of data              | December 2010
                      | Final study data sent to NCES | January 2011



Appendix A: Recommended Studies for 2009 NAEP 12th Grade Preparedness Reporting6


RECOMMENDED STUDIES FOR 2009 NAEP 12TH GRADE PREPAREDNESS REPORTING

(A) Content Alignment Studies for NAEP and Assessments of Postsecondary Preparedness

• Comparison with college admissions and placement examinations (ACCUPLACER, ACT, ASSET, COMPASS, SAT)

• Comparison with workplace eligibility and placement examinations (WorkKeys and ASVAB)


(B) Statistical Relationship Studies for NAEP and Assessments of Postsecondary Preparedness7

• Linking national NAEP scores with preparedness indicator scores from other assessments

• Linking 12th grade NAEP performance with longitudinal databases (score data for college admission and course placement; transcript data; and workplace data)


(C) Judgmental Studies to Set NAEP Cut Scores for Workplace Preparedness (Military and Civilian)

• Identification of five to seven target occupations across various sectors

• Identification and development of eligibility criteria for each target occupation’s job training programs

• Setting NAEP reading and mathematics job training program cut scores


(D) Judgmental Studies to Set NAEP Cut Scores for College Preparedness

• Setting NAEP reading and mathematics college preparedness cut scores using:

- ACT College Readiness Standards

- College Board Standards for College Success

- Standards developed by subject matter experts specializing in college course placement


(E) National Survey of College Course Placement Assessments and Cut Scores




Appendix B: Prospective Expert Panel



Wayne Camara

Vice President, Research and Development, the College Board


Barbara Dodd

Professor, University of Texas at Austin


Dary Erwin

Associate Provost and Professor, James Madison University


David Gardner*

Deputy Commissioner for Academic Planning and Policy, Texas Higher Education Coordinating Board


Geraldine Mooney*

Vice President, Surveys and Information Services, Mathematica


Maria Teresa Tatto*

Associate Professor, Michigan State University


Jennifer Sharp Wine*

Project Director, RTI International



*Participated in April 2010 meeting about the overall purpose, goals, and general plans for the study.


1 See http://www.nagb.org/publications/12_gr_commission_rpt.pdf.

2 The full scope of the Governing Board’s research agenda can be found on the Governing Board’s website at http://www.nagb.org/publications/PreparednessFinalReport.pdf.


3 Because colleges finalize their enrollment during the last few weeks of summer, and some students may drop out or be added, the lists are expected to be complete enough for sampling, but not yet 100% complete.

4 For instance, the National Survey of Student Engagement has an average response rate of 40 percent, which varies across years and institutions. See http://nsse.iub.edu/NSSE_2009_Results/pdf/NSSE_AR_2009.pdf.

5 These numbers are approximate and will depend on the student response rate.

6 This appendix is reproduced from the National Assessment Governing Board (2009). Making New Links, 12th Grade and Beyond: Technical Panel on 12th Grade Preparedness Research Final Report, Appendix E. http://www.nagb.org/publications/PreparednessFinalReport.pdf

7 The study described in this submission is a statistical relationship study.

