Final-NAGB OMB Section A (3-30-11)


Evaluating Student Need for Developmental or Remedial Courses at Postsecondary Education Institutions (Formerly titled Survey of Placement Tests and Cut-Scores in Higher Education Institutions)

OMB: 3098-0006


Request for OMB Clearance

Evaluating Student Need for Developmental or Remedial Courses

at Postsecondary Education Institutions

(Formerly titled Survey of Placement Tests and Cut-Scores in Higher Education Institutions)



Supporting Statement for Paperwork Reduction Act Submissions


REVISED 3-30-11



Supporting Statement A: Justification



A.1. Circumstances Necessitating the Information Collection


The congressionally authorized National Assessment of Educational Progress (NAEP) is the only continuing source of comparable national and state data available to the public on the achievement of students at grades 4, 8, and 12 in core subjects. The National Assessment Governing Board oversees and sets policy for NAEP. NAEP and the Governing Board are authorized under the National Assessment of Educational Progress Authorization Act (P.L. 107-279).


Among the Board’s responsibilities is “to improve the form, content, use, and reporting of [NAEP results].” Toward this end, the Governing Board established a national commission to make recommendations to improve the assessment and reporting of NAEP at the 12th grade. In its March 2004 report1, the commission noted the importance of maintaining NAEP at the 12th grade as a measure of the “output” of K-12 education in the United States and as an indicator of the nation’s human capital potential. The commission recommended that 12th grade NAEP be redesigned to report on the academic preparedness of 12th grade students in reading and mathematics for entry-level college credit coursework. The commission concluded that having such information is essential for the economic well-being and security of the United States and that, as the only source of nationally representative student achievement data at grade 12, NAEP is uniquely positioned to provide it.


As the Governing Board has been developing ways to implement the commission’s recommendations, there has been a wider recognition—among federal and state policymakers, educators, and the business community—of the importance of a rigorous high school program that results in meaningful high school diplomas and prepares students for college and for training for good jobs.


As part of implementing the commission’s recommendations, the Governing Board has planned a program of research studies to support the validity of statements that might be made in NAEP reports about 12th grade student academic preparedness in reading and mathematics for entry-level credit-bearing college courses and for job training. The program of preparedness research for NAEP was developed by a panel of experts in measurement, research, industrial organizational psychology, and postsecondary policy, and adopted by the Governing Board.2


The survey that is the subject of this ICR is one component of this larger program of research.


The program of research consists of five types of studies: (1) content coverage (in which the content of the NAEP 12th grade reading and mathematics assessments is compared with the ACT, SAT, and ACCUPLACER reading and mathematics college admissions/placement tests); (2) statistical relationship (in which students take both NAEP and one of the other admissions/placement tests); (3) standard setting (in which panels of experts identify the skills and knowledge in reading and mathematics on NAEP needed to qualify for entry-level credit-bearing courses without remediation for college and for selected job training programs); (4) benchmarking (in which selected reference groups take NAEP); and (5) the survey of postsecondary institutions’ use of tests and cut-scores for determining student need for remediation—the study that is part of this ICR package.



The Governing Board is requesting clearance for data collection for a study being referred to as a probe. The idea of conducting a probe at this time resulted from consultations with the National Center for Education Statistics in response to the OMB passback of March 3, 2011. The probe sample will include about 1,668 two-year and four-year postsecondary institutions. The results of the probe and the quality of the data produced will be evaluated to determine whether additional changes are needed to the sample and/or the methodology for an operational survey that could be conducted at a later date. The immediate purpose of the study is for internal use by the Governing Board as one component of the Board’s much larger research program.


As a result of the “pilot study” conducted in fall of 2010, changes were made to survey instructions and items (the “pilot study” report was submitted to OMB in January 2011). The term “pilot” was a misnomer; that activity is more appropriately described as a feasibility study aimed at answering a number of specific questions, including:


  • Is the use of separate forms for 2-year and 4-year institutions supported?

  • Should any of the test lists be revised?

  • Are the items being correctly understood?

  • Is the President’s office the appropriate place to identify the survey respondent?

As a result of this feasibility study, the use of separate forms for 2-year and 4-year institutions was supported; the test lists have been revised; minor adjustments have been made to ensure that the survey items are correctly understood; and the President’s office was confirmed as the appropriate place to identify the survey respondent. In addition, data checks have been added to the web-version of the survey to serve as prompts to address missing responses, out-of-range responses, and other potential data issues.


Before proceeding with the probe, a study consisting of cognitive interviews and a usability test will be conducted with nine institutions, spread across different types of public and private 2-year and 4-year institutions. The purpose of the cognitive interviews/usability test study is to determine the effectiveness of the survey changes and the data checks in promoting data quality and in reducing the potential need for follow-up.


An advance letter (see President’s letter, Attachment A-1), rather than the survey packet, will be sent to the office of the institution’s President or Chancellor. The letter will describe the survey, its purpose, and the Federal sponsor (i.e., the National Assessment Governing Board), and will request that the institution identify, as the survey respondent, the individual most familiar with the institution’s policy on evaluating entering students’ need for developmental/remedial courses in reading and mathematics. A letter (see respondent’s letter, Attachment A-2) will then be sent to the designated institutional respondent. The package will contain an enclosure (Attachment A-3) about the Governing Board and how this survey fits into the Board’s overall program of research on the academic preparedness of 12th graders. The cover letter will encourage respondents to complete a web version of the survey and will also offer the alternative of completing a traditional paper-and-pencil questionnaire.


Two slightly different versions of the questionnaire will be used, one for two-year institutions (see Attachment B-1) and the other for four-year institutions (see Attachment B-2). These are the versions of the questionnaire that were revised following the feasibility study and submitted previously. Any additional changes to the questionnaire following the cognitive interviews/usability test are expected to be very minor. The only difference between the two versions of the survey is that two-year institutions will be asked to consider placement policies affecting entering students in programs that are designed to transfer to a four-year institution, while four-year institutions will be asked to consider entering students who are enrolled in an undergraduate degree program in the liberal arts and sciences. The survey will be limited to eight questions. It will be based on information readily available to respondents and can be completed by most individuals in about 30 minutes. The plan is to mail the letters to Presidents in late March/early April 2011.



A.2. Purpose of the Information


The Governing Board will use the information from the probe study of higher education institutions, along with the results of other planned research (described in A.1), as validity evidence to develop and support statements3 that can be made in NAEP reports about 12th grade student academic preparedness for entry-level credit-bearing college coursework. To this end, the survey will collect information on the tests and test scores used by two-year and four-year postsecondary institutions to identify student need for remediation in mathematics and reading.


Currently, NAEP makes valid statements about what students know and can do in the subjects assessed, but not about the academic preparedness of 12th graders for postsecondary education. Any statements about 12th grade students’ academic preparedness in NAEP reports must be supported by validity evidence. The Governing Board’s program of research will serve as a primary source of such validity evidence. The results of this survey will support inferences about the tests and cut-scores used to make determinations about student need/non-need for remedial/developmental coursework. These results will be analyzed in relation to the results of the other studies (e.g., content comparison, statistical relationship, and judgmental standard-setting). The focus will be on evaluating the logical relationships among the study results and assessing the degree of confirmation/disconfirmation across the various types of studies. The Governing Board believes that the results of the preparedness research program also will be of benefit to the K-12 and postsecondary communities, to inform their efforts in ensuring that our students are well-prepared for college and job training. More information about the research program and study results can be found under the Grade 12 and Preparedness headings at http://www.nagb.gov/publications/reports-papers.htm and http://www.nagb.gov/newsroom/press-releases.htm.


Changes to the survey content and methodology, based on findings from the feasibility study, consultation with the staff of NCES, and the OMB passback, are expected to minimize reporting error and the need for follow-up with respondents. Findings from the small-scale cognitive/usability study will be used to assess the changes to the questionnaire and the data checks, and make any final needed modifications for the probe. These steps will help to ensure the quality of the anticipated probe study data.

We will evaluate the results of the probe and the quality of the data yielded to determine whether additional changes are needed to the sample and/or the methodology for a full-scale operational study that could be conducted at a later date.


Estimates will be provided for the mean, median, and range of the cut-scores in reading and mathematics on the most prevalent national standardized tests used to determine whether students need remedial courses or can be placed into regular entry-level credit-bearing courses, in particular the ACT, ACCUPLACER, COMPASS, and SAT reading and mathematics tests. The results of the feasibility study, although certainly not conclusive, suggest that these tests are widely used and that the cells for cut-scores will be populated. These are also the tests targeted by our content comparison studies and statistical relationship studies.


The data yielded by the survey will provide important information, otherwise unavailable, about the tests and cut-scores employed by postsecondary institutions. However, these data will not be used alone as validity evidence for statements for NAEP reports about 12th grade academic preparedness. Rather, the survey results will be used along with the results of the other studies that comprise the program of preparedness research (e.g., content comparison, statistical relationship, and judgmental standard-setting) as components of an overall framework for such statements. Again, the focus will be on evaluating the logical relationships among the study results and assessing the degree of confirmation/disconfirmation across the various types of studies. The data yielded by the survey are expected to be sufficient for this immediate purpose and will also indicate the extent to which these and other tests are used by postsecondary institutions, important contextual information for our analysis.



A.3. Use of Information Technology


Sampled institutions in the probe study will be encouraged to complete a web version of the questionnaire accessed through the Internet. Institutions will also be given the option of completing the survey using a traditional paper-and-pencil questionnaire. When paper versions of the questionnaire are used, they will be transmitted to and from respondents by postal mail and fax. In addition, the email address of the contractor (Westat) responsible for answering respondent questions will be included on the front of the questionnaire. Westat will also use mass email reminders to prompt nonrespondents to complete the survey. These procedures are all designed to minimize the burden on respondents. For example, the use of various modes of communication will allow respondents flexibility in completing the survey and in obtaining clarification on any data collection issues that may arise.



A.4. Duplication


An extensive review of pertinent literature and on-line resources found no documentation of nationwide studies or other data collection activities to identify the assessments/tests and scores used for determining student need for remedial/developmental mathematics and reading classes in postsecondary institutions.


The review was conducted under a previous contract with ACT and consisted of the following activities. The Director of ACT’s Information Resource Center conducted a broad online review to identify any nationally representative studies or sources that might already collect the information relevant to this study; she found none. ACT uses the Institutional Data Questionnaire (IDQ) to collect annually a wide variety of information about almost all two-year, four-year, and other postsecondary institutions in the United States and in some foreign countries; this information base is routinely updated. The IDQ contains data on the tests/assessments, but not the scores, used by postsecondary institutions for placement in the general subject areas of English, mathematics, reading, and science, and on the ACT scores used for placement in selected English, mathematics, reading, and science courses. However, the IDQ data do not include information about placement into remedial courses, and the data are not necessarily nationally representative. Therefore, the IDQ is not a source of the comprehensive data needed for this study. Based on these findings, the proposed study will not collect data that duplicate information from any existing source.



A.5. Impact on Small Business


The information collection in the probe study does not affect small businesses or other small entities.



A.6. Consequence if Collection Not Conducted


Information from the probe study is intended as a key component of the Governing Board’s overall program of preparedness research. The results of this survey will support inferences about the tests and cut-scores used to make determinations about student need/non-need for remedial/developmental coursework. These results will be analyzed in relation to the results of the other studies (e.g., content comparison, statistical relationship, and judgmental standard-setting). The focus will be on evaluating the logical relationships among the study results and assessing the degree of confirmation/disconfirmation across the various types of studies.


There is no other source of nationally representative information about the tests and cut-scores used by postsecondary institutions for determining need/non-need for remedial/developmental coursework. As such, it is essential to collect these data.



A.7. Special Circumstances


There are no special circumstances that would require the information collection in the probe study to be conducted in a manner inconsistent with the guidelines of 5 CFR 1320.5(d)(2).



A.8. Federal Register Notice and Comments; Efforts to Consult with Persons Outside the Agency


The agency’s original 60-day Federal Register notice seeking public comment on the information collection was published on December 16, 2008 (73 FR 76350). Two individuals responded. No comments were received about burden hours.


One commenter wrote as follows: “i [sic] do not think the information for the public is worth this collection effort. it [sic] costs too many tax dollars for non productive information.” No changes were made as a result of this comment. A second commenter made suggestions for minor edits on the proposed survey in its form at that time, which resulted in changes to those items.


In addition to seeking public comment via the Federal Register notice, several measures have been taken to obtain comments from individuals outside the Agency. To date, survey development has benefited from feedback from potential respondents at postsecondary institutions, input from expert panels, and findings from a feasibility study of 120 postsecondary institutions.


Pretest of questionnaire: To obtain feedback from potential respondents, a draft questionnaire was pretested with a total of seven institutions. The pretest respondents were asked to review the questionnaire and provide feedback about 1) the clarity of the project’s purpose as described on the instrument; 2) the clarity of instructions; 3) the clarity, quality, and appropriateness of the items to collect the information necessary for the study; 4) an estimation of the time necessary for completing the instrument; and 5) any suggestions for determining the entry-level programs for which respondents will be asked to provide course placement information. In addition, those participating in this review phase were asked to provide any knowledge of other existing sources of the data needed for this study, including the name of the source and, if known, the frequency of collection, reporting format, and data elements recorded. The pretest findings pointed to problems with respondents’ interpretation of some questionnaire items and definitions. Another major finding was that the survey did not provide adequate coverage of the various approaches used by institutions to evaluate student need for remediation.


Feedback from the expert panels: Survey development also benefited from the input of subject-matter experts in two stages. Prior to the pretest, the Governing Board and Westat convened a 1-day meeting with a Technical Review Panel to discuss questionnaire and sampling issues. To further explore questionnaire issues that were revealed during the pretest, feedback was also sought from a panel of content experts on the topic. A total of seven content experts participated in a conference call to provide feedback on two versions of a draft questionnaire. Findings from the discussion confirmed the use of varied and complex approaches to evaluate student preparedness and the potential for further refinement of the questionnaire.


Feasibility study: A feasibility study was conducted with 120 postsecondary institutions to explore questionnaire issues and potential hurdles to full-scale data collection. The findings (described in the previously submitted “pilot test” report) were used to inform changes to the survey instrument and data collection approaches. As a result of this feasibility study, the use of separate forms for 2-year and 4-year institutions was supported; the test lists have been revised; minor adjustments have been made to ensure that the survey items are correctly understood; and the President’s office was confirmed as the appropriate place to identify the survey respondent.


Cognitive interviews/usability test: A study comprising cognitive interviews and a usability test with nine institutions will be conducted to assess the changes to the questionnaire and the data checks developed for the web version of the survey, and to make any final minor modifications, if needed (see description under A.1).



A.9. Payments or Gifts to Respondents


No payment or gift will be provided as incentive to respond to the probe study.



A.10. Assurance of Confidentiality to Respondents


The following statement of data confidentiality will be contained in the cover letters and survey instruments for the probe study:


The information provided by your institution will be kept private to the extent permitted by law. Data for this study will be reported in aggregate form; the information provided by your institution will be combined with that from other participating institutions to produce statistical summaries and reports. Your institution’s name and individual survey responses will not be reported.


Westat is an outside organization with a recognized reputation for maintaining strict confidentiality of data. The confidentiality statement will be an incentive for potential respondents to participate; there is no agency regulation or policy that requires confidentiality of the test scores used for placement of students in postsecondary education, and, in many cases, this information is publicly available.


All Westat staff members working on the study are required to sign Westat’s confidentiality pledge, which appears as Exhibit 1 on the following page.



Exhibit 1. Westat confidentiality statement


WESTAT, INC.

EMPLOYEE OR CONTRACTOR'S ASSURANCE OF CONFIDENTIALITY OF SURVEY DATA


Statement of Policy


Westat is firmly committed to the principle that the confidentiality of individual data obtained through Westat surveys must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at time of interview (or self-response), or whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements which are to be adhered to strictly.


Procedures for Maintaining Confidentiality


1. All Westat employees and field workers shall sign this assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.


2. Field workers shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during field work. Field workers shall exercise reasonable caution to prevent access by others to survey data in their possession.


3. Unless specifically instructed otherwise for a particular project, an employee or field worker, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.


4. Survey data containing personal identifiers in Westat offices shall be kept in a locked container or a locked room when not being used each working day in routine survey activities. Reasonable caution shall be exercised in limiting access to survey data to only those persons who are working on the specific project and who have been instructed in the applicable confidentiality requirements for that project.


Where survey data have been determined to be particularly sensitive by the Corporate Officer in charge of the project or the President of Westat, such survey data shall be kept in locked containers or in a locked room except when actually being used and attended by a staff member who has signed this pledge.


5. Ordinarily, serial numbers shall be assigned to respondents prior to creating a machine-processible record and identifiers such as name, address, and Social Security number shall not, ordinarily, be a part of the machine record. When identifiers are part of the machine data record, Westat's Manager of Data Processing shall be responsible for determining adequate confidentiality measures in consultation with the project director. When a separate file is set up containing identifiers or linkage information which could be used to identify data records, this separate file shall be kept locked up when not actually being used each day in routine survey activities.


6. When records with identifiers are to be transmitted to another party, such as for keypunching or key taping, the other party shall be informed of these procedures and shall sign an Assurance of Confidentiality form.


7. Each project director shall be responsible for ensuring that all personnel and contractors involved in handling survey data on a project are instructed in these procedures throughout the period of survey performance. When there are specific contractual obligations to the client regarding confidentiality, the project director shall develop additional procedures to comply with these obligations and shall instruct field staff, clerical staff, consultants, and any other persons who work on the project in these additional procedures. At the end of the period of survey performance, the project director shall arrange for proper storage or disposition of survey data including any particular contractual requirements for storage or disposition. When required to turn over survey data to our clients, we must provide proper safeguards to ensure confidentiality up to the time of delivery.


8. Project directors shall ensure that survey practices adhere to the provisions of the U.S. Privacy Act of 1974 with regard to surveys of individuals for the Federal Government. Project directors must ensure that procedures are established in each survey to inform each respondent of the authority for the survey, the purpose and use of the survey, the voluntary nature of the survey (where applicable) and the effects on the respondents, if any, of not responding.


PLEDGE

I hereby certify that I have carefully read and will cooperate fully with the above procedures. I will keep completely confidential all information arising from surveys concerning individual respondents to which I gain access. I will not discuss, disclose, disseminate, or provide access to survey data and identifiers except as authorized by Westat. In addition, I will comply with any additional procedures established by Westat for a particular contract. I will devote my best efforts to ensure that there is compliance with the required procedures by personnel whom I supervise. I understand that violation of this pledge is sufficient grounds for disciplinary action, including dismissal. I also understand that violation of the privacy rights of individuals through such unauthorized discussion, disclosure, dissemination, or access may make me subject to criminal or civil penalties. I give my personal pledge that I shall abide by this assurance of confidentiality.


Signature


A.11. Questions of a Sensitive Nature


There are no questions on sexual behavior, attitudes, religious beliefs, or other matters that are commonly considered private.


A.12. Estimates of the Hour Burden


Estimates of the hour burden for the probe study have been made conservatively, calculating them over the total sample rather than the number of respondents yielded from an estimated response rate of 85%, the minimum overall target response rate for the probe. The estimates take into account the actual burden for the feasibility study as well as changes to the questionnaire and data collection approach made to minimize follow-up for survey response and for missing or suspect data.


In the feasibility study, the average time to complete the questionnaire was 20 minutes. Because the data checks that are added to the web survey will encourage respondents to write in the comment boxes, the burden estimate for completing the questionnaire for the probe is increased to 30 minutes. Estimating 15 minutes on average for all other elements related to the survey (e.g., identifying the appropriate respondent and follow-up calls), the overall average burden is estimated at 45 minutes for the probe.


The total burden estimate, using this conservative approach, is 1,251 hours. We do, however, expect the burden to be substantially less than this estimate.



Table A-1. Estimated burden for the probe

Type of Collection                                     Sample size   Estimated burden in      Respondent
                                                                    minutes per respondent   burden hours
Institution providing respondent contact information      1,668              5                    139
Respondent completion of questionnaire                    1,668             30                    834
Follow-up with respondents for any reason                 1,668             10                    278

Total burden = 1,251 hours
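Each figure in the rightmost column follows directly from sample size × minutes ÷ 60, and the total is the sum of the three rows. A minimal sketch verifying the arithmetic (the short task labels are informal shorthand, not official collection categories):

```python
# Verify Table A-1 burden arithmetic: hours = sample_size * minutes / 60.
SAMPLE_SIZE = 1668  # probe sample of two- and four-year institutions

minutes_per_task = {
    "contact information": 5,
    "questionnaire": 30,
    "follow-up": 10,
}

hours = {task: round(SAMPLE_SIZE * m / 60) for task, m in minutes_per_task.items()}
total = sum(hours.values())

print(hours)  # {'contact information': 139, 'questionnaire': 834, 'follow-up': 278}
print(total)  # 1251
```

Equivalently, the 45-minute overall average per institution cited in the text yields the same total: 1,668 × 45 / 60 = 1,251 hours.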






A.13. Total Annual Cost Burden to Respondents


There will be no annual cost burden to respondents resulting from this collection of information.



A.14. Annualized Costs to the Federal Government


The survey will be conducted under a contract that has already been awarded. The total estimated cost of this project is $562,625. The contract budget is based on personnel hours, printing, mailing expenses, and computer support and analysis.



A.15. Reasons for Program Changes


The increase in burden is due to a need for information, otherwise unavailable, that is associated with the agency mission of improving the form, content, use, and reporting of National Assessment of Educational Progress results. The information to be collected will help validate statements in NAEP reports about the academic preparedness of 12th grade students for postsecondary education.


A.16. Plans for Publication


The results from the probe study will be used along with other research to support statements about 12th grade student academic preparedness for postsecondary education to be made in NAEP reports. Survey responses will be weighted to produce national estimates. Most of the analyses of the questionnaire data will be descriptive in nature, providing the Governing Board with estimate and standard error tables.


The tables for the probe study will be similar to those presented in the feasibility study report. Tabulations will be produced for each data item. Crosstabulations and means of data items will be made with selected classification variables. These include institutional characteristics, such as the following.


  • Institution level;

  • Institution control; and

  • Selectivity.


A.17. Approval to Not Display the Expiration date of OMB Approval


Such approval is not being sought.


A.18. Exceptions to the Certification Statement


There are no exceptions to the certification statement.

1 12th Grade Student Achievement in America: A New Vision for NAEP; www.nagb.gov/publications/12_gr_commission_rpt.pdf

2 Making New Links: 12th Grade and Beyond; www.nagb.gov/publications/PreparednessFinalReport.pdf

3 For more background, see On NAEP and 12th Grade Preparedness: Discussion Draft: Hypothetical Statements for NAEP Reports; http://www.nagb.gov/publications/discussion_naep_rpt2009.pdf


