
Request for OMB Clearance

Evaluating Student Need for Developmental or Remedial Courses

at Postsecondary Education Institutions

(Formerly titled Survey of Placement Tests and Cut-Scores in Higher Education Institutions)



Supporting Statement for Paperwork Reduction Act Submissions



Supporting Statement A: Justification



A.1. Circumstances Necessitating the Information Collection


The congressionally authorized National Assessment of Educational Progress (NAEP) is the only continuing source of comparable national and state data available to the public on the achievement of students at grades 4, 8, and 12 in core subjects. The National Assessment Governing Board oversees and sets policy for NAEP. NAEP and the Governing Board are authorized under the National Assessment of Educational Progress Authorization Act (P.L. 107-279).


Among the Board’s responsibilities is “to improve the form, content, use, and reporting of [NAEP results].” Toward this end, the Governing Board established a national commission to make recommendations to improve the assessment and reporting of NAEP at the 12th grade. In its March 2004 report, the commission noted the importance of maintaining NAEP at the 12th grade as a measure of the “output” of K-12 education in the United States and as an indicator of the nation’s human capital potential. The commission recommended that 12th grade NAEP be redesigned to report on the academic preparedness of 12th grade students in reading and mathematics for entry-level, college-credit coursework. The commission concluded that having such information is essential for the economic well-being and security of the United States and that NAEP is uniquely positioned to provide such information.


As the Governing Board has been developing ways to implement the commission’s recommendations, there has been a wider recognition—among federal and state policymakers, educators, and the business community—of the importance of a rigorous high school program that results in meaningful high school diplomas and prepares students for college and for training for good jobs.


As part of implementing the commission’s recommendations, the Governing Board has planned a program of research studies to support the validity of statements about 12th grade student preparedness in reading and mathematics. Among the studies planned is a proposed survey of two-year and four-year postsecondary institutions about the use of tests and test scores for placing first-year students into entry-level, credit-bearing courses and into remedial/developmental courses in mathematics and reading. The data resulting from this survey will be used, along with the results of the other planned studies, to help develop valid statements that can be made about the preparedness of 12th grade students in NAEP reports.


Exploratory survey development work with expert panels and the results from a pretest of the survey instrument indicate that approaches used by postsecondary institutions to evaluate student preparedness are complex and varied, especially with regard to the use of various tests in combination with other evaluation criteria. Thus, the Governing Board is requesting clearance for (a) a pilot test consisting of a survey of 120 sampled two-year and four-year postsecondary institutions and (b) a full-scale survey of about 1,700 sampled two-year and four-year postsecondary institutions.


The purpose of the pilot test will be to identify potential problems that may emerge from data collection during the full-scale survey. In addition, the pilot test will provide insights into the extent to which institutions use various tests, either independently or in combination with other evaluation criteria, to identify student need for remediation in mathematics and reading. Two slightly different versions of the questionnaire will be tested, one for two-year institutions (see Appendix A-1) and the other for four-year institutions (see Appendix A-2). The only difference between the two versions is that two-year institutions will be asked to consider placement policies affecting entering students in programs that are designed to transfer to a four-year institution, while four-year institutions will be asked to consider entering students who are enrolled in an undergraduate degree program in the liberal arts and sciences. The survey will be limited to four pages of questions. It will be based on information readily available to respondents and can be completed by most individuals in about 45 minutes. The cover letter (see Appendix B) will include information about the option to complete a web version of the survey. The pilot test survey will be mailed to institutions in July 2010. The full-scale survey will be mailed to institutions in November 2010.


Both the pilot test and the full-scale survey will be self-administered surveys addressed to the President of the institution, with a cover letter requesting that the survey be completed by the appropriate individual or office. Respondents will have the option of completing a web version accessed through the Internet or a traditional paper and pencil questionnaire. Westat has been contracted by the Governing Board to conduct the pilot test and the full-scale survey.



A.2. Purpose of the Information


Information from the pilot test will be used to design a questionnaire that is clearly worded, minimizes respondent burden, and provides adequate coverage of the various types of tests and test scores used by institutions to determine student need for remediation in mathematics and reading (see details in section A.1). The pilot test will allow for rigorous testing of questionnaire issues because it will mirror the data collection procedures for the main study. A careful analysis of these issues will help to avoid potential problems in the main data collection. The pilot study will also provide insights into the extent to which institutions use various tests, either independently or in combination with other evaluation criteria, to identify student need for remediation in mathematics and reading.


The Governing Board will use the information from the full-scale survey of higher education institutions, along with the results of other planned research, to serve as validity evidence to develop and support statements to be made in NAEP reports about 12th grade student preparedness. To this end, the survey will collect information on tests and test scores used by two-year and four-year postsecondary institutions to identify student need for remediation in mathematics and reading.



A.3. Use of Information Technology


Sampled institutions in both the pilot test and the full-scale survey will be encouraged to complete a web version of the questionnaire accessed through the Internet. Institutions will also be given the option of completing the survey using a traditional paper and pencil questionnaire. When paper versions of the questionnaire are used, they will be transmitted to and from respondents by fax and postal mail, based on respondents’ preferences. In addition, the email address for the contractor (Westat) responsible for answering respondent questions will be included on the front of the questionnaire. These procedures are all designed to minimize the burden on respondents. For example, the use of various modes of communication will allow respondents flexibility in completing the survey and obtaining clarification on any data collection issues that may arise.



A.4. Duplication


An extensive review of pertinent literature and on-line resources found no documentation of nationwide studies or other data collection activities to identify the assessments/tests and scores used for determining student need for remedial/developmental mathematics and reading classes in postsecondary institutions.


The review consisted of the following activities under a previous contract with ACT. The Director of ACT’s Information Resource Center conducted a broad, on-line review to identify any source of nationally representative studies or sources that might already collect the information of relevance to this study; she found no such source. ACT uses the Institutional Data Questionnaire (IDQ) to annually collect a wide variety of information about almost all two-year, four-year, and other postsecondary institutions in the United States and in some foreign countries; this information base is routinely updated. The IDQ contains data on the tests/assessments, but not the scores, used by postsecondary institutions for placement in the general subject areas of English, mathematics, reading, and science and the ACT scores used for placement in selected English, mathematics, reading, and science courses. The IDQ data do not include information about placement into remedial courses and the data are not necessarily nationally representative. Therefore, the IDQ is not a source of the comprehensive data needed for this study. Based on these findings, the proposed study will not collect data that duplicates information from any existing source.



A.5. Impact on Small Business


The information collection in the pilot test and full-scale survey does not affect small businesses or other small entities.



A.6. Consequence if Collection Not Conducted


The pilot test is essential to identify potential problems in data collection for the full-scale survey on the use of tests and test scores to identify students for remedial/developmental reading and mathematics courses. A survey pretest was conducted with seven institutions; it revealed a number of problems with respondents’ interpretation of questionnaire items and significant variation in the use of tests and test scores to identify student need for remediation. Additional feedback from expert panels underscored the complexity of the issues and the need for additional testing of the survey instrument.


Information from the full-scale survey is a key component in determining what knowledge, skills, and ability in reading and mathematics constitute preparedness for entry into college credit course work. As such, it is essential to collect these data.





A.7. Special Circumstances


There are no special circumstances that would require the information collection in the pilot test or full-scale survey to be conducted in a manner inconsistent with the guidelines cited in the OMB instructions.



A.8. Federal Register Notice and Comments; Efforts to Consult with Persons Outside the Agency


The agency’s original 60-day Federal Register notice seeking public comment on the information collection for the full-scale survey was published on December 16, 2008, at page 76350. Two individuals responded. No comments were received about burden hours.


One commenter wrote as follows: “i [sic] do not think the information for the public is worth this collection effort. it [sic] costs too many tax dollars for non productive information.” No changes were made as a result of this comment. A second commenter made suggestions for minor edits on the proposed survey in its form at that time, which resulted in changes to those items.


In addition to seeking public comment via the Federal Register notice, several measures have been taken to obtain comments from individuals outside the agency. To date, survey development on the study has benefited from feedback from potential respondents at postsecondary institutions and from expert panels.


Pretest of Questionnaire: To obtain feedback from potential respondents, a draft questionnaire was pretested with a total of seven institutions. The pretest respondents were asked to review the questionnaire and provide feedback about 1) the clarity of the project’s purpose as described on the instrument; 2) the clarity of instructions; 3) the clarity, quality, and appropriateness of the items to collect the information necessary for the study; 4) an estimation of the time necessary for completing the instrument; and 5) any suggestions for determining the entry-level programs for which respondents will be asked to provide course placement information. In addition, those participating in this review phase were asked to provide any knowledge of other existing sources of the data needed for this study, including the name of the source and, if known, the frequency of collection, reporting format, and data elements recorded. The pretest findings pointed to problems with respondents’ interpretation of some questionnaire items and definitions. Another major finding was that the survey did not provide adequate coverage of the various approaches used by institutions to evaluate student need for remediation.


Feedback from the expert panels: Survey development also benefited from the input of subject-matter experts in two stages. Prior to the pretest, the Governing Board and Westat convened a 1-day meeting with a Technical Review Panel to discuss questionnaire and sampling issues. To further explore questionnaire issues that were revealed during the pretest, feedback was also sought from a panel of content experts on the topic. A total of seven content experts participated in a conference call to provide feedback on two versions of a draft questionnaire. Findings from the discussion confirmed the use of varied and complex approaches to evaluate student preparedness and the potential for further refinement of the questionnaire.



A.9. Payments or Gifts to Respondents


No payment or gift will be provided as incentive to respond to the pilot test or full-scale survey.



A.10. Assurance of Confidentiality to Respondents


An assurance of confidentiality will be contained in the pre-notification letter, cover letters, reminder postcard, and survey instrument for the pilot test and full-scale survey. Westat, an outside organization, brings to the study its recognized reputation for maintaining strict confidentiality of data. The assurance of confidentiality will be an incentive for potential respondents to participate; there is no agency regulation or policy that requires confidentiality of the test scores used for placement of students in postsecondary education and, in many cases, this information is publicly available.


All Westat staff members working on the study are required to sign Westat’s confidentiality pledge, which appears as Exhibit 1.



Exhibit 1. Westat confidentiality statement


WESTAT, INC.

EMPLOYEE OR CONTRACTOR'S ASSURANCE OF CONFIDENTIALITY OF SURVEY DATA


Statement of Policy


Westat is firmly committed to the principle that the confidentiality of individual data obtained through Westat surveys must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at time of interview (or self-response), or whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements which are to be adhered to strictly.


Procedures for Maintaining Confidentiality


1. All Westat employees and field workers shall sign this assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.


2. Field workers shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during field work. Field workers shall exercise reasonable caution to prevent access by others to survey data in their possession.


3. Unless specifically instructed otherwise for a particular project, an employee or field worker, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.


4. Survey data containing personal identifiers in Westat offices shall be kept in a locked container or a locked room when not being used each working day in routine survey activities. Reasonable caution shall be exercised in limiting access to survey data to only those persons who are working on the specific project and who have been instructed in the applicable confidentiality requirements for that project.


Where survey data have been determined to be particularly sensitive by the Corporate Officer in charge of the project or the President of Westat, such survey data shall be kept in locked containers or in a locked room except when actually being used and attended by a staff member who has signed this pledge.


5. Ordinarily, serial numbers shall be assigned to respondents prior to creating a machine-processible record and identifiers such as name, address, and Social Security number shall not, ordinarily, be a part of the machine record. When identifiers are part of the machine data record, Westat's Manager of Data Processing shall be responsible for determining adequate confidentiality measures in consultation with the project director. When a separate file is set up containing identifiers or linkage information which could be used to identify data records, this separate file shall be kept locked up when not actually being used each day in routine survey activities.


6. When records with identifiers are to be transmitted to another party, such as for keypunching or key taping, the other party shall be informed of these procedures and shall sign an Assurance of Confidentiality form.


7. Each project director shall be responsible for ensuring that all personnel and contractors involved in handling survey data on a project are instructed in these procedures throughout the period of survey performance. When there are specific contractual obligations to the client regarding confidentiality, the project director shall develop additional procedures to comply with these obligations and shall instruct field staff, clerical staff, consultants, and any other persons who work on the project in these additional procedures. At the end of the period of survey performance, the project director shall arrange for proper storage or disposition of survey data including any particular contractual requirements for storage or disposition. When required to turn over survey data to our clients, we must provide proper safeguards to ensure confidentiality up to the time of delivery.


8. Project directors shall ensure that survey practices adhere to the provisions of the U.S. Privacy Act of 1974 with regard to surveys of individuals for the Federal Government. Project directors must ensure that procedures are established in each survey to inform each respondent of the authority for the survey, the purpose and use of the survey, the voluntary nature of the survey (where applicable) and the effects on the respondents, if any, of not responding.


PLEDGE

I hereby certify that I have carefully read and will cooperate fully with the above procedures. I will keep completely confidential all information arising from surveys concerning individual respondents to which I gain access. I will not discuss, disclose, disseminate, or provide access to survey data and identifiers except as authorized by Westat. In addition, I will comply with any additional procedures established by Westat for a particular contract. I will devote my best efforts to ensure that there is compliance with the required procedures by personnel whom I supervise. I understand that violation of this pledge is sufficient grounds for disciplinary action, including dismissal. I also understand that violation of the privacy rights of individuals through such unauthorized discussion, disclosure, dissemination, or access may make me subject to criminal or civil penalties. I give my personal pledge that I shall abide by this assurance of confidentiality.


Signature


A.11. Questions of a Sensitive Nature


There are no questions on sexual behavior and attitudes, religious beliefs, or other matters that are commonly considered private.



A.12. Estimates of the Hour Burden


For the pilot test, approximately 120 postsecondary institutions will be contacted and asked to respond to the survey one time (table A-1). At a response rate of 85 percent, the initial sample will yield about 100 completed questionnaires. Based on a response burden of approximately 45 minutes per completed questionnaire, the estimated burden to complete the questionnaire is about 75 hours. It is anticipated that about 25 percent of the sample will have returned the completed survey before nonresponse follow-up begins and that about 75 percent of the sample (i.e., 90 respondents) will receive a nonresponse follow-up call lasting about 5 minutes. The total estimated burden for nonresponse follow-up is about 8 hours. The total number of burden hours for data collection and nonresponse follow-up is estimated at about 83 hours.


Table A-1. Estimated burden for data collection and nonresponse follow-up for the pilot test

Type of collection            Sample size    Estimated response rate (percent)    Estimated number of responses    Burden hours per response    Respondent burden hours
Questionnaire                 120            85                                   100                              0.75                         75
Nonresponse follow-up call    120            75                                   90                               0.083                        8

Total burden = 83 hours
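
The burden estimates above follow directly from the sample sizes, expected response rates, and per-response times. The short Python sketch below reproduces the pilot-test arithmetic in table A-1; the function name and the rounding conventions are illustrative assumptions, not part of the study procedures.

    def burden_hours(sample_size, response_rate, hours_per_response):
        """Approximate respondents and burden hours for one collection activity."""
        respondents = round(sample_size * response_rate, -1)  # rounded to the nearest ten, as reported
        return respondents, respondents * hours_per_response

    # Pilot test (table A-1): 45-minute questionnaire, 5-minute nonresponse follow-up call
    q_resp, q_hours = burden_hours(120, 0.85, 45 / 60)   # about 100 respondents, 75.0 hours
    f_resp, f_hours = burden_hours(120, 0.75, 5 / 60)    # about 90 calls, 7.5 hours (reported as 8)
    total = round(q_hours) + round(f_hours)               # 75 + 8 = 83 burden hours, as in table A-1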







For the full-scale survey, approximately 1,670 postsecondary institutions will be contacted and asked to respond to the survey one time (table A-2). At a response rate of 85 percent, the initial sample will yield about 1,420 completed questionnaires. Based on a response burden of approximately 45 minutes per completed questionnaire, the estimated burden to complete the questionnaire is about 1,065 hours. It is anticipated that about 25 percent of the sample will have returned the completed survey before nonresponse follow-up begins and that about 75 percent of the sample (i.e., 1,250 respondents) will receive a nonresponse follow-up call lasting about 5 minutes. The total estimated burden for nonresponse follow-up is about 105 hours. The total number of burden hours for data collection and nonresponse follow-up is about 1,170 hours.


Table A-2. Estimated burden for data collection and nonresponse follow-up for the full-scale survey

Type of collection            Sample size    Estimated response rate (percent)    Estimated number of responses    Burden hours per response    Respondent burden hours
Questionnaire                 1,670          85                                   1,420                            0.75                         1,065
Nonresponse follow-up call    1,670          75                                   1,250                            0.083                        105

Total burden = 1,170 hours
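
The same illustrative arithmetic applies to the full-scale survey; the snippet below reuses the hypothetical burden_hours() helper sketched after table A-1, with figures taken from table A-2. Small differences from the reported totals reflect rounding in the table.

    # Full-scale survey (table A-2), reusing burden_hours() from the pilot-test sketch
    q_resp, q_hours = burden_hours(1670, 0.85, 45 / 60)  # about 1,420 respondents, 1,065.0 hours
    f_resp, f_hours = burden_hours(1670, 0.75, 5 / 60)   # about 1,250 calls, roughly 104 hours
    # table A-2 reports the follow-up burden as about 105 hours and the total as about 1,170 hours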






A.13. Total Annual Cost Burden to Respondents


There will be no annual cost burden to respondents resulting from the collection of information.



A.14. Annualized Costs to the Federal Government


The survey will be conducted under a contract that has already been awarded. The total estimated cost of this project is $499,000. The contract budget is based on personnel hours, printing, mailing expenses, and computer support and analysis.



A.15. Reasons for Program Changes


The increase in burden is due to a need for information, not otherwise available, that is associated with the agency’s mission of improving the form, content, use, and reporting to the public of National Assessment of Educational Progress results. The information to be collected will help validate statements about the preparedness of 12th grade students for postsecondary education and training.



A.16. Plans for Publication


There are no plans to publish the results of the pilot test; the results from the pilot test will be used to refine the questionnaire for the full-scale survey.


The results from the full-scale survey will be used, along with other research, as validity evidence to support statements about 12th grade student preparedness for postsecondary education and training to be made in NAEP reports. Survey responses will be weighted to produce national estimates. Most of the analyses of the questionnaire data will be descriptive in nature, providing the Governing Board with tables of estimates and standard errors. Tabulations will be produced for each data item, and crosstabulations of data items will be made with selected classification variables. These include institutional characteristics such as the following (a minimal illustrative sketch of these computations appears after the list).

  • Institution level;

  • Institution control;

  • Enrollment size; and

  • Selectivity of four-year institutions.
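
As a rough illustration of the descriptive analyses described above, the Python sketch below computes a weighted national estimate and an approximate standard error for a single yes/no data item and then crosstabulates it by institution level. The records, weights, and variable names are hypothetical, and the simple variance formula ignores the survey's actual design; production estimates would use weighting and variance procedures appropriate to the study's sample design.

    import math
    from collections import defaultdict

    # Hypothetical institution records: (survey weight, institution level, uses a placement test: 1 = yes, 0 = no)
    records = [
        (12.5, "2-year", 1),
        (12.5, "2-year", 0),
        (8.0,  "4-year", 1),
        (8.0,  "4-year", 1),
    ]

    def weighted_estimate(rows):
        """Weighted proportion and a crude standard error based on an effective sample size."""
        total_w = sum(w for w, _, _ in rows)
        p = sum(w * y for w, _, y in rows) / total_w
        # effective sample size; a production analysis would use replicate weights or
        # Taylor-series linearization reflecting the actual sample design
        n_eff = total_w ** 2 / sum(w ** 2 for w, _, _ in rows)
        return p, math.sqrt(p * (1 - p) / n_eff)

    print(weighted_estimate(records))            # national estimate and approximate standard error
    by_level = defaultdict(list)                 # crosstabulation by institution level
    for row in records:
        by_level[row[1]].append(row)
    for level, rows in by_level.items():
        print(level, weighted_estimate(rows))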



A.17. Approval Not to Display the Expiration Date of OMB Approval


Such approval is not being sought.



A.18. Exceptions to the Certification Statement


There are no exceptions to the certification statement.



