
OMB: 3098-0006


Request for OMB Clearance

Evaluating Student Need for Developmental or Remedial Courses

at Postsecondary Education Institutions

(Formerly titled Survey of Placement Tests and Cut-Scores in Higher Education Institutions)



Supporting Statement for Paperwork Reduction Act Submissions


REVISED 1-10-11



Supporting Statement A: Justification



A.1. Circumstances Necessitating the Information Collection


The congressionally authorized National Assessment of Educational Progress (NAEP) is the only continuing source of comparable national and state data available to the public on the achievement of students at grades 4, 8, and 12 in core subjects. The National Assessment Governing Board oversees and sets policy for NAEP. NAEP and the Governing Board are authorized under the National Assessment of Educational Progress Authorization Act (P.L.107-279).


Among the Board’s responsibilities is “to improve the form, content, use, and reporting of [NAEP results].” Toward this end, the Governing Board established a national commission to make recommendations to improve the assessment and reporting of NAEP at the 12th grade. In its March 2004 report, the commission noted the importance of maintaining NAEP at the 12th grade as a measure of the “output” of K-12 education in the United States and as an indicator of the nation’s human capital potential. The commission recommended that 12th grade NAEP be redesigned to report on the academic preparedness of 12th grade students in reading and mathematics for entry-level college credit coursework. The commission concluded that having such information is essential for the economic well-being and security of the United States and that NAEP is uniquely positioned to provide such information.


As the Governing Board has been developing ways to implement the commission’s recommendations, there has been a wider recognition—among federal and state policymakers, educators, and the business community—of the importance of a rigorous high school program that results in meaningful high school diplomas and prepares students for college and for training for good jobs.


As part of implementing the commission’s recommendations, the Governing Board has planned a program of research studies to support the validity of statements about 12th grade student preparedness in reading and mathematics. Among the studies planned is a proposed survey of two-year and four-year postsecondary institutions about the use of tests and test scores for placing first-year students into entry-level credit-bearing courses and into remedial/developmental courses in mathematics and reading. The data resulting from this survey will be used, along with the results of the other planned studies, to help develop valid statements that can be made about the preparedness of 12th grade students in NAEP reports.


Exploratory survey development work with expert panels and the results from a pretest of the survey instrument indicate that approaches used by postsecondary institutions to evaluate student preparedness are complex and varied, especially with regard to the use of various tests in combination with other evaluation criteria. Thus, the Governing Board requested and obtained OMB clearance for a pilot test consisting of a survey of 120 sampled two-year and four-year postsecondary institutions. The pilot test was conducted in fall 2010, and a report of findings is attached (Attachment A). The purpose of the pilot test was to identify potential problems that might emerge during data collection for the full-scale survey. In addition, the pilot test provided insights into the extent to which institutions use various tests to identify student need for remediation in mathematics and reading. Two slightly different versions of the questionnaire were tested, one for two-year institutions and the other for four-year institutions. The only difference between the two versions of the survey was that two-year institutions were asked to consider placement policies affecting entering students in programs designed to transfer to a four-year institution, while four-year institutions were asked to consider entering students enrolled in an undergraduate degree program in the liberal arts and sciences. The survey was limited to four pages of questions. It was based on information readily available to respondents and could be completed by most individuals in about 45 minutes. The cover letter included information about the option to complete a web version of the survey. The pilot test survey was mailed to institutions in October 2010.



The Governing Board is now requesting clearance for the full-scale data collection with about 1,700 sampled two-year and four-year postsecondary institutions. As with the pilot test, the full-scale study will be a self-administered survey addressed to the President of the institution with a cover letter requesting that the survey be completed by the appropriate individual or office. The package will also contain a page of information about the Governing Board and how this survey fits into its program of study on the preparedness of 12th graders.


Respondents will have the option of completing a web version of the survey accessed through the Internet or a traditional paper-and-pencil questionnaire. Two slightly different versions of the questionnaire will be mailed to institutions, one for two-year institutions (see Attachment B-1) and the other for four-year institutions (see Attachment B-2). The only difference between the two versions of the survey is that two-year institutions will be asked to consider placement policies affecting entering students in programs that are designed to transfer to a four-year institution, while four-year institutions will be asked to consider entering students who are enrolled in an undergraduate degree program in the liberal arts and sciences. The survey will be limited to four pages of questions. It will be based on information readily available to respondents and can be completed by most individuals in about 30 minutes. The cover letter (see Attachment C) will include information about the option to complete a web version of the survey. The full-scale survey will be mailed to institutions in late February or early March 2011.



A.2. Purpose of the Information


Information from the pilot test was used to refine the questionnaires to ensure that they are clearly worded, minimize respondent burden, and provide adequate coverage of the various types of tests and test scores used by institutions to determine student need for remediation in mathematics and reading (see details in section A.1). The pilot test allowed for rigorous testing of questionnaire issues and survey administration because it mirrored the data collection procedures for the main study; this will help avoid potential problems in the main data collection. The pilot study also provided insights into the extent to which institutions use various tests, either independently or in combination with other evaluation criteria, to identify student need for remediation in mathematics and reading.


The Governing Board will use the information from the full-scale survey of higher education institutions, along with the results of other planned research, to serve as validity evidence to develop and support statements to be made in NAEP reports about 12th grade student preparedness. To this end, the survey will collect information on tests and test scores used by two-year and four-year postsecondary institutions to identify student need for remediation in mathematics and reading.



A.3. Use of Information Technology


As in the pilot test, sampled institutions in the full-scale survey will be encouraged to complete a web version of the questionnaire accessed through the Internet. Institutions will also be given the option of completing the survey using a traditional paper-and-pencil questionnaire. When paper versions of the questionnaire are used, they will be transmitted to and from respondents by fax and postal mail, based on respondents’ preferences. In addition, the email address for the contractor (Westat) responsible for answering respondent questions will be included on the front of the questionnaire. Westat will also use mass email reminders to prompt nonrespondents to complete the survey. These procedures are all designed to minimize the burden on respondents. For example, the use of various modes of communication will allow respondents flexibility in completing the survey and obtaining clarification on any data collection issues that may arise.



A.4. Duplication


An extensive review of pertinent literature and on-line resources found no documentation of nationwide studies or other data collection activities to identify the assessments/tests and scores used for determining student need for remedial/developmental mathematics and reading classes in postsecondary institutions.


The review consisted of the following activities under a previous contract with ACT. The Director of ACT’s Information Resource Center conducted a broad, on-line review to identify any source of nationally representative studies or sources that might already collect the information of relevance to this study; she found no such source. ACT uses the Institutional Data Questionnaire (IDQ) to annually collect a wide variety of information about almost all two-year, four-year, and other postsecondary institutions in the United States and in some foreign countries; this information base is routinely updated. The IDQ contains data on the tests/assessments, but not the scores, used by postsecondary institutions for placement in the general subject areas of English, mathematics, reading, and science and the ACT scores used for placement in selected English, mathematics, reading, and science courses. The IDQ data do not include information about placement into remedial courses and the data are not necessarily nationally representative. Therefore, the IDQ is not a source of the comprehensive data needed for this study. Based on these findings, the proposed study will not collect data that duplicates information from any existing source.



A.5. Impact on Small Business


The information collection in the pilot test and full-scale survey does not affect small businesses or other small entities.



A.6. Consequence if Collection Not Conducted


Information from the full-scale survey is a key component in determining what knowledge, skills, and ability in reading and mathematics constitute preparedness for entry into college credit course work. The information to be collected will help validate statements about the preparedness of 12th grade students for postsecondary education and training. As such, it is essential to collect these data.



A.7. Special Circumstances


There are no special circumstances that would require the information collection in the pilot test or full-scale survey to be conducted in a manner consistent with any of the instances cited.



A.8. Federal Register Notice and Comments; Efforts to Consult with Persons Outside the Agency


The agency’s original 60-day Federal Register notice seeking public comment on the information collection for the full-scale survey was published on December 16, 2008, at page 76350 of the Federal Register. Two individuals responded. No comments were received about burden hours.


One commenter wrote as follows: “i [sic] do not think the information for the public is worth this collection effort. it [sic] costs too many tax dollars for non productive information.” No changes were made as a result of this comment. A second commenter suggested minor edits to the proposed survey as it existed at that time, and those suggestions resulted in changes to the affected items.


In addition to seeking public comment through the Federal Register notice, several measures were taken to obtain comments from individuals outside the agency. To date, survey development for the study has benefited from feedback from potential respondents at postsecondary institutions, input from expert panels, and feedback from a pilot test of 120 postsecondary institutions.


Pretest of Questionnaire: To obtain feedback from potential respondents, a draft questionnaire was pretested with a total of seven institutions. The pretest respondents were asked to review the questionnaire and provide feedback about 1) the clarity of the project’s purpose as described on the instrument; 2) the clarity of instructions; 3) the clarity, quality, and appropriateness of the items to collect the information necessary for the study; 4) an estimation of the time necessary for completing the instrument; and 5) any suggestions for determining the entry-level programs for which respondents will be asked to provide course placement information. In addition, those participating in this review phase were asked to provide any knowledge of other existing sources of the data needed for this study, including the name of the source and, if known, the frequency of collection, reporting format, and data elements recorded. The pretest findings pointed to problems with respondents’ interpretation of some questionnaire items and definitions. Another major finding was that the survey did not provide adequate coverage of the various approaches used by institutions to evaluate student need for remediation.


Feedback from the expert panels: Survey development also benefited from the input of subject-matter experts in two stages. Prior to the pretest, the Governing Board and Westat convened a 1-day meeting with a Technical Review Panel to discuss questionnaire and sampling issues. To further explore questionnaire issues that were revealed during the pretest, feedback was also sought from a panel of content experts on the topic. A total of seven content experts participated in a conference call to provide feedback on two versions of a draft questionnaire. Findings from the discussion confirmed the use of varied and complex approaches to evaluate student preparedness and the potential for further refinement of the questionnaire.


Pilot test of Questionnaire: A pilot test was conducted with 120 postsecondary institutions to explore questionnaire issues and potential hurdles to full-scale data collection. The findings, summarized in the attached report (Attachment A), were used to inform changes to the survey instrument and data collection approaches.



A.9. Payments or Gifts to Respondents


No payment or gift will be provided as an incentive to respond to the pilot test or full-scale survey.



A.10. Assurance of Confidentiality to Respondents


As in the pilot test, the following statement of data confidentiality will be contained in the cover letter and survey instruments for the full-scale study:


The information provided by your institution will be kept private to the extent permitted by law. Data for this study will be reported in aggregate form; the information provided by your institution will be combined with that from other participating institutions to produce statistical summaries and reports.


Westat, an outside contractor, brings to the study a recognized reputation as an organization that maintains strict confidentiality of data. The confidentiality statement is expected to serve as an incentive for potential respondents to participate; there is no agency regulation or policy that requires confidentiality of the test scores used for placement of students in postsecondary education, and in many cases this information is publicly available.


All Westat staff members working on the study are required to sign Westat’s confidentiality pledge, which appears as Exhibit 1.



Exhibit 1. Westat confidentiality statement


WESTAT, INC.

EMPLOYEE OR CONTRACTOR'S ASSURANCE OF CONFIDENTIALITY OF SURVEY DATA


Statement of Policy


Westat is firmly committed to the principle that the confidentiality of individual data obtained through Westat surveys must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at time of interview (or self-response), or whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements which are to be adhered to strictly.


Procedures for Maintaining Confidentiality


1. All Westat employees and field workers shall sign this assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.


2. Field workers shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during field work. Field workers shall exercise reasonable caution to prevent access by others to survey data in their possession.


3. Unless specifically instructed otherwise for a particular project, an employee or field worker, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.


4. Survey data containing personal identifiers in Westat offices shall be kept in a locked container or a locked room when not being used each working day in routine survey activities. Reasonable caution shall be exercised in limiting access to survey data to only those persons who are working on the specific project and who have been instructed in the applicable confidentiality requirements for that project.


Where survey data have been determined to be particularly sensitive by the Corporate Officer in charge of the project or the President of Westat, such survey data shall be kept in locked containers or in a locked room except when actually being used and attended by a staff member who has signed this pledge.


5. Ordinarily, serial numbers shall be assigned to respondents prior to creating a machine-processible record and identifiers such as name, address, and Social Security number shall not, ordinarily, be a part of the machine record. When identifiers are part of the machine data record, Westat's Manager of Data Processing shall be responsible for determining adequate confidentiality measures in consultation with the project director. When a separate file is set up containing identifiers or linkage information which could be used to identify data records, this separate file shall be kept locked up when not actually being used each day in routine survey activities.


6. When records with identifiers are to be transmitted to another party, such as for keypunching or key taping, the other party shall be informed of these procedures and shall sign an Assurance of Confidentiality form.


7. Each project director shall be responsible for ensuring that all personnel and contractors involved in handling survey data on a project are instructed in these procedures throughout the period of survey performance. When there are specific contractual obligations to the client regarding confidentiality, the project director shall develop additional procedures to comply with these obligations and shall instruct field staff, clerical staff, consultants, and any other persons who work on the project in these additional procedures. At the end of the period of survey performance, the project director shall arrange for proper storage or disposition of survey data including any particular contractual requirements for storage or disposition. When required to turn over survey data to our clients, we must provide proper safeguards to ensure confidentiality up to the time of delivery.


8. Project directors shall ensure that survey practices adhere to the provisions of the U.S. Privacy Act of 1974 with regard to surveys of individuals for the Federal Government. Project directors must ensure that procedures are established in each survey to inform each respondent of the authority for the survey, the purpose and use of the survey, the voluntary nature of the survey (where applicable) and the effects on the respondents, if any, of not responding.


PLEDGE

I hereby certify that I have carefully read and will cooperate fully with the above procedures. I will keep completely confidential all information arising from surveys concerning individual respondents to which I gain access. I will not discuss, disclose, disseminate, or provide access to survey data and identifiers except as authorized by Westat. In addition, I will comply with any additional procedures established by Westat for a particular contract. I will devote my best efforts to ensure that there is compliance with the required procedures by personnel whom I supervise. I understand that violation of this pledge is sufficient grounds for disciplinary action, including dismissal. I also understand that violation of the privacy rights of individuals through such unauthorized discussion, disclosure, dissemination, or access may make me subject to criminal or civil penalties. I give my personal pledge that I shall abide by this assurance of confidentiality.


Signature


A.11. Questions of a Sensitive Nature


There are no questions on sexual behavior and attitudes, religious beliefs, or other matters that are commonly considered private.



A.12. Estimates of the Hour Burden


Pilot Test: For the pilot test, 120 postsecondary institutions were contacted and asked to respond to the survey only one time (table A-1). At a response rate of 85 percent, the initial sample was expected to yield about 100 completed questionnaires. Based on a response burden of approximately 45 minutes per completed questionnaire, the estimated burden to complete the questionnaire was about 75 hours. It was anticipated that about 25 percent of the sample would return the completed survey before nonresponse follow-up began and that about 75 percent of the sample (i.e., 90 respondents) would receive a nonresponse follow-up call lasting about 5 minutes. The total estimated burden for nonresponse follow-up was about 8 hours, and the total estimated burden for data collection and nonresponse follow-up was about 83 hours. However, data from the pilot test indicate that the actual average response time was 20 minutes.
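For exposition only, the short Python sketch below reproduces the pilot test burden arithmetic from the quantities stated above. It is not part of the study materials, and the rounding conventions are assumptions chosen to mirror the "about" figures reported in the text and in table A-1.

```python
import math

# Illustrative sketch reproducing the pilot test burden arithmetic (Table A-1).
# Inputs are taken from the text; rounding steps are assumptions mirroring the
# document's "about" values.

expected_completes = 100            # about 85 percent of the 120 sampled institutions
minutes_per_questionnaire = 45      # approximate time per completed questionnaire

followup_calls = 90                 # about 75 percent of the sample receives a call
minutes_per_call = 5                # approximate length of each follow-up call

questionnaire_hours = expected_completes * minutes_per_questionnaire / 60  # 75.0 hours
followup_hours = followup_calls * minutes_per_call / 60                    # 7.5 hours, reported as about 8

total_burden_hours = round(questionnaire_hours) + math.ceil(followup_hours)  # 75 + 8 = 83
print(questionnaire_hours, followup_hours, total_burden_hours)
```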


Table A-1. Estimated burden for data collection and nonresponse follow up for the pilot test


Type of Collection            Sample size   Estimated response rate/      Estimated number   Total burden hours   Respondent
                                            followup required (percent)   of responses       per respondent       burden hours

Questionnaire                 120           85                            100                .75                  75

Nonresponse follow-up call    120           75                            90                 .083                 8


Total burden = 83 hours









Full-scale data collection: For the full-scale survey, approximately 1,670 postsecondary institutions will be contacted and asked to respond to the survey only one time (table A-2). Based on findings from the pilot test and on changes to the questionnaire that encourage respondents to provide comments in the comment boxes, the estimated time to complete the survey is 30 minutes. At a response rate of 85 percent, the initial sample will yield about 1,420 completed questionnaires. Based on a response burden of approximately 30 minutes per completed questionnaire, the estimated burden to complete the questionnaire is about 710 hours. It is anticipated that about 5 percent of the sample will have returned the completed survey before nonresponse follow-up begins and that about 95 percent of the sample (i.e., 1,586 respondents) will receive one to four nonresponse follow-up calls for an average of 10 minutes per respondent. The total estimated burden for nonresponse follow-up with respondents is about 265 hours. The total number of burden hours for data collection and nonresponse follow-up is about 975 hours.
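As with the pilot test, the following Python sketch is provided for exposition only; it reproduces the full-scale burden arithmetic from the quantities stated above, with rounding conventions assumed to mirror the "about" figures in the text and in table A-2.

```python
import math

# Illustrative sketch reproducing the full-scale burden arithmetic (Table A-2).
# Inputs are taken from the text; rounding steps are assumptions mirroring the
# document's "about" values.

expected_completes = 1420               # about 85 percent of the 1,670 sampled institutions
minutes_per_questionnaire = 30          # estimated time per completed questionnaire

followup_respondents = 1586             # about 95 percent of the sample receives calls
minutes_of_calls_per_respondent = 10    # average total call time per followed-up respondent

questionnaire_hours = expected_completes * minutes_per_questionnaire / 60      # 710.0 hours
followup_hours = followup_respondents * minutes_of_calls_per_respondent / 60   # about 264.3 hours, reported as 265

total_burden_hours = round(questionnaire_hours) + math.ceil(followup_hours)    # 710 + 265 = 975
print(questionnaire_hours, followup_hours, total_burden_hours)
```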



Table A-2. Estimated burden for data collection and nonresponse follow up for the full-scale survey


Type of Collection            Sample size   Estimated response rate/      Estimated number   Total burden hours   Respondent
                                            followup required (percent)   of responses       per respondent       burden hours

Questionnaire                 1,670         85                            1,420              .50                  710

Nonresponse follow-up call    1,670         95                            1,586              .167                 265


Total burden = 975 hours








A.13. Total Annual Cost Burden to Respondents


There will be no total annual cost burden to respondents resulting from the collection of information.



A.14. Annualized Costs to the Federal Government


The survey will be conducted under a contract that has already been awarded. The total estimated cost of this project is $562,625. The contract budget is based on personnel hours, printing, mailing expenses, and computer support and analysis.



A.15. Reasons for Program Changes


The increase in burden is due to the need for information, not otherwise available, that is associated with the agency’s mission of improving the form, content, use, and reporting of results reported to the public by the National Assessment of Educational Progress. The information to be collected will help validate statements in NAEP reports about the preparedness of 12th grade students for postsecondary education and training.



A.16. Plans for Publication


The results from the pilot test were used to refine the questionnaire for the full-scale survey.


The results from the full-scale survey will be used, along with other research, as validity evidence to support statements about 12th grade student preparedness for postsecondary education and training to be made in NAEP reports. Survey responses will be weighted to produce national estimates. Most of the analyses of the questionnaire data will be descriptive in nature, providing the Governing Board with tables of estimates and standard errors. Tabulations will be produced for each data item, and crosstabulations of data items will be made with selected classification variables. These include institutional characteristics such as the following (an illustrative sketch of a weighted tabulation follows the list).

  • Institution level;

  • Institution control;

  • Selectivity; and

  • Enrollment size.
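
To illustrate the kind of descriptive, weighted tabulation described above, the hypothetical Python sketch below computes a weighted national estimate and a crosstabulation by institution level. The variable names, weights, and data values are invented for illustration; this is not the contractor's analysis code, and standard error estimation (which in practice would use an appropriate survey variance method) is omitted.

```python
import pandas as pd

# Hypothetical respondent-level data with survey weights; the column names and
# values are illustrative assumptions, not actual survey variables.
responses = pd.DataFrame({
    "institution_level": ["2-year", "4-year", "2-year", "4-year"],
    "uses_placement_test": [1, 1, 0, 1],   # 1 = institution reports using a placement test
    "weight": [12.5, 8.0, 12.5, 8.0],      # survey weights producing national estimates
})

# Weighted national estimate: percent of institutions using a placement test.
weighted_percent = 100 * (responses["uses_placement_test"] * responses["weight"]).sum() \
                   / responses["weight"].sum()

# Crosstabulation of the same item by institution level.
by_level = (
    responses.assign(weighted_yes=responses["uses_placement_test"] * responses["weight"])
             .groupby("institution_level")[["weighted_yes", "weight"]]
             .sum()
)
by_level["percent_using_test"] = 100 * by_level["weighted_yes"] / by_level["weight"]

print(round(weighted_percent, 1))
print(by_level["percent_using_test"])
```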



A.17. Approval Not to Display the Expiration Date of OMB Approval


Such approval is not being sought.



A.18. Exceptions to the Certification Statement


There are no exceptions to the certification statement.

