National Study on Alternative Assessments (NSAA) Teacher Survey

Response to OMB Comments

OMB: 1850-0860


October 6, 2008 (revised October 23, 2008)


TO: Kathy Axt, RIMS

FROM: David Malouf, NCSER, IES

RE: OMB Questions of September 29, 2008 to 200808-1850-004: NSAA Teacher Study

Added 10/23/2008: Responses to OMB’s questions of October 16, 2008


OMB’s questions of 9/29 appear in italics, with IES responses in normal font. IES responses to OMB’s 10/16 comments are labeled as such and enclosed in brackets.


1. Please provide a summary of the results from the earlier phases of the congressionally-mandated studies.

Several studies were mandated by Section 664 of the Individuals with Disabilities Education Act (IDEA) as reauthorized in 2004. These include the Section 664(c) “Study on Ensuring Accountability for Students Who Are Held to Alternative Achievement Standards,” which is the focus of the current information collection request. This study is being administered by the National Center for Special Education Research (NCSER) through a project called the National Study on Alternate Assessments (NSAA). In addition, there is a program of studies comprising the Section 664(b) “Assessment of National Activities”; these studies are being administered by the National Center for Education Evaluation and Regional Assistance (NCEE) and are not included in the current information collection request.

These studies are at different stages of research planning, data collection, data analysis, and report preparation, but none has reached a point of having reports or findings completed and cleared by IES for public release. It is our understanding that information shared with OMB may be available to the public, and we foresee potential difficulties in making draft reports or preliminary findings publicly available. Therefore, we respectfully request that OMB allow us to respond to this item at a later date after findings and reports are completed and have been cleared by IES for public release.

2. How exactly does NCSER plan to use the preceding studies and this survey in tandem to answer the legislative requirements? For example, will the findings on assessment validity and reliability be used as variables in interpreting the results of the teacher surveys?

IDEA Section 664(c) called for “a national study or studies to examine (1) the criteria that States use to determine--(A) eligibility for alternate assessments; and (B) the number and type of children who take those assessments and are held accountable to alternative achievement standards; (2) the validity and reliability of alternate assessment instruments and procedures; (3) the alignment of alternate assessments and alternative achievement standards to State academic content standards in reading, mathematics, and science; and (4) the use and effectiveness of alternate assessments in appropriately measuring student progress and outcomes specific to individualized instructional need.”

NSAA data collection activities are designed to address different parts of this legislative requirement. The document review and state survey activities, which were previously cleared (OMB Control Number: 1850-0820), primarily addressed the first three requirements (i.e., criteria for eligibility, numbers and types of children, validity and reliability, and alignment to state content standards). We are preparing to report these data in the form of state and national profiles that cover such topics as alignment with academic content standards, alternate assessment approaches, procedures for developing alternate achievement standards, technical quality of assessments, eligibility criteria, administration, scoring, and reporting processes, and student participation and proficiency data. These profiles cover all fifty states plus the District of Columbia.

The survey covered in the current information collection request focuses primarily on the fourth legislative requirement: “the use and effectiveness of alternate assessments in appropriately measuring student progress and outcomes specific to individualized instructional need.” The data will allow us to examine, in states with stable and mature alternate assessments, the degree to which other elements of standards-based reform are being implemented to allow alternate assessments to appropriately measure student progress and outcomes, to respond to individual student needs, and to contribute to possible improvements in student proficiency.

The results of the prior activities were used to identify states to be sampled in the current survey on the basis of the approval status, maturity, and stability of the alternate assessment system. In addition, data from the prior activities will provide contextual information in the analysis and reporting of results from the current survey, recognizing that the design does not allow causal inferences to be drawn between factors identified in prior activities and the findings of the current teacher survey.

[Response to 10/16 request for a matrix of Congressional requirements and sources of information:

Congressional requirement | Source of information | Method used for obtaining information
(a) eligibility for alternate assessments | Document review / state assessment staff | Document analysis / telephone interview
(b) the number and type of children who take those assessments and are held accountable to alternative achievement standards | Document review / state assessment staff | Document analysis / telephone interview
(c) the validity and reliability of alternate assessment instruments and procedures | Document review / state assessment staff | Document analysis / telephone interview
(d) the alignment of alternate assessments and alternative achievement standards to State academic content standards in reading, mathematics, and science | Document review / state assessment staff | Document analysis / telephone interview
(e) the use and effectiveness of alternate assessments in appropriately measuring student progress and outcomes specific to individualized instructional need | Teachers | Teacher survey]


3. What is the literature behind the presumption that teacher opinions alone are sufficient for gauging the quality of available professional development, curriculum materials, stakeholder understanding of academic content, and other items about which the survey asks?

The decision to use a teacher survey resulted from extended discussion among IES staff, the study’s contractor, and the study’s technical work group (TWG). In our supporting statement, we proposed a model of standards-based reform as a framework for studying “the use and effectiveness of alternate assessments in appropriately measuring student progress and outcomes specific to individualized instructional need.” Teachers are clearly central in linking the elements of standards-based reform to student progress and outcomes. For this reason, teachers were chosen as an appropriate focus that could be surveyed within the time and resource limitations of this study.

Concerns can be raised about teachers’ ability to report accurately on their own circumstances and behavior, particularly when social desirability factors are embedded within survey content. However, we feel that adequate efforts have been made to design items that make appropriate use of teacher judgment combined with objective information (e.g., types of teacher certification, years of experience, amount of professional development received, and amount of instructional time in specific areas for students taking alternate assessments). Moreover, the instruments that form the basis of the planned survey – the Curriculum Indicator Survey (CIS) and the Learner Characteristics Inventory (LCI) – have been tested in a number of states and have shown adequate technical characteristics.

Finally, there are numerous examples of previous research studies based on teacher surveys, including the recently completed national Study of Personnel Needs in Special Education (SPeNSE) (Carlson et al., 2002); several NCES Schools and Staffing Survey (SASS) reports on topics such as professional development (Scotchmer, McGrath, & Coder, 2005) and classroom practices (Henke, Chen, & Goldman, 1999); and a number of specific research studies on teacher perceptions and instructional practice (Ross, McDougall, Hogaboam-Gray, & LeSage, 2003; Clunies-Ross, Little, & Kienhuis, 2008; Corbell, Reiman, & Nietfeld, 2008).

[Response to 10/16 question about other approaches that were considered: We considered various types of data collection instruments and methods. Regarding instruments, we considered open-ended interviews but decided to use surveys consisting of closed-ended items to facilitate the analysis and interpretation of data. Our previous data collection activities for the document review and state survey were largely open-ended and required extensive coding of data; that coding process helped inform the development of the closed-ended survey items for the current collection. Regarding methods, we considered sampling individual students who participated in alternate assessments and conducting in-depth interviews with teachers, district- and school-level administrators, and parents. However, we decided against this approach because we do not have sufficient resources to achieve an adequate sample size for this type of data collection. We feel that surveying a representative sample of teachers who administer alternate assessments in states with mature and stable alternate assessment systems is a technically sound and cost-effective approach for studying the topics we have proposed.]

4. What level of generalizability does NCSER plan to indicate it has with a non-randomly selected sample of 3 states?  Why did NCSER select 3 states rather than sampling from all eligible states? 

Several factors led to our sampling approach. First, surveying a representative sample of teachers who have administered alternate assessments based on alternate achievement standards is an expensive enterprise; the development of the sampling frame, and the availability of data to construct it, are contributing costs. NCSER does not have the resources to conduct the survey in more than three states. In addition, the sample was constrained by the number of states that met the criteria of having approved and stable alternate assessment systems and available lists of eligible teachers from which to construct the teacher sample.

NCSER is not making an argument that the findings of this survey will generalize nationally, as they might with a larger, randomly selected sample. In fact, it would be challenging and expensive to construct a nationally representative sample in the current context of change and diversity in state alternate assessments. Instead, NCSER intends to present each of the states in this survey as an illustrative example or case.

NCSER believes that the findings will both meet the congressional intent and be useful to the field.

5. Incentives

1. Absent evidence from specific studies of similar populations with similar methodology and burden requiring incentives at this level, OMB does not consider the incentive levels proposed to be justified.  Please revisit the proposed incentive strategy to bring it better into alignment with IES and OMB guidance (e.g., $30 for a high burden teacher survey).

The proposed incentives were based on National Center for Education Evaluation and Regional Assistance (NCEE) guidelines. The target population for this survey is quite small and has specialized qualifications, and the survey places high demands on respondents’ time and expertise. Based on the NCEE guidelines, a $95 incentive is within reason. All teachers are asked to complete a basic screening for $5, which will determine whether they are eligible to complete the entire survey. The entire survey is estimated to take 120 minutes. According to the NCEE guidelines, a $30 incentive would be appropriate for a “high burden” survey, but these guidelines describe a high-burden survey as a “30 minute survey of detailed information on instructional practice, school-level interventions, or parent/student histories and experiences.” The NSAA survey will require four times as long, justifying a proportionally higher incentive of $95. NCSER respectfully requests that OMB approve this incentive to allow us to achieve adequate response rates and data quality.

[Response to 10/16 comment about our proposed incentives: We propose to eliminate parts 3 and 4 of the survey and reduce the estimated burden from 120 minutes to 60 minutes. With this reduced burden, we feel that a $40 incentive will be effective and consistent with NCEE guidelines for a high-burden teacher survey ($30) combined with a high-burden rating of one student ($10). This is discussed in our revised Supporting Statement Part A. We propose to deliver the incentive in two parts: $5 attached to the survey as a token incentive to evoke a sense of obligation (as discussed in our Supporting Statement Part A), and $35 in the form of a check mailed to the teacher after he or she returns the completed survey. Research and experience suggest that the immediate delivery of a token incentive followed by a larger incentive for survey completion is an effective strategy for increasing response rates. We have revised the supporting statements and the letters to states and teachers to reflect the new burden estimate and incentive. We have also reworded the letters to clarify that the $5 is not an incentive for completing the screening questionnaire, but is instead a “thank you” for considering our request to participate in the survey.]

2. Related, absent any specific cost data from a similar study, NCSER should not assert that this incentive will pay for itself.  Much of the literature demonstrates the opposite.

NCSER does not intend to assert that the incentive will “pay for itself”. We recognize that the second sentence in the final paragraph of A9 (“Given our experience with this type of incentive, the cost will be largely offset by a reduction in costs associated with follow-up and nonresponse conversion.”) appears to make this assertion, so we propose to delete this paragraph except for the first sentence (“Exhibit 4 shows the cost of this incentive program.”).

6. Survey instrument

1. Given the burden of this instrument, since the survey indicates that its focus is math and reading, why ask questions about science? 

The introduction to the survey currently includes the following language: “…this survey is concerned primarily with the subject areas of Reading/English language arts and mathematics.” Given that science is included in the survey, we propose to revise the language to: “…this survey is concerned primarily with the subject areas of reading/English language arts, mathematics, and science.” Because the statutory language mandating this study includes science as one of the areas of focus, questions relating to science should be included. In addition, federal regulations implementing NCLB provisions for state assessments require that “alternate assessments must yield results in at least reading/language arts, mathematics, and, beginning in the 2007–2008 school year, science” (Federal Register, July 5, 2002, Vol. 67, No. 129, pp. 45041–45042; 34 C.F.R. § 200). To reduce burden, science has already been removed from the final section of the survey (pages asking about intensity of coverage, expectations, instructional time, and student participation), and we feel that the remaining science items should be retained.

2. What other content can NCSER trim?

A number of steps have been taken to make this survey as concise as possible. First, a conceptual model was drafted as a framework for the survey, and components that did not fit the conceptual model were eliminated. In addition, the NSAA TWG reviewed the survey and suggested places to shorten it, and the NSAA team then worked to eliminate any extraneous questions. The NSAA survey incorporates questions from the CIS and LCI, which should not be abbreviated if the integrity of those instruments is to be maintained. In our opinion, any additional trimming of content would yield only a minor reduction in burden at a possible cost in data quality.

3. Please confirm that the program does not apply to children younger than 8 or in a grade level lower than 3 (see item 2.1 on survey).

Yes, the study includes only children who are age 8 or older and in grade 3 or higher.

4. “Kindergarten” is missing from the pick list in item 2.4.

Kindergarten was inadvertently left off the pick list and will be added. 

7. Confidentiality

1. Please indicate the specific statute under which these data will be collected and the applicable confidentiality statute in Section A10 of the supporting statement.

We propose to add the following language at the beginning of section A10 before the language which is currently in that section:

“Respondents are assured that confidentiality will be maintained, except as required by law. The following statement concerning confidentiality will be included in the letters to respondents:

The collection of information in this study is authorized by Public Law 108-446, Section 664(c). Participation is voluntary. Your responses are protected from disclosure by federal statute (PL 107-279 Title I, Part C, Sec. 183). All responses that relate to or describe identifiable characteristics of individuals may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, unless otherwise compelled by law. Data will be combined to produce statistical reports. No individual data that links your name, address, telephone number, or identification number with your responses will be included in the statistical reports.

The design of the study addresses state and local concerns regarding the Family Educational Rights and Privacy Act (FERPA) and operates in accordance with the Privacy Act of 1974, as amended, (5 U.S.C. 552a). NSAA data are gathered exclusively for statistical and research purposes, without identifying individuals. Specific steps to guarantee confidentiality are discussed below.”

[End of proposed new language. The next paragraph will be as in the current draft, beginning with "SRI, Policy Studies Associates (PSA), and the University of Minnesota are dedicated...", revised as per item 9 below.]

[Response to 10/16 comment on not using “guarantee”: We have replaced “guarantee” with “help preserve”.]

2. Where is the confidentiality pledge missing from the teacher questionnaire?

The questionnaire as originally submitted to OMB included the following language about confidentiality on the final page: “This page will be removed from your recorded responses. None of your responses will be related to you personally. All results will be analyzed and reported for responses as a group.” NCSER proposes to retain this language on the final page. In addition, NCSER proposes to add the following language to the first page of the questionnaire, in a separate line under the Paperwork Burden Statement: “Data will be combined to produce statistical reports. All responses will be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, unless otherwise compelled by law.”

8. Will teachers receive the mailing packages at school or at home?

Teachers will receive materials at school.

9. Please revise the supporting statement to reflect that the submission is from the Department, not from a contractor, e.g., page 9 of part B.

We propose to make the following clarifying revisions to Supporting Statement Part A and Supporting Statement Part B:

Supporting Statement Part A:

Page 13

Current first sentence in A10: Change “SRI, Policy Studies Associates (PSA), and the University of Minnesota are dedicated…” to “ED and its contractors are dedicated…”

Supporting Statement Part B:

Page 9

Third bullet (beginning with “Following the discussions with the pilot test participants…”): Change “NSAA provided ED with a list of proposed changes…” to “NSAA discussed possible changes…”

First paragraph after Exhibit 8: Change “The list of survey revisions shown in this document is a result of this discussion with ED. According to the ED, no feedback on the draft survey or design was received from the public.” to “The list of survey revisions shown in this document is a result of this discussion. No feedback on the draft survey or design was received from the public.”

Page 10

Last sentence before Exhibit 10: Change “Exhibit 10 summarizes the changes that have been made to the survey as a result of the pilot test and discussion with ED.” to “Exhibit 10 summarizes the changes that have been made to the survey as a result of the pilot test.”

References Used in Response to Item 3

Carlson, E., Brauen, M., Klein, S., Schroll, K., & Willig, S. (2002). The Study of Personnel Needs in Special Education: Key Findings. Rockville, MD: Westat.

Clunies-Ross, P., Little, E., & Kienhuis, M. (2008). Self-reported and actual use of proactive and reactive classroom management strategies and their relationship with teacher stress and student behavior. Educational Psychology, 28(6), 693-710.

Corbell, K. A., Reiman, A. J., & Nietfeld, J. (2008). The Perceptions of Success Inventory for Beginning Teachers: Measuring its psychometric properties. Teaching and Teacher Education, 24(6), 1551-1563.

Henke, R. R., Chen, X., & Goldman, G. (1999). What Happens in Classrooms? Instructional Practices in Elementary and Secondary Schools, 1994–95 (NCES 1999-348). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Ross, J. A., McDougall, D., Hogaboam-Gray, A., & LeSage, A. (2003). A survey measuring elementary teachers' implementation of standards-based mathematics teaching. Journal for Research in Mathematics Education, 34(4), 344-353.

Scotchmer, M., McGrath, D. J., & Coder, E. (2005). Characteristics of Public School Teachers' Professional Development Activities: 1999–2000 (NCES 2005-030). Washington, DC: U.S. Department of Education, National Center for Education Statistics.
