The Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten

OMB: 1850-0846


2.1.1: The Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten (Vocab)


Supporting Statement Part A



Date Resubmitted:

August 20, 2007


Contract Number:

ED-06-CO-0028



Submitted to:

Gil Garcia

Institute of Education Sciences

U.S. Department of Education


Submitted by:

Ludwig D. van Broekhuizen

REL-Southeast

SERVE Center

915 Northridge Street, Second Floor

Greensboro, NC 27403-2112

(800) 755-3277

(336) 315-7400


TABLE OF CONTENTS

OMB Package Supporting Statement A





Appendix List



Appendix A: Education Sciences Reform Act of 2002

Appendix B: Peabody Picture Vocabulary Test-4 (PPVT-4) Example

Appendix C: Expressive Vocabulary Test-2 (EVT-2) Example

Appendix D: Protocol for Student Assessments (Lexical Diversity)

Appendix E: Classroom Observation Form

Appendix F: Protocol for Audio Recording Teacher Sample

Appendix G: Teacher Interaction and Language Rating Scale

Appendix H: Fidelity Rating Scale

Appendix I: Implementation Challenges Teacher Interview

Appendix J: Teacher Demographic Questionnaire

Appendix K: Paraprofessional Demographic Questionnaire

Appendix L: Child Data File Extraction Form

Appendix M: Federal Register Announcement

Appendix N: Teacher Consent Form

Appendix O: Parental Permission Form

Appendix P: School Agreement Form

Appendix Q: District Agreement Form

Appendix R: Certification of Confidentiality

Appendix S: Abt IRB Approval, UGA IRB Approval, and UNCG IRB Approval

Appendix T: Method Used for Estimating Costs to the Federal Government




Supporting Statement for Request for OMB Approval of Data Collection/Needs Assessment for the REL-SE


Part A. Justification



Introduction


This document presents the Supporting Statement for An Evaluation of the Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten. The intervention that will be evaluated, PAVEd for Success (PAVE), is an early literacy program designed to enhance vocabulary development among kindergarteners in high-poverty schools. Vocabulary skills are critical for learning to read, as they provide an essential foundation for decoding, fluency, and reading comprehension. Children who live in poverty are more likely to enter school with poorly developed language skills, including vocabulary (Smith, Brooks-Gunn, & Klebanov, 1997), and to fall further behind through elementary school. Correlational research consistently finds that early oral language, vocabulary, and other preliteracy skills are related to later literacy skills, including reading comprehension (Storch & Whitehurst, 2002; Tabors, Snow, & Dickinson, 2001). The goal of intervening early to improve children’s vocabulary skills is to put children at risk of poor reading outcomes on a trajectory toward stronger reading achievement.


Overview


The PAVE intervention provides teachers with professional development through which they learn research-based strategies for enhancing children’s vocabulary development during interactive book reading, cognitively challenging conversations, and direct vocabulary instruction. Teachers are trained to increase the number and quality of conversations with students, to engage in more active and more frequent small-group book reading, and to use explicit strategies for directly teaching vocabulary. Higher-quality teacher-child conversations involve, for example, a broader diversity of words, more rare words, and more cognitively challenging talk than typically occur in everyday conversations about routine and concrete matters. Teachers are trained to engage in frequent and interactive storybook reading and re-reading with children, including asking cognitively challenging questions, asking children to predict events and draw conclusions, and making connections to children’s experiences. The training provides teachers with specific skills and techniques for focusing on vocabulary in conversations and book reading in order to enhance children’s learning. The intervention and study will be conducted in a rural area of Mississippi known as the Delta, which is characterized by high poverty and low student achievement.


The overall goal of the intervention is to enhance children’s vocabulary knowledge as a foundation for literacy development; however, we hypothesize that the PAVE intervention affects children’s development through impacts on teaching practice. The evaluation examines two main issues: (1) the impact of the PAVE vocabulary intervention on students’ vocabulary and literacy outcomes and (2) the impact of PAVE on teachers’ vocabulary and broader literacy instructional practices. To address these questions, it is necessary to collect data both on teachers’ instructional practices and on students’ vocabulary and broader literacy development. In addition, it is necessary to collect data on teachers’ backgrounds and student demographics to examine whether the intervention is more effective for some groups of teachers or students than for others.


The PAVEd for Success program, funded by the U.S. Department of Education Early Childhood Educator Professional Development (ECEPD) program in 2001-2003 (Co-Principal Investigators Claire Hamilton, Paula Schwanenflugel, and Stacey Neuharth-Pritchett), was originally designed to enhance the literacy skills, including vocabulary, among children in pre-kindergarten. For the current project, the intervention is adapted for kindergarten and modified to focus primarily on vocabulary learning. Other areas of the PAVE prekindergarten program (i.e., alphabet, phonological awareness, and environmental print) are routinely covered as part of kindergarten language and literacy instruction and therefore are not included in the kindergarten professional development program.


As part of the ECEPD program, several PAVE intervention conditions were evaluated using a quasi-experimental design (Schwanenflugel, Hamilton, Bradley, Ruston, Neuharth-Pritchett, & Restrepo, 2005; Schwanenflugel, Hamilton, Neuharth-Pritchett, Restrepo, Bradley, & Ruston, under review). The evaluation provides promising evidence of intervention impacts on children’s vocabulary, resulting from both the full PAVE intervention (including phonological awareness and alphabet knowledge as well as vocabulary enhancement) and a vocabulary enhancement condition (without the phonological and alphabet components). Emerging evidence from other kindergarten vocabulary programs suggests that a vocabulary intervention such as PAVE shows promise for kindergartners as well as preschoolers. In addition, the routine instructional content covered in kindergarten suggests that the vocabulary features of PAVE proposed for the current study (without the phonological awareness, alphabet, and environmental print components) are more appropriate than the full PAVE intervention used in preschool. The suggestion that vocabulary interventions may help boost kindergarteners’ vocabulary skills, and the benefits found for PAVE vocabulary enhancement in particular, make this an important study for schools in the southeast region.


The current study will extend the evidence of the effectiveness of the PAVE vocabulary program in two important ways not addressed in the previous quasi-experimental evaluation. First, the current study uses an experimental design, which offers a much stronger test of the PAVE program’s effectiveness for improving children’s vocabulary. Second, the current study will examine whether impacts of the PAVE program are sustained beyond the program year.


An overview of the study design and data collection plan is presented in Section A.1 below.


A.1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


This evaluation is being conducted by the Regional Education Laboratory – Southeast (REL-SE), located at the SERVE Center, University of North Carolina at Greensboro (SERVE), and its subcontractors: Abt Associates Inc., the University of Georgia (UGA), and Empirical Education Inc. (EEI). The Regional Educational Laboratory (REL) Program is authorized under the Education Sciences Reform Act of 2002, Part D, Section 174 (20 U.S.C. 9564) and administered by the Institute of Education Sciences' National Center for Education Evaluation and Regional Assistance. (Part D, Section 174 of the Education Sciences Reform Act of 2002 is attached in Appendix A.) The priority for the REL program is to provide policymakers and educators with expert advice, training, and technical assistance, based on the latest findings from scientifically valid research, related to meeting the requirements of the No Child Left Behind Act (Institute of Education Sciences, 2007; http://ies.ed.gov/ncee/edlabs/about/). In instances where there is insufficient scientific evidence for the effectiveness of strategies to improve learning, the RELs are charged with conducting rigorous studies of such strategies. Each of the Regional Education Laboratories is directed to conduct rigorous studies designed to address issues of high priority to the region. The studies must meet IES’ standards for field tests based on experimental designs and are intended to establish causally valid evidence of the effects of proposed policies, programs, or practices on academic achievement or other related needs of the region.


Through extensive discussions conducted by the REL-SE to determine the most pressing educational needs of the region, southeastern state department reading directors and the Director of the Florida Center for Reading Research voiced widespread agreement regarding the need for a vocabulary intervention among kindergarten students in the southeast region of the United States. The highest priority was placed on a vocabulary intervention for two reasons: (1) children in the region are well behind national averages in vocabulary skills and (2) vocabulary knowledge is an essential component of literacy development that has generally been more difficult to impact than other emergent literacy skills, such as letter knowledge.


Several psychometric studies that included southeastern children suggest that poor and/or African-American children from this region may have particularly low vocabulary scores, averaging about one standard deviation below the national average (Campbell, Bell, & Keith, 2001; Restrepo, Schwanenflugel, Blake, Neuharth-Pritchett, Cramer, & Ruston, 2006). Difficulties that southeastern children have with vocabulary manifest themselves as they transition from learning to read to reading to learn. Averaging over the state report cards of Alabama, Florida, Georgia, Mississippi, and South Carolina, 18% of third and fourth grade children do not meet state standards in reading. By middle school, this rate increases dramatically to 32%. The trend is far worse for African-American and economically disadvantaged children in the region, of whom 41% and 40%, respectively, do not meet state standards for reading in middle school. The high poverty and low student achievement in the Mississippi Delta make the region an ideal target for this study.


A focus on vocabulary is a good place to start in providing regional access to higher reading achievement. Despite being a critical element in reading success, vocabulary is not a well-established part of kindergarten curricula or standards. In contrast, alphabet knowledge, phonological awareness, and print uses are typically part of kindergarten instruction and standards. According to teacher estimates, kindergarten classrooms in the U.S. involve approximately equal amounts of time spent on teacher-directed instruction in reading, numbers, and the alphabet (Heaviside & Farris, 1993; Guarino et al., 2006); however, there is not much evidence that kindergarten teachers explicitly focus on vocabulary per se. Furthermore, standards that kindergarten children must meet are relatively consistent across the U.S. (Graue, 1999), typically focusing on the alphabet, phonological, and print knowledge, with vocabulary often not explicitly included.


There is very little research on programs that focus on vocabulary with kindergarten children. A few experimental studies have found short-term benefits of kindergarten vocabulary programs (Coyne, Simmons, Kame’enui, & Stoolmiller, 2004; Robins & Ehri, 1994); however, currently there is insufficient evidence for effective strategies for this age range. The PAVE program is one vocabulary program that has shown promise, but more rigorous testing is required to establish evidence of its effectiveness.


Study Design


To evaluate the effectiveness of PAVE, the study uses a cluster random assignment design in which approximately 80 consenting schools serving predominantly low-income children (defined by the percentage of children receiving free and reduced-price meals), within 33 school districts, are randomly assigned either to the intervention condition or to the control condition. In treatment schools, all kindergarten teachers and assistants will receive the PAVE training. Teachers in control schools will receive professional development as currently provided by their district. In each school in the sample, two consenting kindergarten teachers will be selected at random to be in the study. From each classroom in the sample, ten students with parental permission to participate will be randomly selected to be in the study.
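
For illustration only, the school-level (cluster) assignment and within-school sampling described above could be carried out along the following lines. This is a minimal sketch; the identifiers and function names are hypothetical placeholders, and the actual procedure will follow the study's approved protocol.

```python
import random

random.seed(20080701)  # fixed seed so the assignment is reproducible and auditable

def assign_schools(school_ids):
    """Randomly assign half of the consenting schools to treatment, half to control."""
    schools = list(school_ids)
    random.shuffle(schools)
    half = len(schools) // 2
    return {"treatment": schools[:half], "control": schools[half:]}

def sample_within_school(teachers_by_school, students_by_teacher,
                         n_teachers=2, n_students=10):
    """Select two consenting teachers per school and ten permitted students per classroom."""
    sample = {}
    for school, teachers in teachers_by_school.items():
        chosen = random.sample(teachers, min(n_teachers, len(teachers)))
        sample[school] = {
            t: random.sample(students_by_teacher[t],
                             min(n_students, len(students_by_teacher[t])))
            for t in chosen
        }
    return sample

# Hypothetical example with 4 schools (the study plans roughly 80 consenting schools)
groups = assign_schools(["school_01", "school_02", "school_03", "school_04"])
print(groups)
```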


Data Collection Plan


Data will be collected from children at the beginning of the kindergarten year (Fall 2008) as a pretest, at the end of the kindergarten year (Spring 2009) as a posttest, and at the end of the first grade year (Spring 2010) as a second posttest. Classroom and teacher data will be collected in both treatment and control conditions at the beginning of the intervention year (Fall 2008) as a pretest and at the end of the intervention year (Spring 2009) as a posttest. In addition, during the subsequent fall (Fall 2009), data will be collected on teachers in the treatment group only to examine the sustainability of the PAVE intervention. Table A.1 shows the data collection timeline and the instruments to be used at each time point.


Student Assessments. Child measures will assess the impact of PAVE on children’s vocabulary in kindergarten and first grade, as well as on their broader language and reading abilities in first grade. Assessing the vocabulary knowledge of young children cannot be accomplished with a single measure, since there are different kinds of vocabulary knowledge. Project staff will assess receptive language skills using the Peabody Picture Vocabulary Test-4 (Dunn & Dunn, 2007) and expressive language skills using the Expressive Vocabulary Test-2 (Williams, 2007), two nationally normed, standardized, and commonly used assessment instruments. (Examples of items from the PPVT-4 and EVT-2 are attached in Appendices B and C, respectively.) Children’s productive use of vocabulary will also be examined through a 10-minute task in which they are asked to tell a story from a wordless picture book. Specifically, children’s lexical diversity, or the number of unique words relative to the total number of words spoken, will be measured. Because of the resource-intensive nature of this type of data collection, samples will be collected from just four students with parental permission per participating classroom, and the four students will be selected randomly from the total pool of participating students in each classroom. (The administration protocol for the Lexical Diversity measure is attached in Appendix D.) Children’s decoding skills and reading comprehension will be assessed in first grade (Spring 2010) using the Woodcock Reading Mastery Test-Revised/Normative Update (WRMT-R/NU; Woodcock, 1998), a standardized and normed measure of reading achievement.
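
As a simple illustration, lexical diversity as described here (unique words relative to total words in a transcribed sample) could be computed as shown below. The tokenization rule is a simplifying assumption for the sketch; the study's actual transcription and scoring conventions are defined in the Appendix D protocol.

```python
import re

def lexical_diversity(transcript: str) -> float:
    """Type-token ratio: number of unique words divided by total words spoken."""
    words = re.findall(r"[a-z']+", transcript.lower())  # crude tokenizer, illustration only
    if not words:
        return 0.0
    return len(set(words)) / len(words)

sample = "the dog ran and the dog jumped over the big big fence"
print(round(lexical_diversity(sample), 2))  # 8 unique words / 12 total words = 0.67
```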


Table A.1

Data Collection Schedule



Measure | Fall 2008 (SY 2008-09) | Spring 2009 (SY 2008-09) | Fall 2009 (SY 2009-10) | Spring 2010 (SY 2009-10)

Student Measures | Pre (Weeks 4-7) | Post (Weeks 23-25) | -- | Follow-up

Direct Assessments
Peabody Picture Vocabulary Test-4 | X | X | -- | X
Expressive Vocabulary Test-2 | X | X | -- | X
Lexical Diversity | -- | X | -- | X
Woodcock Reading Mastery Test-R/NU | -- | -- | -- | X

Child Data from Extant School District Sources
Demographic data | X | -- | -- | --
Special education information | X | X | X | --

Teacher/Classroom Measures | Pre (Weeks 5-8) | Post (Weeks 21-25) | Follow-up (Weeks 5-8) |

Questionnaires/Interviews
Teacher Demographics Questionnaire | X | -- | -- |
Implementation Challenges Interview | -- | -- | X |

Observation Measures
Audiotape Recorded Literacy Lesson | X | X | -- |
Classroom observation | X | X | -- |
Fidelity assessment | -- | -- | X |



Classroom and Teacher Measures. A time-sampling observational instrument designed for this study will document the vocabulary and literacy instructional practices in both treatment and control classrooms (see Appendix E). During the classroom observation visit, treatment and control teachers will be audiotape recorded during a 20-minute, small-group instructional period focusing on literacy, to examine the lexical diversity of teachers’ language directed to students in the classroom. (The protocol for recording the literacy lesson is attached in Appendix F.) In addition, the overall quality of teachers’ talk on recorded samples will be rated using the Teacher Interaction and Language Rating Scale (see Appendix G; Girolametto & Weitzman, 2002).


A fidelity assessment will be conducted in the fall of the school year following the intervention, to determine if teachers in the treatment group sustain the implementation of the PAVE intervention as it was intended. Following the fidelity assessment, data collectors will interview teachers in the treatment group about components of the PAVE intervention that they find difficult or challenging to implement. The fidelity assessment tool and the implementation challenges interview are attached in Appendices H and I, respectively.


Demographic information. Extant data on each child will be collected from computerized school records, including information on age, gender, race, ethnicity, eligibility for free or reduced-price school meals, special education status, and status as an English-language learner. (The data collection form for gathering extant data is attached in Appendix L.) Teachers and paraprofessionals in both the treatment and control groups will complete a demographic questionnaire requesting information on their gender, race/ethnicity, educational background, number of years teaching, and number of years teaching kindergarten, among other variables (see Appendices J and K, respectively). Demographic information about schools, collected from the Mississippi Assessment and Accountability Reporting System (MAARS), a searchable online database on the Mississippi Department of Education website (http://orsap.mde.k12.ms.us:8080/MAARS/indexProcessor.jsp), will include student enrollment, class sizes, students’ racial and ethnic composition, percent of students receiving free and reduced-price meals, and state proficiency test scores.


A.2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The data will be used by IES and by REL-SE to determine the effectiveness of the PAVE vocabulary intervention for improving students’ vocabulary and reading achievement and improving kindergarten teachers’ vocabulary instructional practices. The REL-SE will use information about the extent to which the PAVE intervention is effective to provide advice, training, and technical assistance to regional policymakers, educational administrators, and educators. Findings will inform decisions about expansion of the PAVE program to other kindergartens in the southeastern United States and beyond. Without information about whether or not the program is effective, decisions might be made to expand a program that is not effective or not to expand a program that is effective.


Information collected from teacher interviews, classroom observations, and audiotape recordings of a literacy lesson will be used to evaluate the effectiveness of the PAVE vocabulary intervention in improving teachers’ vocabulary and broader literacy instructional practices. Improving teachers’ instructional practices is an intermediate goal of the PAVE intervention, as it is hypothesized that the intervention will improve children’s outcomes through changes in teachers’ practices. The presence of more vocabulary-enhancing instructional practices in treatment classrooms compared to control classrooms will provide strong evidence of the intervention’s effectiveness on teaching practice. In addition, data on teachers’ fidelity to PAVE instructional practices in Fall 2009, the school year following the intervention year, will indicate whether teachers who have received the PAVE training sustain their use of the practices into the next school year.


Information collected from direct child assessments (i.e., PPVT-4, EVT-2, lexical diversity measure, and WRMT-R/NU) will be used to evaluate whether receiving the PAVE vocabulary intervention during kindergarten is effective for improving students’ vocabulary and oral language skills in kindergarten and first grade, as well as improving their broader literacy development in first grade. Higher levels of vocabulary development and reading comprehension among students who received the intervention, compared to control group students who did not receive PAVE, will provide strong evidence of the program’s effectiveness.


Information on teacher and student background and demographics will be used as covariates in statistical analyses.


A.3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing data sources, such as school records, rather than imposing additional burden by collecting primary data. School records information (i.e., demographic information about students and schools) will be gathered via computer files. The Child Data Extraction Form shown in Appendix L indicates the specific data on students that we are requesting and provides a convenient way for school systems to provide the information. However, we anticipate that most school systems will prefer to transmit the data electronically, and we will provide a process by which schools or districts may submit records electronically. A key consideration in collecting demographic information on students from school administrative records is to minimize respondent burden. Accessing data from school records eliminates the need to interview parents about demographic information, as well as reduces the amount of information requested from teachers.


In many cases, however, data can only be obtained directly from students or teachers. Additional information about teachers’ educational and training background that is not included in school records databases must be gathered through a brief, pen-and-paper hardcopy teacher questionnaire. This hardcopy mode will be used for two reasons. First, this mode presents the simplest, least burdensome method of collection for the teachers, as they will not need access to a computer to complete the questionnaire. Second, we anticipate that the hardcopy mode, rather than an electronic or web mode, will facilitate a higher response rate, as we will collect the questionnaires at the same session or classroom visit when they are distributed. Teachers in the treatment condition will complete the questionnaire when they attend the summer PAVE training. Questionnaires will be distributed to teachers and completed questionnaires will be collected before teachers leave for the day. Teachers in the control condition will complete the pen-and-paper hardcopy questionnaire on the day of the classroom observation. The classroom observer will collect the completed questionnaire from the teacher before leaving the classroom.


Information about the challenges teachers encounter when implementing the PAVE intervention can only be collected through interviews with teachers in the treatment group. In addition, information about teachers’ instructional practices can only be obtained through classroom observations. Furthermore, information about students’ vocabulary, oral language, and reading skills can only be obtained through direct assessment of students.


In order to gather information about the language used by teachers to foster children’s vocabulary development, as well as the language that students use, we must collect oral language samples. Because of the detailed nature of analyzing oral language samples, it is necessary to audiotape record language samples from teachers and students. When recording teachers, we will use small wireless microphones to minimize any intrusiveness.


The findings of this study and recommended refinements to the intervention will be summarized in reports prepared for IES under this contract and in articles submitted to professional journals. Study results will be disseminated through the National Laboratory Network website, ERIC, and the What Works Clearinghouse.


A.4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


Although there has been a previous study of the PAVE intervention, the program has not been evaluated using a rigorous randomized design. Randomized controlled trials provide the highest standard of causal evidence about the effectiveness of programs. The previous evaluation of PAVE employed a quasi-experimental design, in which two counties volunteered for participation in the intervention and one neighboring county was selected for comparison based on its demographic similarity to counties in the treatment group. Positive findings from the previous study indicate that PAVE has promise for improving children’s vocabulary development; however, the quasi-experimental study does not offer sufficient evidence to conclude that PAVE is effective. Teachers who volunteered for the intervention may have differed from teachers in the comparison group in unobserved ways. Consequently, there is no way to conclude with certainty that the positive findings resulted from the PAVE intervention per se. A randomized design requires that schools volunteer to participate without knowledge of whether or not they will be assigned to receive the PAVE program, which is the plan for the current study. Schools will be randomly assigned either to a treatment group that receives the PAVE program or to a control group that does not receive the PAVE training until the study is completed.


There are two additional reasons why the current data collection does not duplicate existing information. First, the previous quasi-experimental evaluation examined the effects of implementing PAVE in prekindergarten classrooms, while the current study will examine the impact of the PAVE vocabulary intervention in kindergarten classrooms. Second, the previous evaluation did not examine effects of PAVE beyond the intervention year. The current study, however, will examine not only immediate impacts on students at the end of the intervention year but also whether impacts are sustained the following year, in first grade. Learning whether impacts are sustained will provide important information about whether to implement PAVE on a broader scale or explore intervention enhancements for achieving longer-lasting impacts.


We will use existing data for the study whenever possible, rather than duplicating data collection efforts. As already noted, we will use data on district and school demographics already compiled by the Mississippi Department of Education and available on the Internet, rather than collecting the same data from school administrators. Similarly, we will use data on student demographics already collected and kept in school administrative records, rather than collecting the same information from parents.


In addition, we explored the possibility of using data from state proficiency tests instead of administering additional student assessments. However, students in Mississippi do not take state proficiency tests until third grade, which is beyond the end of the contract period for this study. Consequently, the information to be collected from student assessments will not be available elsewhere. Data gathered from classroom observations, teacher questionnaires, and teacher interviews also will not be available elsewhere.


A.5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods to minimize burden.


The primary entities for this study are schools and school districts. Burden is minimized for all respondents by requesting only the minimum information required to achieve the study objectives. All data collection will be coordinated by the evaluation contractors: SERVE Center at the University of North Carolina-Greensboro (SERVE), and its subcontractors, Abt Associates Inc., University of Georgia (UGA), and Empirical Education Inc. Evaluation contractors will carefully specify information needs; questions to schools and districts will be restricted to generally available information maintained in school administrative records. We anticipate that school and/or district personnel will be able to retrieve and transfer electronic data with minimal burden and with support from evaluation contractors.


A.6. Describe any consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


If the proposed data were not collected, REL-SE and IES would not be able to evaluate the effectiveness of the PAVE kindergarten vocabulary intervention. As noted above, school administrators and reading experts agree that there is a great regional need for an effective vocabulary intervention in kindergarten. Identifying promising strategies for improving vocabulary learning and conducting a rigorous study of their effectiveness are of the highest priority to the REL-SE. Building the causal evidence base that the proposed data collection will enable is essential for the REL-SE to provide informed and justified expert advice, training and technical assistance to educational entities in the Southeast region.


Collecting data less frequently would not be sufficient for determining if the PAVE vocabulary intervention has an impact on teachers’ instructional practices and/or children’s vocabulary development. Collecting information on teachers’ instructional practices and children’s vocabulary development at the beginning (baseline) and end (posttest) of the intervention year is critical for evaluating the effectiveness of PAVE.


Post-test data is necessary for comparing treatment and control conditions at the end of the intervention. Post-test differences favoring the treatment group provide evidence of the intervention’s effectiveness.


While the experimental design in this study allows us to identify intervention impacts based on differences between the treatment and control group measures at the end of the intervention year, baseline information collected at the beginning of the intervention year allows us to measure this difference more precisely. That is, baseline measures used as covariates in our analytic models will allow our estimates of program impact to more accurately reflect those due to the PAVE intervention. Furthermore, the baseline information collection proposed for the fall of the intervention year will enable us to identify any differences between the treatment and control groups that are present prior to the intervention. We must know about any baseline differences between the groups to ensure that post-test differences can be attributed to the intervention rather than baseline differences.


Data collection is proposed beyond the intervention year to examine whether positive impacts of PAVE are sustained. The ultimate goal of the intervention is not only to improve children’s vocabulary during kindergarten (and teachers’ instructional practices during the intervention year) but also to produce lasting gains. It is because of an anticipated relationship between early skills and later skills that we plan to test for longer-term impacts. However, our investigation of sustained impacts will be contingent on finding prior impacts. We do not anticipate “sleeper effects,” whereby later impacts emerge despite no immediate impact. If we do not find impacts at the end of the intervention year, we will not continue with subsequent data collection.


A.7. Explain any special circumstances of information collection.


This request fully complies with the following regulations. Information collection will NOT be conducted in a manner:


  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of statistical classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in a statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


A.8. Summarize solicitation of public comments and consultation with people outside the agency.


A.8.1. Federal Register Announcement


A 60-day notice to solicit public comments was published in the Federal Register on June 21, 2007 (Volume 72, Number 119, pages 34234-34235). The Federal Register Announcement is attached in Appendix M.


The following comments were received during the comment period and addressed as indicated. [Summarize comments and actions taken in response.]

OR

No comments were received during the comment period.


A.8.2. Consultations Outside the Agency


Consultations with experts on large-scale random assignment studies, impact evaluation, vocabulary and literacy development, and vocabulary and literacy intervention programs have occurred throughout the design phase of this study and will continue to take place throughout the study.

Senior technical and substantive staff from the evaluation contractors are listed below:


SERVE Center Ludwig van Broekhuizen


Abt Associates Inc. Stephen Bell

Howard Rolston

Barbara Goodson


University of Georgia Paula Schwanenflugel

Stacey Neuharth-Pritchett


We will work in collaboration with personnel from the Mississippi Department of Education, including:


Dr. Hank Bounds, Mississippi Superintendent of Education

Beth Sewell, Executive to the Superintendent for Instructional Programs and Services

Robin Miles, Director of the Office of Reading, Early Childhood, and Language Arts


In addition, a technical work group (TWG) has been formed to provide advice on the study. Members of the technical work group include:


Michael Coyne University of Connecticut

Michael Kamil Stanford University

Catherine Snow Harvard University


A.9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


In an effort to offer teachers an incentive to participate, we will meet with members of the Mississippi State Department of Education to discuss what can be offered to teachers, including professional development credits for completion of the PAVE training. The possibility of providing monetary payments to compensate schools and teachers for the time it takes to provide information and accommodate our needs throughout the evaluation will also be explored. Specifically, as part of the PAVE professional development, teachers will be required to: (1) participate in the one-day PAVE training in the summer prior to the intervention year and (2) participate in after-school peer discussions with other teachers in the same school for one hour every four to six weeks during the intervention period. Over the course of the intervention, teachers will be expected to participate in a total of four to six peer discussions. As indicated in Table A.3 (in Section A.12), the hourly rate for teachers in Mississippi is estimated to be $22.31. Based on this rate, we would pay each teacher $200 for attending the PAVE training and $150 after completion of all the peer discussions. These monetary incentives would be provided only to teachers in the treatment group, to compensate them for the time required to attend PAVE training and other meetings outside of the school day. Teachers in the control group are not required to devote time to attend training or meetings and thus would not receive monetary compensation. No teachers, in either condition, will receive compensation for any data collection demands of completing questionnaires or interviews.


Teachers will be provided with intervention materials for their classrooms, including books and curriculum units with intervention-related activities, which intervention developers estimate to be worth $750. The materials will be provided to teachers in the treatment group for the intervention year and to teachers in the control group, along with PAVE training, at the end of the intervention year.


We will also compensate each school’s study liaison, an administrative staff person in each participating school designated by the school to assist study staff in sending out recruitment information and obtaining signatures on the parental permission forms. We would work with the school’s study liaison to get the parental permission forms to parents as soon as possible once the school year begins, and to follow up with those parents not responding. The liaison will collect the returned consents and follow up persistently with those parents who have not yet returned the consent forms. We anticipate the liaison spending approximately 8 hours on this task, with some time before the school year starts and after the school day once the year starts. Based on the salary schedule for administrative staff posted on the Jackson, MS school district website (http://www.jackson.k12.ms.us/departments/human_resources/salaryscales/admin_scale.pdf), we estimate the hourly rate for the school’s study liaison to be approximately $22.60/hour. For the 8 hours to help obtain signed parental permission forms, we would like to compensate study liaisons $180.


We will also explore providing children with a small token (e.g., a book) after each data collection session.


A.10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


The SERVE Center at UNC-Greensboro, Abt Associates Inc., the University of Georgia, and Empirical Education Inc. follow the confidentiality and data protection requirements of IES (the Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Title I, Part E, Section 183 of the Education Sciences Reform Act requires that all collection, maintenance, use, and wide dissemination of data conform to the requirements of the Privacy Act of 1974 (5 USC 552a), the Family Educational Rights and Privacy Act of 1974 (20 USC 1232g), and the Protection of Pupil Rights Amendment (20 USC 1232h). Respondents will be assured that all individually identifiable information about themselves and the individual schools shall remain confidential. (See Appendices N – Q, respectively, for the Teacher Consent Form, Parental Permission Form, School Agreement, and School District Agreement.)


The Privacy Act of 1974 applies to this collection. A Notice for a New System of Records will be prepared for submission to the Federal Register.


The SERVE Center at UNC-Greensboro, Abt Associates Inc., the University of Georgia (UGA), and Empirical Education Inc. will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution but not to any individually identifiable information. No individually identifiable information will be maintained by the study team. All institution-level identifiable information will be kept in secured locations and identifiers will be destroyed as soon as they are no longer required. SERVE obtains signed NCEE Affidavits of Nondisclosure from all employees, subcontractors, and consultants that may have access to this data and submits them to our NCEE COR. All members of the study team having access to the institution-level data have been certified by the Institutional Review Board at UNC-Greensboro, Abt Associates, or UGA as having received training in the importance of confidentiality and data security.


Specifically, data that is collected will not be shared with anyone outside of the research organizations. Within the organizations, the information will only be shared with researchers who have completed certificates of confidentiality (See Appendix R). Physical data will be stored in locked file cabinets. Computers containing data files will have password security so that only the assigned personnel are able to access the data. No written document coming out of the research organizations will include information that will make it possible to identify individual participants, their schools, or school districts.


In all research organizations, identifiable data will be kept for a maximum of four years and then destroyed. Written records will be shredded, audiotape recordings will be destroyed, and electronic records will be purged.


A.11. Provide additional justification for any questions of a sensitive nature, including matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps taken to obtain their consent.


Some data on students that we propose to collect from school administrative records may be considered sensitive, specifically information on eligibility for free or reduced-price school meals and special education status. Because vocabulary, oral language, and literacy development are associated with socioeconomic status and with participation in special education, it is essential to control statistically for these variables when examining the impact of the PAVE intervention on students’ vocabulary, oral language, and literacy outcomes. Furthermore, having this information will enable us to investigate whether the PAVE vocabulary intervention has larger (or smaller) impacts on students who receive free or reduced-price meals or who receive special education compared to students who do not. As noted above, student demographic information will be kept strictly confidential.


Student performance on vocabulary, oral language, and literacy assessments may also be considered sensitive. However, we cannot evaluate the impact of the PAVE intervention on students’ vocabulary, oral language, and literacy development without conducting these assessments.


Other demographic information about students that will be collected from school administrative records is not considered sensitive. This information includes: age, gender, race/ethnicity, and status as an English-language learner.


The Parent Permission Form (See Appendix O) informs parents, if they choose to give permission, that their child’s language skills will be assessed in order to help researchers learn what children need to become good readers. In addition, the form states that, if parents give their permission, information about their child’s special education, school lunch status, and other demographic information will be collected from school files, in order to be sure that all kinds of children can benefit from the instruction. The Parent Permission Form states that all data will be kept confidential and that participation is voluntary.


Teachers will not be asked sensitive questions. In the teacher questionnaire, teachers will be asked to provide information about gender, race/ethnicity, educational background, number of years teaching, and number of years teaching kindergarten (See Appendices J and K). For the Implementation Challenges Interview, teachers in the treatment group will be asked to describe any difficulties implementing the PAVE program (See Appendix I).


This study has received approval from three Institutional Review Boards (IRB): University of North Carolina-Greensboro; Abt Associates Inc.; and University of Georgia. Copies of the approved IRB clearance forms are attached in Appendix S.


A.12. Provide estimates of the hour burden of the collection of information.


Table A.3

Respondent Burden Estimates


Informant/Instrument | Number of Respondents | Number of Rounds | Number of Responses | Average Time per Response (Hours) | Total Respondent Time (Hours) | Estimated Hourly Cost to Respondent (Dollars) | Estimated Total Cost (Dollars)

Students | 1,600 | -- | 12,480 | -- | 3,520 | -- | $0
PPVT | 1,600 | 3 | 4,800 | .25 | 1,200 | $0 | $0
EVT | 1,600 | 3 | 4,800 | .25 | 1,200 | $0 | $0
Lexical Diversity | 640 | 2 | 1,280 | .25 | 320 | $0 | $0
WRMT-R/NU | 1,600 | 1 | 1,600 | .50 | 800 | $0 | $0

Teachers | 160 | -- | 240 | -- | 53.4 | -- | $1,191.36
Teacher Demographic Questionnaire | 160 | 1 | 160 | .17 | 26.7 | $22.31 | $595.68
Implementation Interview | 80 | 1 | 80 | .33 | 26.7 | $22.31 | $595.68

Other | 161 | -- | 163 | -- | 50.7 | -- | $783.33
Paraprofessional Demographic Questionnaire | 160 | 1 | 160 | .17 | 26.7 | $10.62 | $283.55
School System Administrative Data (Child Data File Extraction Form) | 1 | 3 | 3 | 8 | 24 | $12.50 | $300

TOTAL | 1,921 | -- | 12,883 | -- | 3,624 | -- | --



Note. Total number of respondents: 1,921.

Total annual responses: 12,883 (total number of responses) divided by 3 years of data collection equals 4,294.

Total annual hours requested: 3,624 (total respondent time in hours) divided by 3 years of data collection equals 1,208.


Table A.3 shows that the total number of respondents is 1,921. The total reporting burden associated with this data collection is 3,624 burden hours, consisting of 3,520 hours for students, 53.4 hours for teachers, 26.7 hours for paraprofessional aides, and 24 hours for school system support staff. The data collection burden for respondents is divided over three rounds of data collection over three years (representing two school years). The total number of annual responses is 4,294. This represents a total annual response burden of 1,208 hours. Less than one percent of the data (the child data from the school system files) can be collected electronically.
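
As a worked check of the roll-up above (a sketch only; the figures are taken directly from Table A.3, and small rounding differences relative to the table are expected):

```python
# Total respondent time per instrument (hours), approximately responses x average hours per response
hours = {
    "PPVT-4":                           4800 * 0.25,  # 1,200 student hours
    "EVT-2":                            4800 * 0.25,  # 1,200 student hours
    "Lexical Diversity":                1280 * 0.25,  #   320 student hours
    "WRMT-R/NU":                        1600 * 0.50,  #   800 student hours
    "Teacher Demographic Questionnaire": 26.7,
    "Implementation Interview":          26.7,
    "Paraprofessional Questionnaire":    26.7,
    "Child Data File Extraction":         3 * 8.0,    #    24 support-staff hours
}
responses = 4800 + 4800 + 1280 + 1600 + 160 + 80 + 160 + 3  # 12,883 total responses

total_hours = sum(hours.values())                     # ~3,624 total burden hours
print(round(responses / 3), round(total_hours / 3))   # annualized over 3 years: ~4,294 responses, ~1,208 hours
```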


Estimated hourly costs to teachers are based on teacher salary tables posted on the National Education Association website. The average annual salary of a teacher in Mississippi is $37,924 (http://www.nea.org/student-program/about/state2.html#mississippi). The hourly rate of $22.31 was calculated by assuming that teachers work 40 hours/week during 10 months/year (and that a month is 4.25 weeks).


Estimated hourly costs to paraprofessional educators are based on U.S. average annual salary for paraeducational support personnel posted on the National Education Association website (http://www.nea.org/pay/espsalaries.html), which is $18,052. The hourly rate of $10.62 was calculated by assuming that paraprofessionals work 40 hours/week during 10 months/year (and that a month is 4.25 weeks).
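
The hourly rates above follow directly from the stated assumptions (40 hours/week, 10 months/year, 4.25 weeks/month); a quick check of the arithmetic:

```python
def hourly_rate(annual_salary, months=10, weeks_per_month=4.25, hours_per_week=40):
    """Convert an annual salary to an hourly rate under the stated work-year assumptions."""
    return annual_salary / (months * weeks_per_month * hours_per_week)

print(round(hourly_rate(37924), 2))  # teacher rate: 22.31
print(round(hourly_rate(18052), 2))  # paraprofessional rate: 10.62
```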


Estimated hourly costs for gathering school administrative data assume that it will take one clerical or support staff person 24 hours to provide the data. Information about support staff salaries is based on the salary schedule for clerical and support staff posted on the Jackson, MS school district website (http://www.jackson.k12.ms.us/departments/human_resources/salaryscales/support_scale.pdf). Salaries for clerical and support staff range from $5.00/hour to $27.00/hour, reflecting both paygrade and years of experience. The estimated hourly rate of $12.50 corresponds approximately to the midpoint of the paygrade scale at 7 years of experience, or to a higher paygrade at 3 years of experience.


A.13. Provide an estimate for the total annual cost burden to respondents or record-keepers resulting from the collection of information.


There are no direct costs to participants.


A.14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, & 14 in a single table.


The estimated cost to the federal government for conducting the randomized controlled trial (RCT) of The Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten (including designing the study, recruiting schools, implementing random assignment, collecting school/teacher/student data, processing and analyzing the data, and preparing reports summarizing the results) is $3,043,470. The cost of the data collection activities associated with this project is projected to be $1,078,012. The study period is from March 2006 to March 2011, with data collection taking place from July 2008 to August 2010.


Appendix T shows the method used for estimating costs to the federal government and includes the quantification of hours, operational expenses, and other expenses projected to support the recruitment and data collection efforts of the Vocab study.


A.15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


This is a new study.


A.16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Publication Plans


This study will produce two sets of reports. The first is a technical and a non-technical report scheduled in Year 4 of the study (in 2010) after the intervention year has been completed, and the second is a technical and non-technical report scheduled in Year 5 of the study (in 2011), once all follow-up data collection and analyses are complete. The project team will prepare articles for journal submission following IES approval of the final set of reports.


Before the REL-SE submits data to IES for publication on the IES website, the REL-SE will be responsible for conducting a deductive disclosure analysis to ensure that readers of the agency's (IES) web page cannot deduce the identity of individual schools, teachers, or children.


Time Schedule


Table A.4 shows the full timeline for the evaluation. Design activities, preparation of intervention materials, and preparation of evaluation instruments occur from March 2006 to September 2007. Recruitment activities and random assignment occur from September 2007 to Spring 2008. The intervention will be implemented during the 2008-2009 school year, beginning with the summer training in 2008. An interim report will be released in Spring 2010. Follow-up data collection will take place during the 2009-2010 school year, with release of a final report by the end of 2010 and publication of journal articles and dissemination materials in Winter 2011.


Tabulation Plans


The analytic plan focuses on the two main questions: (1) the impact of the PAVE vocabulary intervention on students’ vocabulary and literacy outcomes and (2) the impact of PAVE on teachers’ vocabulary and broader literacy instructional practices.

Impacts of PAVE on Students. To examine the impact of PAVE on students at the end of kindergarten, we will use a hierarchical linear model (HLM), which provides us with an estimate of the average impact of the intervention on children across all schools at a given time point (e.g., at the end of the kindergarten year). HLM is particularly appropriate for this evaluation since we have a multilevel design with students nested within classrooms and schools. Students in our evaluation are clustered within schools, and the treatment occurs for all kindergarten students within a school. The HLM model enables us to adjust standard error estimates to account for the nesting of students in classrooms. This adjustment is particularly salient because the unit of random assignment is the school, not the student. In addition, HLM allows us to determine what proportion of the total variation in student outcomes occurs at the school level and what proportion occurs at the student level. HLM also allows us to use both school-level and student-level covariates to account for variation in student outcomes. Using a school-level variable to indicate treatment status (i.e., whether the school was assigned to the PAVE treatment or the control group) in the Level-2 model will enable us to determine whether there is a significant impact of the PAVE treatment on the specified student outcome. A positive and statistically significant parameter estimate will indicate that the PAVE intervention does impact student vocabulary (or broader literacy) outcomes. The magnitude of the parameter estimate will indicate the estimated magnitude of the impact, i.e., participation in PAVE is associated with an estimated point difference in scores (or standard deviation difference) for students in PAVE schools compared to students in non-treatment schools.
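
As an illustrative sketch only (not the study's specified estimation code), a simplified two-level version of this impact model, with students nested in schools, could be fit with a general-purpose mixed-effects routine such as the one below. The file and variable names (student_outcomes.csv, posttest, pretest, treatment, school_id) are hypothetical placeholders, and the production analysis may use dedicated HLM software, additional covariates, and the classroom level as well.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per student, with the school identifier,
# a school-level treatment indicator (1 = PAVE, 0 = control), the fall pretest
# score, and the spring posttest score on a given vocabulary outcome.
df = pd.read_csv("student_outcomes.csv")

# Random-intercept model: students (Level-1) nested in schools (Level-2).
# The coefficient on `treatment` estimates the average impact of PAVE on the
# posttest, adjusting for the student-level pretest covariate.
model = smf.mixedlm("posttest ~ treatment + pretest", data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())
```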


Table A.4

Schedule of Study Activities


Activity | Schedule
Study design | March 2006 – March 2007
Preparation of materials | January 2007 – September 2007
Pilot | May 2007
Obtain state and district support | April 2007 – September 2007
Recruitment of school districts, schools, and teachers | September 2007 – March 2008
Random assignment | Spring 2008
PAVE professional development training | July 2008
Parental consent and selection of child sample | August – September 2008
Baseline data collection from teachers & students | Fall (September – October) 2008
Post-test data collection | February – April 2009
Interim reports | Summer 2009 – Spring 2010
PAVE training for control group | July 2009
Follow-up data collection from treatment teachers | Fall (September – October) 2009
Follow-up data collection from students | February – April 2010
Final reports | Summer – Fall 2010
Journal articles, dissemination materials | Winter 2011


In addition to examining the impact of PAVE on students overall, we will examine its impact on subgroups of students, such as boys and girls. By analyzing subgroups of students, we can determine if there are differential effects of the PAVE intervention for certain subsets of students. Specifically, we may be interested in knowing whether the effects of PAVE systematically differ for boys and girls. To address this question we will take one of two approaches. One approach would be to include an interaction of the treatment effect with a subgroup variable in the HLM model (e.g., treatment*BOY, where BOY = 1 for boys and 0 for girls). The estimated parameter of such an interaction would indicate whether there were additional effects of PAVE for boys or girls. This is a conventional approach; however, it would require the assumption that the variance in outcome scores is constant for boys and girls and that gender is not correlated with the marginal effects of the other covariates in the model, which may or may not be a tenable assumption. The second approach would be to break the entire sample into subgroups (e.g., one sample composed entirely of boys, the other of girls) and estimate the treatment effect for each of the two samples. Once these impacts are estimated, we can compare the means and variances of the estimates for the subgroups to determine if there are differential impacts between these two groups. Differences in impact between the two groups (i.e., the interaction effect) will then be tested via a t-test. This is also a sensible approach; however, the smaller sample sizes for each subgroup inherently make impact estimates for each group less precise.
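
A sketch of the second approach: estimate the impact separately in each subgroup and compare the two estimates. The snippet below uses a large-sample z-approximation to the t-test described above, and the impact estimates and standard errors shown are hypothetical.

```python
import math
from scipy import stats

def compare_subgroup_impacts(b_boys, se_boys, b_girls, se_girls):
    """Test whether the estimated PAVE impact differs between two independent subgroups."""
    diff = b_boys - b_girls
    se_diff = math.sqrt(se_boys**2 + se_girls**2)  # standard error of the difference
    z = diff / se_diff
    p = 2 * (1 - stats.norm.cdf(abs(z)))           # two-sided p-value
    return diff, z, p

# Hypothetical impact estimates (in outcome points) from the boy-only and girl-only models
print(compare_subgroup_impacts(b_boys=3.1, se_boys=1.2, b_girls=1.8, se_girls=1.1))
```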


Changes in Impacts on Students over Time. With the addition of another school year of data (i.e., first grade), we will extend the cross-sectional model to examine changes in students over time, using longitudinal linear growth modeling. The linear growth model is hierarchical in the sense that multiple observations are nested within individual students who are, in turn, nested within classrooms/schools. We will estimate a three-level hierarchical linear growth model. Level-1 will represent each student's development in the form of an individual linear growth trajectory, whose parameters then become the outcome variables in the between-student level (Level-2) of the model. The individual growth parameters will be modeled as a function of student background characteristics (e.g., sex, ethnicity, free/reduced-price school lunch eligibility). The impact of the PAVE treatment on average student growth (e.g., in vocabulary development) will be tested in a school-level model. In the school-level model (Level-3), school mean growth parameters will be modeled as a function of PAVE treatment status and other school characteristics. Using a treatment status indicator variable in the school-level model, we will estimate whether there is a significant impact of the PAVE treatment on average linear growth and on the average student score at a specified time point. A positive and statistically significant parameter estimate would indicate that the PAVE intervention does impact average growth in student vocabulary (or broader literacy) outcomes. The magnitude of the parameter estimate will indicate the estimated size of the impact of PAVE participation on average student growth.
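
A minimal sketch of the three-level growth model described above follows; TIME denotes the measurement occasion, and the remaining notation is illustrative rather than taken from the formal analysis plan.

Level-1 (occasion t for student i in school j):
$$Y_{tij} = \pi_{0ij} + \pi_{1ij}\,TIME_{tij} + e_{tij}$$

Level-2 (student i):
$$\pi_{0ij} = \beta_{00j} + \beta_{01j} X_{ij} + r_{0ij}, \qquad \pi_{1ij} = \beta_{10j} + \beta_{11j} X_{ij} + r_{1ij}$$

Level-3 (school j):
$$\beta_{00j} = \gamma_{000} + \gamma_{001}\,TREAT_j + u_{00j}, \qquad \beta_{10j} = \gamma_{100} + \gamma_{101}\,TREAT_j + u_{10j}$$

Here γ101 is the estimated impact of PAVE on average linear growth, and γ001 is the impact on the average score at the occasion coded TIME = 0.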


Impacts of PAVE on Teachers. The impact of the PAVE intervention on teacher and classroom practices, controlling for teacher and school characteristics, will be estimated using a multilevel model in order to account for the clustering of two teachers per school. The model will include a teacher level (Level-1) and a school level (Level-2). Because of the limited degrees of freedom at Level-1 (due to sampling only two teachers per school), we will control for teacher characteristics at the school level. For each teacher characteristic, we will calculate the average value for the school.


We will include a school-level indicator of treatment status in the Level-2 model in order to estimate whether there is any significant impact of the PAVE treatment on a specified teacher outcome. Thus, a positive and significant parameter estimate would indicate that the PAVE intervention does influence how teachers conduct classroom activity and implement program features.
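
A minimal sketch of this teacher-level model follows, with illustrative notation (Z̄ denotes the school mean of a teacher characteristic, as described above).

Level-1 (teacher i in school j):
$$Y_{ij} = \beta_{0j} + r_{ij}$$

Level-2 (school j):
$$\beta_{0j} = \gamma_{00} + \gamma_{01}\,TREAT_j + \gamma_{02}\,\bar{Z}_j + u_{0j}$$

where γ01 is the estimated impact of PAVE on the specified teacher or classroom practice outcome.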


  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The expiration date for OMB approval will be displayed.


  18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.


No exceptions to the certification statement are requested or required.


References


Campbell, J. M., Bell, S. K., & Keith, L. K. (2001). Concurrent validity of the Peabody Picture Vocabulary Test-Third Edition as an intelligence and achievement screener for low SES African American children. Assessment, 8(1), 85-94.


Coyne, M. D., Simmons, D. C., Kame'enui, E. J., & Stoolmiller, M. (2004). Teaching vocabulary during shared storybook readings: An examination of differential effects. Exceptionality, 12, 145-162.


Dunn, L. M., & Dunn, L. M. (2007). Peabody Picture Vocabulary Test-Fourth Edition (PPVT-4). Bloomington, MN: Pearson Assessments.


Girolametto, L., & Weitzman, E. (2002). Responsiveness of child care providers in interactions with toddlers and preschoolers. Language, Speech, and Hearing Services in the Schools, 33, 268-281.


Graue, E. (1999). Diverse perspectives in kindergarten contexts and practices. In R. C. Pianta & M. J. Cox (Eds.), The transition to kindergarten. Baltimore: Paul H. Brookes Publishing.


Guarino, C. M., Hamilton, L. S., Lockwood, J. R., & Rathburn, A. H. (2006). Teacher qualifications, instructional practices, and reading and mathematics gains of kindergarteners (NCES 2006-031). U.S. Department of Education. Washington, DC: National Center for Education Statistics.


Heaviside, S., & Farris, E. (1993). Public school kindergarten teachers' views on children's readiness for school (NCES 93-410). Washington, DC: U.S. Department of Education, National Center for Education Statistics.


Restrepo, M. A., Schwanenflugel, P. J., Blake, J., Neuharth-Pritchett, S., Cramer, S., & Ruston, H. (2006). Performance on the PPVT-III and the EVT: Applicability of the measures with African American and European American preschool children. Language, Speech, and Hearing Services in Schools, 37, 17-27.


Robbins, C., & Ehri, L. C. (1994). Reading storybooks to kindergartners helps them learn new vocabulary words. Journal of Educational Psychology, 86(1), 54-64.


Schwanenflugel, P. J., Hamilton, C. E., Bradley, B. A., Ruston, H. P., Neuharth-Pritchett, S., & Restrepo, M. A. (2005). Classroom practices for vocabulary enhancement in prekindergarten: Lessons from PAVEd for Success. In E. H. Hiebert & M. L. Kamil (Eds.), Teaching and learning vocabulary: Bringing research to practice (pp. 155-178). Mahwah, NJ: Lawrence Erlbaum Associates.





Schwanenflugel, P. J., Hamilton, C. E., Neuharth-Pritchett, S., Restrepo, M. A., Bradley, B. A., & Ruston, H. P. (under review). PAVEd for Success: An evaluation of a comprehensive preliteracy program for 4-year-old children. Manuscript submitted for publication, University of Georgia, Athens, GA.


Smith, J., Brooks-Gunn, J., & Klebanov, P. (1997). Consequences of living in poverty for young children's cognitive and verbal ability and early school achievement. In G. Duncan & J. Brooks-Gunn (Eds.), Consequences of growing up poor (pp. 132-189). New York: Russell Sage Foundation.


Storch, S. A., & Whitehurst, G. J. (2002). Oral language and code-related precursors to reading: Evidence from a longitudinal structural model. Developmental Psychology, 38(6), 934-947.


Tabors, P. O., Snow, C. E., & Dickinson, D. K. (2001). Homes and schools together: Supporting language and literacy development. In D. K. Dickinson & P. O. Tabors (Eds.), Beginning literacy with language: Young children learning at home and school (pp. 313-334). Baltimore, MD: Paul H. Brookes Publishing Co.


Williams, K. T. (2007). Expressive Vocabulary Test-Second Edition (EVT-2). Bloomington, MN: Pearson Assessments.


Woodcock, R. M. (1998). Woodcock Reading Mastery Test-Revised/Normative Update (WRMT-R/NU). Bloomington, MN: Pearson Assessments.



1 The Woodcock Reading Mastery Test-R/NU is a copyright-protected instrument; thus it cannot be included as an appendix to this document.

2 According to the salary schedule, administrative staff salaries vary depending on job role and years of experience. Based on three to five years of experience for job titles such as Coordinator I, Supervisor II, Assessment Specialist, and Research Analyst, an approximate average annual salary is $38,440. The hourly rate was calculated based on the assumption that administrative staff work 40 hours/week during 10 months/year (and that a month is 4.25 weeks).
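
As a worked illustration using only the figures stated in this footnote, the implied hourly rate is approximately:

$$\frac{\$38{,}440}{10 \text{ months} \times 4.25 \text{ weeks/month} \times 40 \text{ hours/week}} = \frac{\$38{,}440}{1{,}700 \text{ hours}} \approx \$22.61 \text{ per hour}$$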

3 The Woodcock Reading Mastery Test-R/NU.

4 There is no classroom-level equation in the model due to our earlier assertion that we do not expect much variance in student achievement between kindergarten classrooms within a given school.
