
The Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten


OMB: 1850-0846




Regional Educational Laboratory-Southeast at SERVE Center



Date: November 19, 2007

To: Karen Matsuoka, OMB; Gil Garcia, IES; Amy Feldman, IES

From: REL-SE Vocabulary Study Team

Subject: OMB Questions on 200708-1850-007: The Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten



The following are responses to the questions raised by OMB on the supporting statement for the Evaluation of the Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten (Vocabulary Study). We address each question separately below, and have revised Parts A and B of the supporting statement accordingly (see attached).


  1. Is ED requesting an exception to the incentive policy? OMB would like references to “the possibility of providing monetary payments to compensate schools and teachers…” removed. Also, there should be no payments (except PD) to treatment teachers.


ED is not requesting an exception to the incentive policy. All references to “providing monetary payments to compensate schools and teachers” have been removed from the responses to questions A9 and B3. The statements referring to incentives appeared in an earlier draft and were included in error; the version we intended to submit to OMB for review had them removed.


  2. Why do you need to pay a school employee to function as a liaison? Isn’t that person paid by the school? Are you asking them to work after hours? If so, why?


As above, statements referring to paying a school employee for acting as the study liaison were carried over from an earlier draft in error. We have removed all references to such payments from the responses to A9 and B3.


  3. Please ensure that the standard PRA blurb is on the materials.


The following has been placed on the teacher and paraprofessional demographic questionnaires and the child extant data form:






OMB Clearance Number: xxxx-xxxx Expiration Date: xx/xx/xxxx


Estimates of Burden for the Collection of Information.

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this collection is xxxx-xxxx. The time required to complete this information collection is estimated to average xx minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collected.



  4. Please revise the race/ethnicity question to comply with OMB standards.


We have revised the question in the Teacher Demographic Questionnaire and the Paraprofessional Demographic Questionnaire to comply with OMB standards. The revised questionnaires are included in the appendices of Parts A and B of the Supporting Statement.


  5. Please cite the confidentiality statute (ESRA) directly when pledging confidentiality on the various forms.


The teacher consent form, parent permission form, school district agreement, and school agreement have been edited to ensure that they include the following pledge of confidentiality that explicitly references the ESRA statute:


All information from this study will be kept confidential as required by the Education Sciences Reform Act of 2002 (Title I, Part E, Section 183). To ensure privacy, identification numbers will be used on the forms rather than names. All of the information that is collected will be stored separately from school records in a secure location and will be destroyed three years after the project ends. Written records will be shredded, and electronic files will be purged.


Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies you, your school, or your district to anyone outside the study team, except as required by law.


  6. Parental consent form – It is not acceptable to cite the confidentiality of the ESRA and to use its associated pledge (as stated on form) and then indicate that assessment results will be shared with the child’s teacher. Please eliminate that statement from the consent form and provide assurances that the researchers will not violate confidentiality in that manner.


We have eliminated the statement referring to sharing assessment results of very low performers with the child’s teacher from the parental permission form. The researchers will comply with the confidentiality statute (ESRA) and will not violate confidentiality in that manner.


  7. Please clarify specifically what data products will be produced, such as tables, microdata, etc. What disclosure review procedures are in place for such products?


The only data products that will be produced in this study will be part of the reports and journal articles outlined in A16.


The primary concern when sharing data from any study is the protection of human subjects. Data intended for broader use must be free of identifiers that would permit linkage to individual research participants and of variables that could lead to deductive disclosure of participants’ identities. Our proposed data sharing plan therefore includes complete de-identification of the datasets, so that all of the data in the de-identified dataset can be included and shared. Fields that we will consider for de-identification at the variable or field level include comment fields, optional fields, investigator, and site name. De-identifying a dataset means removing the following variables (a brief sketch of this step follows the list):


• Names

• Geographic information (including city, state, and zip code)

• Dates

• Telephone and fax numbers

• Electronic mail addresses
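For illustration only, a minimal sketch of this de-identification step, assuming the data are handled as a pandas DataFrame; the column names are hypothetical stand-ins for the variables listed above:

```python
# A minimal sketch of the de-identification step; column names are
# hypothetical and the actual dataset schema will differ.
import pandas as pd

# Variables treated as direct identifiers and removed before sharing.
IDENTIFIER_COLUMNS = [
    "name",                       # names
    "city", "state", "zip_code",  # geographic information
    "assessment_date",            # dates
    "phone", "fax",               # telephone and fax numbers
    "email",                      # electronic mail addresses
]

def deidentify(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of the dataset with identifier variables removed."""
    present = [col for col in IDENTIFIER_COLUMNS if col in df.columns]
    return df.drop(columns=present)
```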


  8. Please clarify the maximum number of schools that could be in the sample based on the sampling procedures discussed in part B.


At this point, we cannot provide the exact maximum number of schools that will be in the sample based on the sampling procedures discussed in B1. Our goal, based on the power analysis for the study, is to have between 60 and 80 schools. As noted in part B1, we will attempt to recruit every elementary school with kindergarten in the universe of districts. If every one of the 84 schools with kindergarten in the targeted Mississippi Delta districts were willing and able to participate in the evaluation, then the maximum number of schools that could be in the sample would be 84.


However, we anticipate that some Delta schools (and/or districts) may be unwilling or unable to participate. If fewer than 60 schools in the Delta agree to participate, we will expand the targeted recruitment area to neighboring and demographically similar districts. Districts will be added one at a time to the universe. As in the Delta, we will attempt to recruit every school with kindergarten in each newly added district. Once we have sampled at least 60 schools, we will stop adding new districts from the surrounding area to the universe. This sampling approach gives us a coherent, complete frame in which the results can be interpreted as covering all willing and able schools in the included districts.


For example, if we recruit 50 schools from the Delta, we will expand recruitment to neighboring and demographically similar districts outside the Delta. If the first district added to the universe includes 8 schools with kindergarten and we are able to recruit 5 schools, we would then have a sample of 55 schools, at which point we would add another district to the universe. If that district had 6 schools with kindergarten, we would attempt to recruit all 6. In that case, the maximum number of schools that could be in the sample would be 61. However, if we were only able to recruit 4 schools, we would then have a sample of 59 schools and would add an additional district to the universe. If the new district had 12 schools with kindergarten, we would attempt to recruit all 12. In that case, the maximum number of schools that could be in the sample would be 71. Our schedule for recruitment during the current school year (SY 2007-2008) allows sufficient time for this staged roll-out, as we will not begin the intervention until SY 2008-2009.
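To make the staged roll-out concrete, the following is a minimal sketch of the recruitment logic, using the hypothetical district figures from the example above; all numbers are illustrative only:

```python
# A minimal sketch of the staged recruitment roll-out described above.
# District school counts and recruitment results are hypothetical.

TARGET = 60  # stop adding districts once at least 60 schools are sampled

def staged_rollout(delta_recruited, neighboring_districts):
    """Add districts one at a time until the sample reaches TARGET schools.

    neighboring_districts: (schools_with_kindergarten, schools_recruited)
    pairs, in the order districts would be added to the universe.
    Returns (final_sample_size, maximum_possible_sample).
    """
    sample = delta_recruited
    max_possible = delta_recruited
    for schools_with_k, recruited in neighboring_districts:
        if sample >= TARGET:
            break  # stop expanding the universe
        max_possible = sample + schools_with_k  # if every school agreed
        sample += recruited
    return sample, max_possible

# Mirrors the example: 50 Delta schools; then districts with 8 schools
# (5 recruited), 6 schools (4 recruited), and 12 schools (all recruited).
print(staged_rollout(50, [(8, 5), (6, 4), (12, 12)]))  # -> (71, 71)
```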


This is a hypothetical example to illustrate how the maximum number of schools that could be in the sample will vary depending on the total number of schools eventually drawn into the universe. In the end, we will have a study that tests the effectiveness of the PAVE intervention in a diverse but conceptually coherent set of Mississippi Delta-area school districts, in which kindergarten vocabulary and early language development face particular challenges.


We have included this hypothetical example in section B1 to better illustrate the sampling process.


  9. What is the basis for the teacher and student response rate estimates?


With students, we will be conducting individual child assessments, which will include standardized measures of achievement and a language sample. Student assessments will be conducted one-on-one with a researcher in a familiar location in the school. All student measures will be collected during a single assessment period lasting 40 minutes to an hour. Students will be assessed on three occasions over a two-year period.


As noted in part B (sections B1 and B3), we expect a student response rate for the individual child assessments of at least 85% at the end of the first grade follow-up one year after the intervention. This assumption is based on our prior experience with a study using similar recruitment and data collection procedures that also looked at the impact of a school-based intervention on elementary students’ performance on measures of achievement. This study, the Evaluation of the School Breakfast Program Pilot Project, had response rates of 80 to 100 percent, depending on the measure, with the higher rates for the standardized assessments that were done in the school setting, as will be the case in the current study.


Data collection involving both treatment and control teachers includes a brief demographic questionnaire at the start of the intervention year and two classroom observations (in the fall and the spring of the intervention year). In addition, during the subsequent school year, we will collect additional data from treatment teachers; we will observe the classroom (for a fidelity assessment) and interview teachers briefly about the challenges of implementing the PAVE intervention.


Among teachers in the sample, we expect a high response rate for the classroom observations and demographic questionnaires during the intervention year (when intervention impacts on teachers will be assessed), largely because the sample will be composed of consenting teachers. Because all teachers in the sample will have consented to participate, they are likely to respond at a higher rate than a sample selected prior to obtaining consent.


As noted in part B3, we anticipate a lower response rate (approximately 80%) for the fidelity assessment and interview with treatment teachers in the year following the intervention. However, this lower response rate will affect only our investigation of the sustainability of the PAVE intervention among treatment teachers, not our examination of PAVE impacts.


  10. Will the duplicate data sets be under the same management and security controls? Why is creating duplicate datasets necessary for quality control? We have not seen other RELs proposing this approach.


To clarify the statement in section B2a: we were referring to ensuring the accuracy of data entry. Data entry will be completed independently by two different members of the research team. Any inconsistencies between the two entries will be identified and rectified in a master database. Once errors are corrected, any duplicate copies of the database will be deleted. While in existence, any and all datasets will be under the same management and security controls. Double entry is a fairly common quality control measure, as it catches common keyboarding errors by data entry staff; however, if OMB is concerned about this procedure, we will have the data entered only once.
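As an illustration, a minimal sketch of how the two independently keyed files might be compared; the file names, the record_id key column, and the CSV format are hypothetical:

```python
# A minimal sketch of double-entry verification; file names and the
# record_id key column are hypothetical.
import pandas as pd

entry_a = pd.read_csv("observation_entry_a.csv").set_index("record_id").sort_index()
entry_b = pd.read_csv("observation_entry_b.csv").set_index("record_id").sort_index()

# compare() keeps only the cells where the two entries disagree; each
# discrepancy is then checked against the written document and corrected
# in the master database.
discrepancies = entry_a.compare(entry_b)
print(discrepancies)
```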


The original text in section B2a, which described the process of transferring written data collected during the classroom observation into an electronic database, appeared as follows:

Observers will document teachers’ instructional practices on a written document. Codes from the written document will be entered into an electronic database. To ensure accuracy, all data will be entered again into a duplicate electronic database. Data in both databases will be compared, and all discrepancies will be identified and rectified.

For clarity, we removed the confusing third sentence: “To ensure accuracy, all data will be entered again into a duplicate electronic database.” The edited paragraph in section B2a has been modified to read:

Observers will document teachers’ instructional practices on a written document. Codes from the written document will be entered into an electronic database.


  11. Please clarify when in the process the teacher consent form is provided to the teachers.


After schools have agreed to participate in the study, but prior to random assignment, the teacher consent form will be provided to teachers. Random assignment will not take place until after teacher consent forms have been obtained.


A statement indicating that teacher consent will be obtained prior to random assignment was added to section B1, in a footnote, and to the main text in section B3.


  12. Is the language for the teacher consent, parent consent, and script to the child based on other studies? If not, have they been pretested?


The language for the teacher consent and parent permission forms and the script for obtaining children’s assent for the assessments were pretested as part of the pilot test in Georgia described in section B4. We did not encounter any problems with the language on the forms or in the script for obtaining children’s assent for the assessments. We have edited the description of the pilot test in B4 to indicate that the forms and the script were successfully pretested. In addition, the script for obtaining children’s assent to participate was used successfully in the previous evaluation of PAVEd for Success in Georgia prekindergarten.


Much of the language in the teacher consent and parent permission forms is dictated by Institutional Review Boards, including the language on confidentiality; the voluntary nature of participation and the ability to withdraw from the study; the handling of collected data, including the destruction of paper and electronic files after a specified time; and contact points.


  13. Who developed PAVE?


The researchers who developed PAVE are Claire E. Hamilton (University of Massachusetts), Paula J. Schwanenflugel (University of Georgia), Stacey Neuharth-Pritchett (University of Georgia), and M. Adelaida Restrepo (Arizona State University). Drs. Schwanenflugel and Neuharth-Pritchett will be implementing the intervention as part of this study.


We have added a footnote to the introductions in part A and in part B identifying the intervention developers.


  14. Can we get a copy/link to the prior evaluation?


Yes. We are attaching a copy of the manuscript submitted for publication to the Journal of Literacy Research on September 14, 2007. The reference for the manuscript, which reports results from the evaluation of PAVE in Georgia Pre-K, is:

Schwanenflugel, P. J., Hamilton, C. E., Neuharth-Pritchett, S., Restrepo, M. A., Bradley, B. A., & Ruston, H. P. (under review). PAVEd for Success: An evaluation of a comprehensive preliteracy program for 4-year-old children. Manuscript submitted for publication, University of Georgia, Athens, GA.


In addition, we are attaching a copy of the following chapter, which reported initial results from the evaluation of PAVE in Georgia Pre-K:

Schwanenflugel, P. J., Hamilton, C. E., Bradley, B. A., Ruston, H. P., Neuharth-Pritchett, S., & Restrepo, M. A. (2005). Classroom practices for vocabulary enhancement in prekindergarten: Lessons from PAVEd for Success. In E. H. Hiebert & M. L. Kamil (Eds.), Teaching and learning vocabulary: Bringing research to practice (pp. 155-178). Mahwah, NJ: Lawrence Erlbaum Associates.


  15. The package makes the assertion that routine instructional content in kindergarten includes phonological awareness, alphabet, and print components. On what is this assertion based? What if this does not turn out to be the case in the classrooms in the study? Do you plan on evaluating the impact on outcomes other than those related to vocabulary?


In the Overview section of part A, we have clarified the discussion of the adaptation of the PAVE intervention for kindergarten. We have edited the referenced paragraph as indicated below to specify the basis for the assertion that phonological awareness, alphabet, and environmental print are routinely covered in kindergarten. In addition, we have provided information about the Mississippi Language Arts Curriculum Framework, which includes kindergarten standards for letter and word recognition, as well as phonological and phonemic awareness.


For the current project, the PAVE intervention is adapted for kindergarten and modified to focus primarily on vocabulary learning. Other areas of the PAVE prekindergarten program (i.e., alphabet, phonological awareness, and environmental print) are routinely covered as part of kindergarten language and literacy instruction and therefore are not included in the kindergarten professional development program. For example, the phonological awareness program adopted as part of PAVE was a popular kindergarten program called Phonological Awareness for Young Children (Adams et al., 1999). In addition, according to estimates by kindergarten teachers from across the United States, a nearly equal amount of time is spent on teacher-directed instruction in reading, numbers, and the alphabet (Heaviside & Farris, 1993; Guarino et al., 2006). Furthermore, the Early Childhood Longitudinal Study-Kindergarten Cohort measured basic reading skills, including recognizing the printed word (i.e., both orthographic and phonological skills), vocabulary, and reading comprehension (Denton, West, & Walston, 2003). Results from the ECLS-K study indicated that about two-thirds of kindergartners in the U.S. knew the letters of the alphabet upon kindergarten entry, with one-third knowing letter-sound relationships (an aspect of phonological awareness). Thus, alphabet and phonological awareness appear to be either skills that children bring with them to kindergarten or skills that teachers address as part of general literacy instruction.


Furthermore, the Mississippi Language Arts Curriculum Framework (http://www.mde.k12.ms.us/acad1/frameworks/LA_Framework_Introduction.pdf) explicitly outlines standards for kindergarteners’ letter and word recognition, as well as phonological and phonemic awareness. Specific objectives include:

    • The student will apply knowledge of concepts about print.

(involves demonstrating book concepts; matching spoken words to print; tracking words from left to right; distinguishing letters from words; distinguishing upper and lower-case letters);

    • The student will apply knowledge of phonological and phonemic awareness.

(involves recognizing beginning, final, and some medial sounds in spoken words; blending phonemes orally to make spoken words; and segmenting phonemes orally within spoken words); and

    • The student will use word recognition skills.

(involves matching consonant and short vowel sounds to letters; blending letter sounds in one-syllable words; and reading high-frequency and sight words).

According to these standards, alphabet, print, and phonological knowledge are content areas covered as part of the kindergarten language arts curriculum.


In addition to measuring impacts on vocabulary, we will also assess students’ listening comprehension and, in first grade, their broader literacy achievement. Furthermore, in both treatment and control classrooms, we will conduct classroom observations that document teachers’ literacy instructional practices. We will document both vocabulary and non-vocabulary literacy instruction. Based on the classroom observation, we will be able to compare the amount of time devoted to vocabulary and other non-vocabulary literacy instruction in both treatment and control classrooms.




  16. Can we get copies/links to existing studies on vocabulary development that are cited?


Yes. We have attached copies/links to referenced studies on vocabulary development.


  17. Do you plan to collect any information on what the existing professional development in the district(s) looks like?


Our plan was to collect this information from state administrative records; however, we have recently learned that this information is not available from state records. Instead, we will obtain information on existing professional development in the districts as part of the recruitment process. We have developed a set of questions that we will include in our school district recruitment protocol. These questions have been added to part A as Appendix B, and the time for district administrators to respond to the interview has been added to the burden estimate table.


  18. On pg. 10, you assume there will be no “sleeper effects.” What is the research base for this assumption?


We have added the following paragraph to section A6 to explain our assumption that no “sleeper effects” of important magnitude will occur:


The basis for not anticipating “sleeper effects” is that vocabulary and other literacy skills develop in a sequential fashion: early skills lay the foundation for later development. Correlational research consistently finds that early oral language, vocabulary, and other preliteracy skills are related to later language and literacy skills, including better reading comprehension (Storch & Whitehurst, 2002; Tabors, Snow, & Dickinson, 2001). The theory behind intervening early to improve children’s vocabulary skills (or abilities in other domains, for that matter) is that doing so will put children who are at risk of poor outcomes on a better trajectory toward successful outcomes. However, children who still lag behind after receiving an intervention like PAVE, without any apparent benefit relative to control group students, can be expected, based on this research, to remain behind in subsequent years, with little or no room for late-emerging impacts. Without improvement in foundational vocabulary skills, children cannot keep pace in their development of more advanced vocabulary and other literacy skills supported by a strong vocabulary.


  19. Is information available on any other major interventions being implemented in the district/schools involved in the study during the study period? Do you think this could affect the willingness of schools to participate?


We have anticipated that schools’ involvement with other reading interventions could affect both intervention impacts and willingness to participate. For this reason, we are stratifying the schools based on their prior experience with other reading interventions and then randomizing schools within each stratum (i.e., with and without involvement in other reading initiatives) to ensure that there are equal numbers of schools from each stratum in both treatment and control conditions. If schools with and without experience with other reading initiatives participate at unequal rates, we will note this fact and interpret our overall findings as characterizing the contribution of PAVE in schools engaged in other recent or concurrent reading initiatives, which is the most prevalent context for the vocabulary-focused intervention in our sample and likely in other areas of the nation as well. Also, because treatment and control conditions will be equally affected by unequal participation rates of schools that vary in their prior experience with other reading initiatives, the internal validity of the study will not be threatened.
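For illustration only, a minimal sketch of stratified random assignment along these lines; the school identifiers and the two-stratum flag are hypothetical:

```python
# A minimal sketch of stratified random assignment; schools is a list of
# (school_id, has_prior_reading_initiative) pairs, both hypothetical.
import random

def assign_within_strata(schools, seed=0):
    """Randomly split each stratum as evenly as possible between conditions."""
    rng = random.Random(seed)
    assignment = {}
    for stratum in (True, False):  # with / without prior reading initiatives
        members = [sid for sid, flag in schools if flag == stratum]
        rng.shuffle(members)
        half = len(members) // 2
        for sid in members[:half]:
            assignment[sid] = "treatment"
        for sid in members[half:]:
            assignment[sid] = "control"
    return assignment
```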


Approximately 60 of the 84 target schools in the Mississippi Delta are implementing other reading interventions, including Reading First, the Mississippi Reading Sufficiency Program, and the Barksdale Reading Initiative. None of these reading interventions is being evaluated, so there are no research demands on participating schools that could interfere with their willingness to participate in this study.


In our preliminary recruitment meetings with Mississippi state and district administrators, no concerns have been expressed to suggest that schools’ involvement with other reading interventions will reduce their ability or willingness to participate. State education administrators in Mississippi also stated that they do not know how fully any of the other interventions are being implemented, and in any case, the PAVE intervention is complementary to these other initiatives. They felt that PAVE’s targeting of vocabulary addresses an unmet need in many areas of the state, including the Delta, with or without these other initiatives. The Mississippi Department of Education liaison for the study is the Director of Mississippi Reading First, and she is extremely supportive of PAVE within the overall context of her state. District administrators, even those involved with other interventions, have expressed interest in being involved in this study.


  20. Please provide the cost of the intervention, both in terms of dollars and in terms of staff (teacher, coordinator) time.


The estimated cost of the intervention for The Effectiveness of a Program to Accelerate Vocabulary Development in Kindergarten, including both dollar costs and staff (teacher and coordinator) time, is $454,608. The intervention costs cover treatment teacher training and follow-up beginning in the summer of 2008 (Year 3) and continuing through the 2008-09 school year, as well as training for the control group in the summer of 2009 (Year 4). This information has been added to section A.14.


  21. Will this study, if it finds an effect, meet the criteria for inclusion in the What Works Clearinghouse?


Yes. A study that employs random assignment, as this study will, would be rated by the What Works Clearinghouse as “Meets Quality Standards.” In addition to randomization, the following criteria must be met:

  1. The study must establish that the treatment and control conditions are comparable at baseline. If the groups are not comparable after random assignment, statistical adjustments must be made in the analysis.

  2. There must not be severe overall attrition or severe differential attrition; however, if post-attrition comparability of groups can be demonstrated, the study is still considered to meet quality standards. If severe overall or differential attrition occurs and results in non-comparable groups, the study could be rated “Meets Quality Standards with Reservations” and would still be included in a review by the What Works Clearinghouse.

  3. There must not be intervention contamination (i.e., something that occurs after the beginning of the intervention that affects the outcome for the treatment or control condition but not both). If intervention contamination occurs, the study would be rated “Meets Quality Standards with Reservations.”

  4. There must be more than one teacher per condition (i.e., no teacher-intervention confound).

  5. The unit of assignment and the unit of analysis must match.


In this study, we will assess the baseline comparability of the groups, and, if necessary, adjust in the analysis for any non-comparability.


We do not anticipate severe overall or differential attrition in this study; however, were severe attrition to occur, the study could still meet criteria for inclusion in the What Works Clearinghouse.


We do not anticipate intervention contamination.


In this study, there will be no teacher-intervention confound, and the unit of assignment and the unit of analysis will match.


  22. Can we learn something about the replicable elements of the intervention rather than just the intervention as a whole (strategies/teaching approaches, etc.)?


We will not be able to test impacts of specific intervention elements (e.g., strategies/teaching approaches) or of varied combinations of intervention elements. We will only be able to examine the impacts of the PAVE intervention as a whole compared to the existing language arts instruction absent all elements of the PAVE intervention.


However, we will examine the extent to which teachers sustain the PAVE intervention in the subsequent school year. We will interview intervention teachers about the aspects of the intervention that have been most useful and easiest to implement, as well as about the elements of the intervention that have been most challenging to implement. In addition, we will ask intervention teachers about which elements of the intervention they plan to continue using.



