Language Instruction Educational Programs (LIEPs): Lessons from the Research and Profiles of Promising Programs

OMB: 1875-0259

PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS



B1. Site Selection Sampling Methods


A purposive pool of 37 promising schools and/or school districts was assembled in the first phase of the study. From this pool, 20 sites will be selected for the second phase. Identification and selection follow a two-tiered process. The first tier consisted of identifying schools and/or districts (1) with existing LIEPs that have already demonstrated successful outcomes, including making AYP for EL students or substantial progress on AMAOs for ELs; (2) that qualify as Title III districts or schools; and (3) whose programs have been in place for at least three years. These schools and/or districts were identified from existing online outcome data from sources such as EDFacts, available data on progress toward state or district goals for ELP, and nominations from State and district Title III directors.

The request for nominations was made through informal outreach to State Title III officials via ED's monthly Title III Directors webinar. Specifically, the research team was invited by the ED Program Office to provide an overview of the study and its intended outcomes during this ED-sponsored monthly webinar with State Title III Directors. The research team indicated that nominated sites should represent a range of LIEP types (i.e., English immersion, bilingual, newcomer); reflect variation in EL student diversity (i.e., various language groups and countries of origin); have a significant EL population; and include urban, rural, and suburban schools in varied geographic locations. Additionally, nominated sites should be implementing a promising LIEP, have adequate resources and supports in place to sustain the program, and have a system in place for evaluating program success. During the webinar, the research team requested that nominations be communicated to the research team director or submitted via the Program Office listserv. Once nominations were received, the research team gathered student outcome data to validate the nominated sites.

In tier two, the overall pool of identified schools will be reduced to a sample of 20 sites based on contextual factors applied to capture a varied sample. The scope of work developed by ED PPSS limits the number of sites to a maximum of 20. Because the goal of the study is to provide detailed information about the widest possible variety of successful EL programs, it is neither necessary nor possible, given the small sample size, to create a statistically representative sample through random sampling from the available pool. Rather, a purposive sample will be drawn with the goal of obtaining a variety of examples that demonstrate program components across the full range of demographic, programmatic, and external environments. Therefore, the 20 sites will be selected with the aim of including, where possible, examples within each group for each of the relevant contextual factors; a simple illustration of this coverage-based selection appears after the list below. Based on the existing literature on educational impact factors and the expert opinion of the Advisory Panel, the contextual factors are:


  • Urbanicity (inner city, suburban, rural)

  • Grade levels (elementary, middle, high school)

  • Languages served – some of the dominant languages as well as other languages served (e.g., Spanish, Asian languages, Native American languages, other European languages, African languages)

  • Poverty level – a variety of levels, including some sites above 50 percent and at least one or two above 75 percent

  • Minority percentages

  • Consolidated State Performance Report (CSPR) model types – variety captured across major categories (immersion, bilingual, newcomer) as well as specific approaches (sheltered instruction, transitional bilingual education, developmental bilingual education, etc.)
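
For illustration only, the following minimal sketch shows one way the coverage-based selection described above could be operationalized. The site records, factor levels, and greedy heuristic are hypothetical placeholders; the study's actual selection also incorporates Advisory Panel judgment and is not an automated procedure.

```python
# Hypothetical candidate pool: each nominated site is described by the
# contextual factors listed above (all values are illustrative).
CANDIDATES = [
    {"site": "District A", "urbanicity": "inner city", "grades": "elementary",
     "language": "Spanish", "poverty": ">75%", "model": "bilingual"},
    {"site": "District B", "urbanicity": "rural", "grades": "high school",
     "language": "Native American", "poverty": ">50%", "model": "immersion"},
    # ... remaining nominated sites ...
]

FACTORS = ["urbanicity", "grades", "language", "poverty", "model"]
TARGET = 20  # maximum number of sites allowed by the ED PPSS scope of work


def select_sites(candidates, target=TARGET):
    """Greedily pick sites so each factor level is represented where possible."""
    selected, covered, pool = [], set(), list(candidates)

    def gain(site):
        # Number of not-yet-covered factor levels this site would add.
        return sum((f, site[f]) not in covered for f in FACTORS)

    while pool and len(selected) < target:
        best = max(pool, key=gain)
        pool.remove(best)
        selected.append(best)
        covered.update((f, best[f]) for f in FACTORS)
    return selected


if __name__ == "__main__":
    for site in select_sites(CANDIDATES):
        print(site["site"])
```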


B2. Information Collection Procedures


For phase two, the LIEPs at the 20 selected sites will be fully described using a proposed set of interview protocols and an observation instrument. Site visit team members will be trained in observation techniques and in interview and focus group facilitation. A team of two evaluators will conduct each four-day site visit. During each visit, the team will visit the participating school(s) and observe up to six activities at each school. Additionally, the team will conduct interviews with each of the respondent groups (except for State Title III interviews, which will be conducted by phone prior to the visits). At the start of each interview, all participating respondents will receive and read the informed consent document. The interviewer will review the form, pointing out the voluntary nature of participation as well as the confidentiality assurances. Each respondent will sign the form and retain a blank copy for reference. All interviews will be recorded as a back-up to the researchers' note-taking. Two evaluators will be present for all data-gathering activities to maximize the number of relevant data points observed and captured; this approach also supports inter-rater reliability. All activity observations will be documented using the observation instrument, which is designed to capture descriptive information on the practical application of program parameters along 5 of the 12 key characteristics (i.e., staffing and professional development, collaboration, curriculum and instruction, support for implementation, and family and community involvement) described in the existing literature as essential to program success. The remaining 7 key characteristics (political context, approach to external pressures, organizational context, leadership and advocacy, use of data, evaluation, and accountability) are not readily observable given the observation types associated with this study; however, all 12 characteristics are addressed through the interview protocols. It should be noted that this is a qualitative study and as such will not involve statistical techniques that generate broadly generalizable inferences. Rather, the data collected will be categorized by common themes and discussed in case study format in the resulting Guide.
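
As a purely illustrative sketch, the mapping of the 12 key characteristics to the instruments that address them can be expressed as a simple lookup. The characteristic names come from the paragraph above; the data structure itself is hypothetical and is not part of the study's instruments.

```python
# Characteristics captured by the observation instrument (and interviews).
OBSERVABLE = {
    "staffing and professional development",
    "collaboration",
    "curriculum and instruction",
    "support for implementation",
    "family and community involvement",
}

# Characteristics addressed only through the interview protocols.
INTERVIEW_ONLY = {
    "political context",
    "approach to external pressures",
    "organizational context",
    "leadership and advocacy",
    "use of data",
    "evaluation",
    "accountability",
}


def instruments_for(characteristic: str) -> list[str]:
    """Return the data sources that cover a given key characteristic."""
    if characteristic in OBSERVABLE:
        return ["observation instrument", "interview protocols"]
    if characteristic in INTERVIEW_ONLY:
        return ["interview protocols"]
    raise ValueError(f"unknown characteristic: {characteristic}")
```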


a. Statistical Methodology


This study involves the collection of qualitative data. A discussion of statistical methodology is therefore not applicable to this study.


b. Analysis Methods


All data collected will be subjected to qualitative analysis procedures. Using text-recognition software (e.g., PCAP analysis software) and coding processes, open-ended information gathered through the interviews and observations will be categorized into program features. The coded data will then be clustered to identify common characteristics of successful programs and characteristics of programs within specific contexts – for example, characteristics of programs in high-poverty areas, or of programs following an immersion model.
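
The following short sketch illustrates this coding-and-clustering step. The codebook, keywords, and grouping heuristic are hypothetical placeholders, not the study's actual codebook or software workflow.

```python
from collections import defaultdict

# Hypothetical codebook: each code maps to keywords that trigger it.
CODEBOOK = {
    "professional_development": ["training", "coaching", "workshop"],
    "family_involvement": ["parent", "family", "community liaison"],
    "curriculum": ["curriculum", "instructional materials", "pacing"],
}


def code_segment(text):
    """Assign every matching code to one open-ended text segment."""
    lowered = text.lower()
    return {code for code, keywords in CODEBOOK.items()
            if any(kw in lowered for kw in keywords)}


def cluster_by_code(site_segments):
    """Group sites under each code so cross-site themes can be compared."""
    clusters = defaultdict(set)
    for site, segments in site_segments.items():
        for segment in segments:
            for code in code_segment(segment):
                clusters[code].add(site)
    return clusters


# Illustrative interview excerpts (placeholder data).
data = {
    "Site 1": ["Teachers attend monthly coaching sessions."],
    "Site 2": ["A community liaison meets with parents weekly."],
}
print(dict(cluster_by_code(data)))
# {'professional_development': {'Site 1'}, 'family_involvement': {'Site 2'}}
```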


c. Degree of Accuracy Needed


The research team will do everything possible to maximize the accuracy of the collected qualitative data. First, we will pilot our instruments and adjust them accordingly. All site visit data collectors (a subset of the full research team) will attend a one-day, in-depth training that familiarizes them with the instruments and trains them in interview and observation techniques. Site visits will be conducted by two-person teams to support inter-rater reliability. While on site, interviews will be recorded and also transcribed by hand by one of the team members. All site visit team members will be required to prepare draft case study summaries within one week of each visit. For data verification and analysis, the data analysis staff (a subset of the full research team) will develop a codebook for the study, and each coder will participate in coding training.
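
One common way to quantify inter-rater agreement between the two coders on a team is Cohen's kappa; the sketch below, with placeholder ratings, is illustrative only, since the source does not specify which agreement statistic the study will use.

```python
from collections import Counter


def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the two raters coded independently.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e) if p_e < 1 else 1.0


# Placeholder codes assigned by two coders to the same ten segments.
a = ["PD", "PD", "FAM", "CUR", "PD", "FAM", "CUR", "CUR", "PD", "FAM"]
b = ["PD", "FAM", "FAM", "CUR", "PD", "FAM", "CUR", "PD", "PD", "FAM"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # kappa = 0.70
```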


B3. Methods to Maximize Participation Rates


The nomination process involving State Title III directors has served as an important incentive for prospective sites to participate in the study's site visits. Because of this process, we anticipate 100 percent participation. However, as added assurance to maximize participation rates among the participating sites, the following activities will take place.


  1. Prior to site visits, the U.S. Department of Education will send participating sites a letter with information about the study and its importance for the field. This letter will also include information on the major topics addressed in the interviews, the purpose of the observations, and how to learn more about the study from ED.

  2. SEI and edCount will send all participating sites an email with an explanation of the study, an overview of the site visit process, and an email address and phone number to contact with questions. Participating sites will be briefed on confidentiality and how the data will be used, and will be introduced to their specific site visit team members at the appropriate time. Participating sites will work with the SEI/edCount team's Site Visit Coordinator to arrange the logistics for the visit during the eight-week period preceding each visit. Reminder emails and/or calls will be made to each site at least three weeks, and again one week, prior to the visit to confirm all arrangements.


An important challenge in conducting the site visits will be to ensure continued support for study participation during the time lapse between site selection (April-May 2011) and site visits (September 2011). In many schools and districts, staff turnover between school years can be high; leaders who originally supported participation may no longer be with the district in that capacity. To address this issue, the research team will identify an individual at each site to serve as the main point of contact (POC) and will remain in regular contact with the POCs over the spring and summer months preceding the visits. We will provide the POCs with updates and work with them to prepare for the site visits so that the visits run as efficiently and smoothly as possible. This regular, consistent communication will allow the team to respond quickly to any attrition. Specifically, should attrition occur, the research team will maintain a back-up pool of up to five sites that can be substituted for any participating sites that drop out. Using this approach, we anticipate that we will successfully complete all 20 visits.


B4. Pretesting of Instruments


We have conducted internal pretesting of the items designed for these instruments to ensure clarity. We will conduct a pilot site visit with a non-participating school district in the DC Metro area between March and April 2011. No more than nine individuals, in roles similar to those we will sample, will participate in the pilot.

B5. Individuals Consulted on Statistical Aspects of Design


Contact Information: Synergy Enterprises

Rhonda L. Crenshaw, M.A., Project Director

Synergy Enterprises Inc.

8757 Georgia Ave. Suite 1440

Silver Spring, MD 20910

[email protected]

(240) 485-1700


Kathy Zantal-Wiener, Ph.D., Study Design and Data Collection Team Leader

Synergy Enterprises Inc.

8757 Georgia Ave. Suite 1440

Silver Spring, MD 20910

[email protected]

(240) 485-1700


Kate Tindle, Ed.D., Study Outreach Team Leader

Synergy Enterprises Inc.

8757 Georgia Ave. Suite 1440

Silver Spring, MD 20910

[email protected]

(240) 485-1700


Deborah Lessne, M.A., Study Data Analyst

Synergy Enterprises Inc.

8757 Georgia Ave. Suite 1440

Silver Spring, MD 20910

[email protected]

(240) 485-1700


Contact Information: edCount

Ellen Forte, Ph.D., Principal Investigator

edCount, LLC

5335 Wisconsin Avenue NW

Suite 440

Washington, DC 20015

[email protected]

(202) 895-1502


