Evaluation of State and Local Implementation of Title III Standards, Assessments, and Accountability Systems

Responses to OMB comments

OMB: 1875-0254


Responses to OMB comments and questions on 200904-1875-002: Evaluation of State and Local Implementation of Title III Standards, Assessments, and Accountability Systems 09.11.09

1. General comments/questions

Response

a. There are multiple references throughout the package to additional detail being available in "our next submission." To what is this a reference?

These references will be deleted. They were left over from the initial submission to ED.

b. Student records:


i. What is the status of efforts to identify states that can meet the criteria for student records?

This effort is complete.

ii. Please describe what method was used to determine which states had that information?

We based our analysis of states’ data availability on the study by Maria Perez and Miguel Socias (forthcoming), “What’s Out There? An Overview of Available Data for Education Research” (International Encyclopedia of Education, 3rd Edition, Elsevier). This study provides an up-to-date overview of the student-level, longitudinally linked data systems available in the United States. Another source we consulted was the information collected through the Data Quality Campaign’s annual state survey (http://www.dataqualitycampaign.org/about/partners/managing), which identifies which state data systems feature key elements for longitudinal data analysis. We also double-checked the availability of these data during our initial phone calls (with the pilot of fewer than 10 states) in order to determine states’ final eligibility and willingness to provide data.

iii. What is the status of obtaining those student records?

We have obtained the student achievement data from four of the seven states listed in exhibit 9: the Southern state, the two districts in the Western state, the second Southeastern state, and the Southwestern state. The Northeastern state will upload the datasets later this month. We expect to have the data from the first listed Southeastern state by mid-October. Note that two states, the proposed Mid-Atlantic state and a Midwestern alternate state, did not meet the eligibility criteria and were ultimately not included in the collection due to insufficient data (i.e., their longitudinal student-level data systems were only recently put in place).

c. For what specific data elements will the CSPR, Accountability Workbooks, Title III Biennial, and other Department of Education sources of information be used?

Please see the attached list of extant data elements following this table labeled Comment 1c.

i. Will an effort be made to verify what is reported in these documents?

In state interviews, we will provide state Title III directors with a prefilled data confirmation document and request that they correct or update their state’s reports. We have also arranged to compare the extant data with data that the National Clearinghouse on English Language Acquisition (NCELA) is compiling for the Title III Biennial Report.

ii. The calendar says these reviews will take place after some of the other study activities - wouldn't it be important to review these documents first to get a general sense of the LEP definition, the accountability system, etc.?

We agree. We actually have already reviewed these documents as a result of some earlier discussions with ED.

d. How do you plan to deal with States with multiple definitions of LEP/ELL - for example, one definition for Title III purposes and one definition for Title I purposes or different criteria for exiting the subgroup depending on the program involved? The report will need to be very clear about who is defined as LEP when making comparisons of LEP student performance on content assessments, for example.

Given the focus of the study on Title III, we plan to use the definition of LEP for Title III purposes. As one topic in the upcoming state interviews, we will specifically ask states about their Title III definitions of LEP and will then refer to that definition throughout data collection.

e. When looking at state and local data systems and what data are collected, are there specific data elements you will be looking for in every case?

We assume this refers to the student-level data files and the data elements that were requested from those states. States provided nearly all of the data elements requested for the pilot but varied slightly in their submissions. The elements that we requested for the student-level data files are listed below (an illustrative sketch of one possible file layout follows the list):

  • Unique pseudo student identifiers (consistent over time to replace unique student ID)

  • Unique School/District I.D. or unique pseudo identifiers (consistent over time)

  • Student-level scale scores in state content assessments in reading/English language arts (R/ELA) and mathematics

  • Student-level proficiency levels in state content assessments in reading/English language arts (R/ELA) and mathematics

  • Student-level scale scores in ELP assessments (for each domain and composite score)

  • Student-level proficiency levels in ELP assessments (for each domain and composite score)

  • ELP assessment grade cluster form that was administered

  • Limited English proficient status

  • Information on student’s years-in-program & type of program (if available)

  • If redesignated, date or year of redesignation

  • Student with disability status and special education services received (if available)

  • Test accommodations used

  • Grade

  • Birth Month and Year

  • Gender

  • Primary ethnicity

  • Primary language

  • Eligibility for free or reduced price lunch

  • Years living in the U.S.

  • Years a student has been attending a school in that district or state
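
For illustration only, the following is a minimal sketch of how a long-format student-level file built from these elements might be read and checked on receipt. The column names and the check itself are hypothetical (actual layouts and names vary by state and are agreed upon with each state); the sketch simply shows one plausible organization, with one row per pseudo student per school year.

```python
import pandas as pd

# Hypothetical column names for a long-format file (one row per pseudo student per year).
EXPECTED_COLUMNS = [
    "pseudo_student_id", "pseudo_school_id", "pseudo_district_id", "school_year",
    "grade", "birth_month_year", "gender", "primary_ethnicity", "primary_language",
    "frl_eligible", "years_in_us", "years_in_district_or_state",
    "lep_status", "years_in_program", "program_type", "redesignation_year",
    "disability_status", "test_accommodations",
    "ela_scale_score", "ela_proficiency_level",
    "math_scale_score", "math_proficiency_level",
    "elp_composite_scale_score", "elp_composite_proficiency_level",
    "elp_grade_cluster_form",
]

def check_submission(path: str) -> pd.DataFrame:
    """Read a state submission, flag elements the state did not provide,
    and flag duplicate student-year rows."""
    df = pd.read_csv(path, dtype={"pseudo_student_id": str, "school_year": str})
    missing = [col for col in EXPECTED_COLUMNS if col not in df.columns]
    if missing:
        print(f"Elements not provided in this submission: {missing}")
    duplicates = df.duplicated(subset=["pseudo_student_id", "school_year"]).sum()
    if duplicates:
        print(f"Warning: {duplicates} duplicate student-year rows found")
    return df
```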


If this question is referring to the subgrantee/district survey questions on district-level data systems, the survey will ask whether the district has a data system containing any of the following elements:

  • Current ELL status

  • ELL status in previous school years

  • ELL services received in current year

  • ELL services received in previous years

  • Native language

  • Years living in U.S.

  • Experienced interrupted schooling

  • Academic achievement of former ELLs

  • English language proficiency test scores

  • State content area test scores

  • Identity of ELL’s mainstream teachers

  • Identity of ELL’s specialist teachers

  • Qualifications (e.g., degrees, certifications, endorsements) of mainstream classroom teachers of ELLs

  • Qualifications of specialist teachers (ESL, ELD, bilingual, and LIEP teachers)

2. Supporting Statement - Part A


a. Page 11 - On Page 10, the supporting statement notes that ED will not seek PRA approval for some of the activities, including interviews in the "standards review" study component, and efforts to determine which states can provide sufficient student assessment data. We would normally consider this to be included under the PRA; could ED please explain why the burden is not included for these activities - or include it here?

We did not think that this part of the evaluation needed to be included under the PRA because it was part of the pilot, and our original plan was to contact no more than nine states (if necessary) to determine whether we had the correct standards document. It turned out that we did not need to contact any states for this work. We also did not think that the student assessment data work needed to be included because our plan was to contact no more than nine states to request these data. As it turned out, we needed to contact only eight states.


In both of these activities, it was our understanding that we did not need to include burden hours for data requests involving fewer than 10 respondents.

b. Page 15 - Testing


i. What is the current status of pilot and cognitive testing?

The pilot and cognitive testing are complete. Please see the section following this table labeled Comment 2.b.i for proposed revisions made to two of the protocols as a result of an internal review and further analyses of the pilot testing data.

ii. Has ED examined possible differences among questionnaire modes of administration?

ED required the contractor to include a discussion of the pros and cons of each mode of administration in the proposal and then examined the issue with the study’s technical working group.

ED and the contractor considered three possible administration approaches for the district survey: (1) a telephone survey; (2) a mail survey with telephone follow-up (with the option of telephone administration for a small number of respondents); and (3) an online Web survey with a mailed notice and multiple-mode follow-up (with the options of telephone administration or a mailed hardcopy response for a small number of persistent non-respondents). The advantages of a telephone survey are typically a higher response rate and the ability to collect open-ended responses. The advantages of a mail survey are lower cost than a telephone survey, the ability to ask a larger number of questions, the presentation of a greater number of response options, and the opportunity for a respondent to look up information or ask colleagues for accurate factual and numerical responses. The advantages of a Web survey are lower cost than a telephone survey (Web site development costs are offset by the elimination of some coding and data entry costs), the ability to ask a larger number of questions, the presentation of a greater number of response options, the opportunity for a respondent to look up information or ask colleagues for accurate factual and numerical responses, elimination of the need to carefully interpret skip patterns, constant access to the questionnaire instrument, ease of questionnaire submission, and the ability to automate and customize e-mail follow-up to nonrespondents at specified time intervals.


After considering the options, we have chosen to propose a Web-based survey because (1) the SOW requested a survey that collects concrete, detailed, and factually based information rather than open-ended responses; (2) case study interviews will be available to collect open-ended, text-based descriptions; (3) our experience indicates that this survey may include a substantial number of items that require respondents to look up information; (4) the differences in response rates are negligible if initial non-respondents are offered multiple-mode follow-up options for telephone administration and mail response; and (5) the respondent population of district Title III administrators for this survey is expected to have excellent computer and Web access and appropriate online skills and experience to participate in Web-based surveys (the pilot testing confirmed this expectation).


c. On page 17, in the discussion of student level data please clarify whether ED will be looking at "average growth and length of time required" to attain proficiency on language or content assessments or both.

We will be looking at both.

Analyses of average growth “will be conducted separately for each outcome measure (e.g., four sub domains of ELP assessments, reading, mathematics).” (P. 17 of Supporting Statement Part A)

The analysis on length of time “concerns how much time it takes a LEP student to move from one level of English proficiency to the next. In other words, we will focus not only on the likelihood of being at a certain proficiency level but on how quickly (or slowly) LEP students move to proficient levels. In a similar analysis, we will examine how much time it takes LEP students who attain proficiency on the state ELP assessment to reach a proficient level on the state content assessments.” (P. 18 of Supporting Statement Part A)
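
As a rough illustration of the length-of-time computation described above, the sketch below counts, for each student, the number of years between the first observed ELP assessment and the first year at or above a proficient level. The column names, the long data format, and the proficiency cut (level 5) are hypothetical placeholders; the study's actual analyses will use each state's own proficiency definitions and more elaborate longitudinal models.

```python
import pandas as pd

def years_to_elp_proficiency(df: pd.DataFrame, proficient_level: int = 5) -> pd.Series:
    """For each pseudo student, count years from the first observed ELP assessment
    to the first year at or above `proficient_level` (NaN if never reached)."""
    df = df.sort_values(["pseudo_student_id", "school_year"])

    def elapsed_years(group: pd.DataFrame) -> float:
        reached = group[group["elp_composite_proficiency_level"] >= proficient_level]
        if reached.empty:
            return float("nan")  # right-censored: proficiency not observed in the data window
        first_year = int(str(group["school_year"].iloc[0])[:4])     # e.g., "2005-06" -> 2005
        reached_year = int(str(reached["school_year"].iloc[0])[:4])
        return float(reached_year - first_year)

    return df.groupby("pseudo_student_id").apply(elapsed_years)
```

A parallel computation, starting the clock at the year a student first reaches ELP proficiency and ending at the first proficient score on the state content assessment, would illustrate the second analysis described above.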

d. Pages 17 & 18 - When looking at LEP student characteristics and performance, will ED disaggregate to see whether these characteristics are the same or different for LEP for Title I, LEP for Title II, and monitored former LEP for Title III, and former LEP for Title I?

Given the study’s focus on Title III, we plan to use the definition of LEP and monitored former LEP for Title III purposes.

e. Page 18 - In looking at the length of time to attain proficiency, the Supporting Statement discusses the length of time before redesignation in California. In some States, students who are proficient in English (as demonstrated by an ELP assessment or in other ways) are not necessarily redesignated. Will the study also look into State redesignation practices and describe the interaction between those practices and attaining language proficiency/ELP assessment scores?

California is a case where students who are proficient in English (as demonstrated by an ELP assessment or in other ways) are not necessarily redesignated. We will discuss this topic in interviews with state Title III directors and in case study district interviews and analyze it as part of the qualitative portion of the study.

f. Page 23 - Incentives


i. We do not agree that incentives are appropriate for mandatory respondents. Given that the districts are sub grantees for whom participation is mandatory, we are uncomfortable with an incentive for the district staff.

No incentives will be provided to mandatory respondents (i.e., district staff). The case study liaisons (who may be district staff) will no longer be given incentives. District staff do not participate in the school, teacher, and parent focus groups, which are the only occasions for which we are now requesting the use of incentives.

ii. For focus group participants, why does ED propose gift cards rather than cash, which is the more typical incentive for focus group participation?

If approved, we will provide cash rather than gift cards for focus group participation.

g. Page 24 - Confidentiality


i. Given that personally identifiable student level data is being collected as part of the study, SS A 10 and materials used to communicate with states should reference FERPA.

Personally identifiable student-level data are not being collected from students. See page D-11 of Supporting Statement Part B:

On that page we discuss the fact that States or districts will provide the information but will not need to provide actual unique student identifiers. We wrote: “They will be instructed to replace actual names and unique student identifiers with unique pseudo identifiers (random numbers that are consistently associated with the same single student over time). Alternatively, they may format the database such that each row in the data file includes all of the above information across all years for a single student and then strip all unique student identifiers. This formatting approach will enable us to link student characteristics (i.e., gender, ethnicity, etc.) to longitudinal student academic achievement, which is critical for our analysis.”
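
To make the pseudo-identifier approach concrete, here is a minimal sketch of the kind of replacement a state or district could perform before transferring a file. The column name and function are hypothetical illustrations, not the states' actual procedures; the key point is that the random pseudo ID stays consistent for a given student across all years in the file and that no real identifier leaves the agency.

```python
import secrets
import pandas as pd

def pseudonymize(df: pd.DataFrame, id_column: str = "student_id") -> pd.DataFrame:
    """Replace real student IDs with random pseudo IDs that remain consistent
    for the same student across all rows/years, then drop the real ID column."""
    mapping = {real_id: secrets.token_hex(8) for real_id in df[id_column].unique()}
    out = df.copy()
    out["pseudo_student_id"] = out[id_column].map(mapping)
    # The agency would retain `mapping` internally only if later file updates
    # must reuse the same pseudo IDs; it is never shared with the study team.
    return out.drop(columns=[id_column])
```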

ii. Please confirm that the contractor arrangements comport with FERPA requirements.

FERPA does not apply because personally identifiable student-level data are not being requested from students. Data that are provided will be collected from states and/or districts using unique pseudo identifiers. There is no instance in which students are being asked to sign consent forms to allow access to their student data records, or to provide any personal data.

iii. Is the Privacy Act being invoked for other aspects of the data collection or not? The phrasing of A 10 is unclear.

We are not invoking the Privacy Act because personally identifiable student-level data are not being collected.

1. If so, is there a System of Records Notice and are other Privacy Act requirements met?

Not applicable.

2. We would like to see the confidentiality language clarified, either strengthened or weakened as is appropriate to the specifics (RIMS should be helpful in this regard).

Please see revised section A10 at the end of this document labeled Comment 2.g.iii.2.

iv. Finally, please clarify exactly to whom the Privacy Act (or any confidentiality pledge) is being made. This seems clear in the SS but not in the supplemental materials, which will need to be clarified before clearance (e.g., district responses).

We have revised the confidentiality pledges in the letters accompanying all instruments. See attached at the end of this document labeled Comment 2.g.iv.

3. Supporting Statement - Part B



a. Page D-2 - Given that many of the queries seem to be "yes/no" or qualitative, please provide some examples of the types of statistics ED plans to generate comparing states where the ability to identify a statistically significant difference of 8 percentage points is relevant.

To clarify, the comparisons are between types of subgrantees (or districts), not between states. The basic statistic to be compared is the percentage of Title III subgrantees or the percentage of Title III-served ELLs. The statistical test will be a t-test with a criterion of p < .05.

Exhibit 4-5 from our analysis plan, included at the end of this document and labeled Comment 3a, provides an example of a table shell for the bivariate results on a dichotomous (yes/no) questionnaire item (Q18). The example shows improvement actions implemented as a direct response to the district’s Title III AMAO status, by a categorization of the number of LEP students in the district. Such table shells may be converted into bar charts for the final report, in which case the supporting tables will be included in an appendix to report more detailed information (e.g., standard errors).
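
As an illustration of the comparison described above, the sketch below applies an unweighted two-sample t-test to a dichotomous item for districts in the two LEP-enrollment categories. The data values are invented placeholders, and the actual analysis will use the survey weights and design-appropriate standard errors.

```python
import numpy as np
from scipy import stats

# Invented 0/1 indicators for one improvement action (e.g., "developed an improvement plan"),
# one value per responding district in each enrollment category.
small_districts = np.array([1, 0, 1, 1, 0, 1, 0, 1])      # 1-500 LEP students
large_districts = np.array([1, 1, 1, 0, 1, 1, 1, 1, 0])   # 501 or more LEP students

# Two-sample t-test on the dichotomous item, with a criterion of p < .05.
t_stat, p_value = stats.ttest_ind(small_districts, large_districts, equal_var=False)
difference = large_districts.mean() - small_districts.mean()
print(f"Difference in proportions = {difference:.2f}, p = {p_value:.3f}")
```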


b. Page D-7 - Whose IRB has/will review the protocol? Has that occurred?

The American Institutes for Research’s IRB has reviewed and approved the study and protocols.


4. Supplemental materials



a. Why is the section on "What will the survey require?" included in the focus group FAQs? This is confusing. We recommend deleting it.

All case study districts will participate in both the focus groups and the subgrantee/district surveys. The District Information Document was meant to be a single brochure that provides a comprehensive introduction to the study for each district.


b. As noted above, the confidentiality pledge will require editing in all instances.

See our response to Comment 2.g.iv above.


c. In the letter to state superintendents, the sentence "No individual assessment results will be identifiable..." is unclear. We believe you mean "no individual assessment results will be released..." Please clarify, noting to us particularly if ED or its contractors have any plans to release public use micro data.

No individual assessment results will be personally identifiable because states provide us with only pseudo identifiers for all students and schools/districts. We will edit this part of the letter to clarify this point. We do have the ability to link the individual pseudo identifiers over time and thereby produce individual student-level growth trajectories, but the individual students are not personally identifiable because we do not possess their names, social security numbers, or even their true district-generated IDs. There are no plans to release public-use microdata (i.e., student-level assessment data files) for this component of the study.


d. In the letter to districts, why point out the survey sample size? We are not aware of any literature that suggests this would boost participation. In fact, we would anticipate it might have the opposite result.

We will delete.





Comment 1c. List of Extant Data Elements

Consolidated State Performance Reports (CSPRs) (2004-05 through 2008-09)

1.2.1 Participation of Limited English Proficient (LEP) Students in Mathematics Assessment

1.2.3 Participation of Limited English Proficient (LEP) Students in Reading/Language Arts Assessment

1.6.1 Language Instruction Educational Programs

1.6.2.1 Number of ALL LEP Students in the State

1.6.2.2 Number of LEP Students Who Received Title III Language Instruction Educational Program Services

1.6.2.3 Most Commonly Spoken Languages in the State

1.6.3.1.1 ALL LEP Participation in State Annual English Language Proficiency Assessment

1.6.3.1.2 ALL LEP Student English Language Proficiency Results

1.6.3.2.1 Title III LEP Participation in English Language Proficiency

1.6.3.2.2 Title III LEP English Language Proficiency Results

1.6.3.5 Native Language Assessments

1.6.4.1 Title III Subgrantee Performance

1.6.4.2 State Accountability

1.6.4.3 Termination of Title III Language Instruction Educational Programs

1.6.5.1 Immigrant Students - # Immigrant Students Enrolled



Title III Biennial Reports (2002-04 and 2004-06)

2.1.1 Number of LEP Students

2.1.2 Number of LEP Students who Received Services

2.1.5 Unduplicated count of TIII LEP students in the state

2.3 LEP students in grades not tested for AYP

2.4.1 Does the state offer native language academic content tests?

2.5 Accommodations on State academic content assessments for LEP students

4.1 Title III subgrantee performance and state accountability

4.2 Did State meet all 3 Title III AMAOs?

5.1 Number of Immigrants Enrolled in the State

EDFacts (2006-07 school year and all of 2007-08)

File Number | Data Group | File or Data Group Name | SEA Level | LEA Level
045 - Immigrant | 519 | Immigrant Tables | X | X
046 - LEP Demographic | 123 | LEP Program Tables | X | X
096 - LEP Program Instruction | 622 | LEP Program Instruction Table | X | X
067 - Title III Teachers | 422 | Title III Teacher Table | X | X
047 - LEP Eligible | 116 | LEP Eligible Tables (LEP Tables) | X | X
049 - LEP Assessed in Native Language | 272 | LEP Assessed in Native Languages Table | X | X
050 - Title III LEP Students English Language Proficiency | 151 | (Title III) LEP Students English Language Proficiency Table | X | X
103 - AYP Status | 518 | AMAO Proficiency Attainment Status for LEP Students | X | X
103 - AYP Status | 569 | AMAO Making Progress Status for LEP Students | X | X
116 - Title III LEP Students Served | 648 | Title III LEP Students Served Tables | X | X
126 - Title III Former Students | 668 | Title III Former Students Table | X | X
137 - LEP English Language Testing | 674 | LEP English Language Testing Tables | X | X
138 - Title III LEP English Language Testing | 675 | Title III LEP English Language Testing | X | X
139 - LEP English Language Proficiency Results | 676 | LEP English Language Proficiency Results | X | X
141 - LEP Enrolled | 678 | LEP-Enrolled Tables | X | X
X/N103 | 32 | AYP Status | | X
X/N130 | 662 | Improvement Status-LEA | | X
X/N111 | 552 | Proficiency Target Status Reading/Language Arts Tables | | X
X/N109 | 554 | Proficiency Target Status Math Tables | | X
X/N110 | 553 | Participation Status Reading/Language Arts Tables | | X
X/N108 | 555 | Participation Status Math Tables | | X
X/N106 | 556 | Elementary/Middle Additional Indicator Status Tables | | X
X/N107 | 557 | High School Graduation Rate Indicator Status | | X
X/N078 | 584 | Student Performance in Reading (Language Arts) Tables | X | X
X/N075 | 583 | Student Performance in Mathematics Tables | X | X
X/N081 | 589 | Students Tested in Reading (Language Arts) Tables | X | X
X/N081 | 588 | Students Tested in Mathematics Tables | X | X
| | Associated Metadata for data groups 583 and 584 indicating which levels are counted as proficient in each state | X |



Title III Monitoring Reports

Element 2.3 Supplement not Supplant: The SEA ensures that Title III funds are used only to supplement or increase Federal, State, and local funds used for the education of participating children and not to supplant those funds

Element 3.1 English Language Proficiency (ELP) Standards: State English language proficiency standards have been developed, adopted, disseminated, and implemented

Element 3.2 ELP Assessments: ELP assessments have been administered to all limited English proficient (LEP) students in the State in grades K-12. Accountability through data collection has been implemented

Element 3.3 New English Language Proficiency Assessment: Transition to new ELP assessment or revision of the current State ELP assessment

Element 3.4 Annual Measurable Achievement Objectives (AMAOs): AMAOs have been developed and AMAO determinations have been made for Title III-served LEAs

Element 3.5 Data Collection: The State has established and implemented clear criteria for the administration, scoring, analysis, and reporting components of its ELP assessments, and has a system for monitoring and improving the ongoing quality of its assessment systems. Data system is in place to meet all Title III data requirements, including capacity to follow Title III-served students for two years after exiting; State approach to follow ELP progress and attainment over time, using cohort model

Element 4.1 State Level Activities: Using administrative funds, the State carries out one or more activities that include: Professional development; Planning, evaluation, administration and interagency coordination; promoting parental and community participation; or providing recognition to subgrantees that exceeded AMAO requirements

Element 4.2 Required Subgrantee Activities: The subgrantee is responsible for increasing the English proficiency of LEP students by providing high-quality language instructional programs and high-quality professional development to classroom teachers (including teachers in classroom settings that are not the settings of language instructional programs), principals, administrators, and other school or community-based organization personnel

Element 5.3 Teacher English Fluency: Certification of teacher fluency requirement in English and any other language used for instruction (Section 3116(c))

Element 6.1 Monitoring: The SEA conducts monitoring of its subgrantees sufficient to ensure compliance with Title III program requirements

Element 7.1 Parental Notification: Provisions for identification and placement and for not meeting the AMAOs; notification in an understandable format as required under Section 3302



Title III-related Amendments to State Consolidated Applications

State Amendment requests

ED approval letters



State Accountability Workbooks

5.4 The accountability system includes limited English proficient students.

5.5 The State has determined the minimum number of students sufficient to yield statistically reliable information for each purpose for which disaggregated data are used.



Study of State Implementation of No Child Left Behind (2006-07)

Development process of state ELP standards

First year current ELP standards were implemented

Anticipated changes to ELP standards

Development process of state ELP assessment

First year current ELP assessment(s) was/were implemented

ELP assessment - grade by grade or grade span (which spans)

Anticipated changes to ELP assessments

Native language assessments used for AYP purposes

Native language assessments - subjects

Native language assessments - languages

Native language assessments - grades

Data on accommodations for LEP students are collected and/or tracked

Application of HQT requirements to LIEP teachers

State considers ESL/ESOL to be core academic subject

Ways in which LIEP teachers demonstrate English language fluency

Ways in which LIEP teachers demonstrate fluency in languages other than English

State or district specifies requirements for teacher language fluency under Title III

Endorsements/certifications available specifically for teachers of LEP students

Which of the available endorsements/certifications are required and for whom?

General TA provided to districts for issues relating to LEP students

How state determines how much support each district receives

State has specified ELD curriculum districts must use or give curricular options for districts

Development of 2006-07 AMAOs

Data used to develop AMAOs

Name of assessment used in AMAO development

Years of assessment data used in AMAO development

Process used to validate AMAOs

State is working with external consultants to validate AMAOs

Year current AMAOs first implemented

2006-07 AMAOs different from 2002-03 AMAOs (describe)

Description of AMAO cohort

Kinds of districts for which AMAO calculations are based on 2005-06 testing

Types of districts to which AMAO data based on 2005-06 testing were reported back

State reports AMAO data to the public (describe)

Anticipated changes in AMAOs

Support for districts that didn't meet AMAOs; actions from the state

To whom does state apply actions and why?

Decision process for which actions are applied to which districts

Requirements/supports for districts that have missed AMAOs and are identified for improvement under Title I

Description of TA provided to districts that haven't met AMAOs

Recipients of TA

Focus of TA

Comment 2.g.iii.2 A.10. Revised Confidentiality Section

  1. Assurances of Confidentiality

As a research contractor, the research team is concerned with maintaining the confidentiality and security of its records. The team will ensure the confidentiality of the data to the extent possible through a variety of measures. The contractor’s project staff has extensive experience collecting information and maintaining confidentiality, security, and integrity of interview and survey data. The team has worked with the Institutional Review Board at American Institutes for Research to seek and receive approval of this study. The following confidentiality and data protection procedures will be in place:

Project team members will be educated about the confidentiality assurances given to respondents and to the sensitive nature of materials and data to be handled. Each person assigned to the study will be cautioned not to discuss confidential data.

Data from the case studies, state interviews, and subgrantee surveys will be treated as follows: respondents’ names and addresses will be disassociated from the data as they are entered into the database and will be used for data collection purposes only. As information is gathered from respondents or from sites, each will be assigned a unique identification number, which will be used for printout listings on which the data are displayed and for analysis files. The unique identification number will also be used for data linkage. Data analysts will not be aware of any individual’s identity.

We will shred all interview protocols, forms, and other hardcopy documents containing identifiable data as soon as the need for this hard copy no longer exists. We will also destroy any data tapes or disks containing sensitive data.

Participants will be informed of the purposes of the data collection and the uses that may be made of the data collected. All case study respondents will be asked to sign an informed consent form (see drafts in appendices E and F). Consent forms will be collected by site visitors and stored in secure file cabinets at the contractor’s office in Washington, DC.

We will protect the confidentiality of district survey respondents and all district- and school-level respondents who provide data for the study and will assure them of confidentiality to the extent possible. We will ensure that no district- or school-level respondent names, schools, or districts are identified in reports or findings, and if necessary, we will mask distinguishing characteristics. Responses to this data collection will be used primarily to summarize findings in an aggregate manner (e.g., across types of districts) and secondarily to provide examples of program implementation in a manner that does not associate responses with a specific individual or site. We will not provide information that associates responses or findings with a district- or school-level respondent, school, or district to anyone outside of the study team, except as required by law.

The case of state-level respondents is somewhat different. Our state-level data collections, by their very nature, focus on policy topics that are in the public domain. Moreover, it would not be difficult to identify Title III and assessment directors in each state and thus determine the identity of our state-level respondents. Having acknowledged that, we will endeavor to protect the privacy of our state-level interviewees, and as with district- and school-level respondents, we will avoid using their names in reports and attributing any quotes to specific individuals. We will primarily report on the numbers of states that engage in specific practices, thus avoiding reference to specific states.

While most of the information in the final report will be reported in aggregate form, as noted above, there may be instances where specific examples from the case study data will be utilized to illustrate “best practices”. In these instances, the identity of the case study site will be masked with a pseudonym and efforts will be made to mask distinguishing characteristics.

All electronic data will be protected using several methods. We will provide secure FTP services that allow encrypted transfer of large data files with clients. This added service prevents the need to break up large files into many smaller pieces, while providing a secure connection over the Internet. Our internal network is protected from unauthorized access using defense-in-depth best practices, which incorporate firewalls and intrusion detection and prevention systems. The network is configured so that each user has a tailored set of rights, granted by the network administrator, to files approved for access and stored on the LAN. Access to our computer systems is password protected, and network passwords must be changed on a regular basis and conform to our strong password policy. All project staff assigned to tasks involving sensitive data will be required to provide specific assurance of confidentiality and obtain any clearances that may be necessary. All staff will sign a statement attesting to the fact that they have read and understood the security plan and ED’s security directives. A copy of this statement is featured in Appendix B.

Comment 2.g.iv Revised Confidentiality Pledges

We have attached the electronic files for the following four sets of notification letters, introductory materials and consent forms, which now reflect the edits requested by OMB, especially regarding confidentiality pledges.

  • State Interview notification letters and introductory materials for Appendix C.docx

  • District Survey notification letter and introductory materials for Appendix D.docx

  • Case Study district interview notification letter introductory materials and consent form for Appendix E.docx

  • Case Study focus group introductory materials and consent form for Appendix F.docx



Comment 2.b.i. Revisions to Instruments Based on Pilot Testing

We propose some small revisions to the State Interview and Subgrantee Survey that were made after we submitted the instruments to OMB. These proposed revisions result from a further internal review for readability and consistency as well as from the pilot test data. The proposed changes are discussed below and are also shown in “track changes” in the attached files. See the attached documents labeled “App C State Interview Protocol 091109.doc” and “App D Subgrantee Survey Protocol 09 11 09.doc” for instruments that reflect the proposed revisions. We believe these small, final changes will substantially improve the instruments and the data collected. We have no final changes to propose to any of the case study instruments.

Explanations for Changes to the Title III State Interview and Subgrantee Survey Protocols

State Interview

We added items 4a, 4b, and 4c about AMAOs to the prefill document because the Consolidated State Performance Reports (CSPRs) turned out not to be a viable extant data source.

We added item 15a, about notice of AMAO performance, to the interview because we cut it from the subgrantee survey.

Subgrantee Survey

In making revisions, our general principles were as follows: (1) reduce length, (2) clarify instructions, (3) simplify wording and terminology, and (4) edit based on empirical data gained through piloting.

We have revised the contact materials based on the internal review for consistency and readability. We have also made a few small revisions based on OMB’s questions that we received on September 9th. The materials now include: (a) an initial mailed letter to sampled school districts explaining the purposes of the survey; (b) an e-mail describing who should complete the questionnaire and how to access the online version; and (c) instructions in the front of the questionnaire on how to respond to questions, how to navigate, and how to save and submit responses.

We have deleted a few items (references are to previous item numbers). We deleted item 1 because consortium status is now on the sampling frame. We deleted item 13 because respondents could not reliably respond in the pilot. We deleted item 15 because the question is now on the State Interview.

We have deleted portions of several items. In item 2, we deleted grade-specific detail to decrease response burden because this detail is not required by our analysis plan. In item 21, we deleted one column of questions because it is covered in (old) item 8. In item 23, we deleted one subitem because it was misunderstood and overlapped with other subitems. In item 31, we deleted the distinction between English and math because respondents said they were the same in almost all cases. It now reads “English language arts and/or mathematics”. We also deleted all “Other: specify” subitems, and integrated pilot test responses as subitems.

We added subitems to the following two items based on further review of pilot test results: item 11, subitem c (Services provided by other programs), and item 28, subitem f (Teacher induction programs focusing on instruction of ELLs).

We edited terminology and formatting based on further review of pilot test results. In item 4, we replaced the term “immigrant” with “not born in the U.S.” because the term had different meanings for different pilot respondents. In item 8, we replaced the term “Periodic progress or benchmarking tests” with “Progress tests (also called ‘interim,’ ‘benchmark,’ or ‘diagnostic’ tests)” and included a footnote definition. We also split item 8 into two parts to eliminate scrolling issues. In item 18, we reordered response options because all other items start with “yes.” In item 19, we added a response option because pilot respondents did not want to say “no” and often made the changes discussed in this question for reasons other than alignment. We split item 20 into two items, one on students and one on teachers. We changed the response options on item 26 to clarify the meaning of “yes” and “no” in this question. In item 27, we added a “Not applicable” response option. We have also changed items stating “Considering this year (2009-2010) and last year (2008-2009)” to “Since September 2008” to increase readability, and we have changed the item instructions to make them consistent across items.

Comment 3.a. Example of Bivariate Analysis

Exhibit 4-5
Percentage of Districts That Implemented Improvement Actions as a Direct Response to the District’s Title III AMAO Status, by Number of LEP Students in District, 2008-09 or 2009-10

Cell entries will report the percentage of districts, by number of LEP students in the district.

Improvement action taken as a direct response to the district’s Title III AMAO status | 1-500 LEP students (n= ) | 501 or more LEP students (n= ) | All districts (n= )
Development of an improvement plan | | |
New curriculum for English language development | | |
New curriculum for content area instruction of LEP students | | |
Increased time spent on English language development | | |
Increased time spent on content area instruction for LEP students | | |
Increased training for English language development teachers | | |
Increased training for content area teachers of LEP students | | |
Instructional specialist to assist English language development teachers | | |
Instructional specialist to assist content area teachers of LEP students | | |
Additional progress tests in English language | | |
Additional progress tests in content area instruction of LEP students | | |

Exhibit Reads: In districts with 500 or fewer LEP students, X percent of districts reported developing a district improvement plan in 2008-09 or 2009-10 as a direct response to the district’s Title III AMAO status.

Source: Evaluation of State and Local Implementation of Title III, Subgrantee Survey, Item 18, 2009-10


