OMB Control Number: 1875-0250





Evaluation of the Implementation of the
Rural and Low-Income Schools (RLIS) Program




Request for OMB Approval

Supporting Statement Part A



Version 1


Submitted June 2008






Submitted to:

U.S. Department of Education
Office of Planning, Evaluation and Policy Development (OPEPD)
Policy and Program Studies Service (PPSS)
400 Maryland Avenue, SW
Washington, DC 20202

Submitted by:

Berkeley Policy Associates

Contracting Officer's Representative:

Erica Lee
U.S. Department of Education
Office of Planning, Evaluation and Policy Development (OPEPD)
Policy and Program Studies Service (PPSS)
(202) 260-1463

Project Director:

Kay Magill, Ph.D.
Berkeley Policy Associates
(510) 465-7884 ext. 206

TABLE OF CONTENTS

Supporting Statement A: Study Justification

A-1. Circumstances That Make Collection of Data Necessary
A-2. Purposes and Use of the Data
A-3. Use of Improved Information Technology to Reduce Burden
A-4. Efforts to Identify and Avoid Duplication
A-5. Sensitivity to Burden on Small Entities
A-6. Consequences to Federal Program or Policies if Data Are Not Collected
A-7. Special Circumstances
A-8. Solicitation of Public Comment and Outside Consultation
A-9. Payments to Participants
A-10. Assurances of Confidentiality
A-11. Justifications for Questions of a Sensitive Nature
A-12. Estimate of Hourly Burden to Participants
A-13. Estimate of Total Annual Cost Burden to Participants or Record-Keepers
A-14. Estimate of Annualized Cost to Federal Government
A-15. Program Changes or Adjustments
A-16. Plans for Tabulation, Analysis, and Publication of Results
A-17. Display of the Expiration Date for OMB Approval
A-18. Exception to the Certification Statement



Appendices


Appendix A. RLIS District Coordinator Interview Protocol


Appendix B. RLIS State Coordinator Survey


Appendix C. RLIS District Coordinator Survey


Appendix D. RLIS District Coordinator Interview Introduction Letter/Email


Appendix E. RLIS State and District Survey Mail/Email Letter


Appendix F. Federal Register Notice


Appendix G. Confidentiality Forms


Appendix H. RLIS State Coordinator Contact Information Request Letter



SUPPORTING STATEMENT A: Study Justification



Evaluation of the Implementation of the Rural and Low-Income Schools (RLIS) Program


A1. Circumstances that Make Collection of Data Necessary


The U.S. Department of Education’s Policy and Program Studies Service (PPSS) is evaluating the implementation of the Rural and Low-Income Schools (RLIS) Program. The evaluation will be conducted by Berkeley Policy Associates (BPA) and Learning Point Associates (LPA). This collection is necessary because GPRA reporting provides insufficient data to report to Congress or to improve program implementation. In addition, because states administer this program, the program office has little direct contact with the school districts that actually receive the funds.


The Rural Education Achievement Program (REAP), authorized under Title VI, Part B of the Elementary and Secondary Education Act of 1965, as amended by the No Child Left Behind Act of 2001, supports rural school districts that, because of their small student populations, receive relatively small formula grant allocations and that may have difficulty accessing and successfully competing for other Federal funding due to resource limitations. Beginning in FY 2002, REAP funds were made available to the states with eligible school districts, under flexible rules. Rural districts that receive REAP grants from their respective states were given a great deal of leeway to choose their goals, priorities, and approaches to improving the quality of instruction and student academic achievement.


There are two REAP programs, the Small, Rural School Achievement (SRSA) program and the Rural and Low-Income Schools (RLIS) program. REAP funding is awarded to State Education Agencies (SEAs), which in turn provide funding to eligible local education agencies (LEAs). To be eligible to receive SRSA program funds, an LEA must: (1) have a total average daily attendance (ADA) of less than 600 students or serve only schools that are located in counties that have a population density of fewer than 10 persons per square mile; and (2) serve only schools that either have a National Center for Education Statistics (NCES) locale code of 7 (rural) or 8 (rural near an urban area) or are located in an area of the state defined as rural by a governmental agency of the state. To be eligible for RLIS funds: (1) an LEA must not be eligible for an SRSA grant; (2) 20 percent or more of the children ages 5 through 17 years served by the LEA must be from families with incomes below the poverty line; and (3) all of the schools served by the LEA must have a locale code designation of 6 (small town), 7, or 8.1
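To make the two eligibility tests concrete, the sketch below expresses the decision logic in code. This is an illustrative simplification, not the Department's actual determination procedure: the function names and inputs are hypothetical, and the density and state-rural-definition tests are treated as district-level flags rather than school-by-school determinations made from NCES and Census data.

```python
def srsa_eligible(ada, all_schools_in_sparse_counties, locale_codes,
                  state_defines_area_rural):
    """SRSA test: (1) small (ADA < 600) or sparse (all schools in counties
    with fewer than 10 persons per square mile), and (2) every school rural
    (NCES locale code 7 or 8) or in an area the state defines as rural."""
    small_or_sparse = ada < 600 or all_schools_in_sparse_counties
    all_rural = (all(code in (7, 8) for code in locale_codes)
                 or state_defines_area_rural)
    return small_or_sparse and all_rural

def rlis_eligible(is_srsa_eligible, child_poverty_rate, locale_codes):
    """RLIS test: (1) not SRSA-eligible, (2) 20 percent or more of children
    ages 5-17 below the poverty line, and (3) every school in locale code 6
    (small town), 7, or 8."""
    return (not is_srsa_eligible
            and child_poverty_rate >= 0.20
            and all(code in (6, 7, 8) for code in locale_codes))

# Example: a district of 850 students, 25% child poverty, all small-town or rural schools
srsa = srsa_eligible(850, False, [6, 6, 7], False)   # False: too large and not all rural
print(rlis_eligible(srsa, 0.25, [6, 6, 7]))          # True
```

Note how the two tests are mutually exclusive by construction: a district can hold at most one of the two grant types because RLIS eligibility requires SRSA ineligibility.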


This study will address the following research questions:


  1. What are the characteristics of the districts served in terms of rural location, poverty, race, etc.?

  2. What trends have emerged in Local Education Agency (LEA) achievement data? Specifically, what trends have emerged in district rankings on student achievement after receipt of RLIS funds?

  3. What progress have states made towards their goals for funds from RLIS?

  4. What are states’ priorities for districts applying for RLIS subgrants? How do states administer and monitor the program? What guidance and assistance do states provide? How do states enforce the statutory accountability provisions?

  5. What goals have districts identified for RLIS in their grant applications? What progress have districts made toward their goals? How have districts actually used RLIS funds?


The evaluation has a multi-component design that includes analysis of extant data at the state and district levels; interviews with staff from a sample of states and districts regarding RLIS goals, priorities, and uses of funds; analysis of additional documents obtained from the sampled states and districts; and an online survey of staff from all states receiving RLIS funding, as well as an online survey of a random sample of staff from districts receiving RLIS funding.


Currently, there is no mechanism for determining how individual states and districts use program funds and how these states and districts make progress toward their goals for the program. These surveys and interviews address this problem by obtaining both representative and in-depth data on districts and states.


A2. Purposes and Use of the Data


The purpose of the evaluation is to obtain information for:


  • Preparing the Biennial Report to Congress on the RLIS program (mandated by Section 6224(c) of Title VI, Part B of the Elementary and Secondary Education Act);

  • Providing information for the next Office of Management and Budget (OMB) Program Assessment Rating Tool (PART);

  • Providing context and greater depth of understanding when reporting on Government Performance and Results Act (GPRA) measures; and

  • Informing program management and improvement.


Exhibit 1 shows the information that is needed in order to answer each of the evaluation questions, the sources of the data that will be used to address the questions, and the method(s) for collecting the needed data.



Exhibit 1. Summary of Evaluation Questions, Data Sources, and Methods

Question 1. What are the characteristics of the districts served in terms of rural location, poverty, race, etc.?
  Information needed: Demographic and other key characteristics of RLIS districts
  Data source: Common Core of Data
  Method: Analysis of extant data

Question 2. What trends have emerged in LEA achievement data? What trends have emerged in district rankings on student achievement after receipt of RLIS funds?
  Information needed: Trends in student achievement and district AYP status since receipt of RLIS funds
  Data sources: EDFacts/EDEN, NLSLSAD, and RLIS biennial report (for baseline)
  Method: Analysis of extant data

Question 3. What progress have states made towards their goals for funds from RLIS?
  Information needed: State goals and priorities; progress that states have made toward achieving goals
  Data sources: Consolidated State Performance Reports and Applications; EDFacts/EDEN, NLSLSAD; interviews with state staff; online surveys with state staff
  Methods: Analysis of extant data; interviews with state staff*; online survey

Question 4. What are states’ priorities for districts applying for RLIS subgrants? How do states administer and monitor the program? What guidance and assistance do states provide? How do states enforce the statutory accountability provisions?
  Information needed: Program priorities and state guidance for district subgrant applications; state monitoring and compliance procedures for RLIS districts
  Data sources: Interviews with state and district staff; documents collected from states; online surveys with state and district staff
  Methods: Interviews with state staff*; interviews with district staff; review of documents; online survey

Question 5. What goals have districts identified for RLIS in their grant applications? What progress have districts made toward their goals? How have districts actually used RLIS funds?
  Information needed: Districts’ goals and priorities; progress that districts have made toward achieving their RLIS goals; districts’ uses of RLIS funds; districts’ views of state practices on monitoring and assistance
  Data sources: Interviews with state and district staff; documents collected from states and districts; online surveys with state and district staff
  Methods: Interviews with state staff*; interviews with district staff; review of documents; online survey

* Nine state interviews were conducted in April-May 2008 as a pretest of the district-level interviews.



To answer the first two evaluation questions, we will analyze extant achievement and demographic data in all RLIS states.


To supplement the analysis of the extant data, we will use interviews and state and district documents to provide a more detailed picture of progress toward RLIS goals in a sample of states and of RLIS-funded school districts within those states. The interviews with staff from a sample of nine states and 45 districts will provide focused data on RLIS goals, priorities, and uses of funds. Interviews and analysis of additional documents obtained from the sampled states and districts will also provide insights into what factors act as facilitators and barriers to meeting state RLIS goals. The interviews and documents will contribute to answering evaluation questions 3–5.


Additionally, we will administer an online survey to state staff in all 39 states that received RLIS funds in the 2007-08 school year. The survey of state staff will allow us to present a representative picture of the overall perceptions of progress and priorities at the state level and will contribute to answering evaluation questions 3 and 4. Finally, to fully address Question 5, we will survey a sample of 689 of the 1,247 districts that received RLIS funds in the 2007-08 school year. The survey of district staff will allow us to present a representative picture of how LEAs use program funds and their perceptions of the progress they make toward their goals for the program.
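The sampling design for the district survey is described in Supporting Statement Part B and is not reproduced here. Purely as an illustration of the scale of the draw, a simple random sample of 689 districts from a frame of 1,247 might look like the following (district identifiers are hypothetical):

```python
import random

rng = random.Random(2009)  # fixed seed so the draw is reproducible
frame = [f"LEA-{i:04d}" for i in range(1, 1248)]  # 1,247 RLIS-funded districts
sample = rng.sample(frame, 689)                   # simple random sample, without replacement

print(len(sample))   # 689
print(sample[:3])    # first three sampled district identifiers
```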


The qualitative and quantitative data to be obtained for the evaluation from the interviews and online surveys consist of:

  • Districts’ goals and priorities

  • Perceptions of progress that districts have made toward achieving their RLIS goals

  • Districts’ uses of RLIS funds

  • Districts’ views of state practices on monitoring and assistance

  • State goals and priorities

  • Perceptions of progress that states have made toward achieving their RLIS goals

  • State guidance for district subgrant applications

  • State monitoring and compliance procedures for RLIS districts



Exhibit 2 lists instruments for which we are requesting OMB approval.



Exhibit 2. Data Collection Instruments (See Appendices A-C and H)

District coordinator interview
  Respondent group: RLIS district coordinators
  Content: Districts’ goals and priorities; progress that districts have made toward achieving their RLIS goals; districts’ uses of RLIS funds; districts’ views of state practices on monitoring and assistance
  Mode of administration: Telephone
  Time needed: 60 minutes
  Timeline: January-March 2009

State coordinator letter
  Respondent group: State coordinators
  Content: Request for contact information for districts in the interview and survey samples
  Mode of administration: Email or mail
  Time needed: 15 minutes
  Timeline: January 2009

State coordinator survey
  Respondent group: RLIS state coordinators
  Content: State goals and priorities; progress that states have made toward achieving goals; program priorities and state guidance for district subgrant applications; state monitoring and compliance procedures for RLIS districts
  Mode of administration: Online
  Time needed: 20 minutes
  Timeline: March-April 2009

District coordinator survey
  Respondent group: RLIS district coordinators
  Content: Districts’ goals and priorities; progress that districts have made toward achieving their RLIS goals; districts’ uses of RLIS funds; districts’ views of state practices on monitoring and assistance
  Mode of administration: Online
  Time needed: 20 minutes
  Timeline: March-April 2009



A3. Use of Improved Information Technology to Reduce Burden


The respondents for the surveys should have ready access to technology; hence, most data collection will be conducted online. However, respondents will have the option of completing the survey through a telephone interview if they do not have access to an Internet connection or if they fail to respond via the Internet. A paper/mail-in survey will not be offered as an alternative because of the additional costs it would incur.


A4. Efforts to Identify and Avoid Duplication


There are no national research efforts underway that are using interviews and surveys of state and district RLIS administrators to determine how to manage and improve the RLIS program. Currently no forums or other methods exist that would enable us to capture information systematically about the needs and concerns of administrators in the field (at both the state and district levels). This study will not collect information currently collected in grantee performance reports.


A5. Efforts to Minimize Burden on Small Businesses or Other Entities


Small businesses and other small entities (e.g., schools) will not be responsible for responding to this data collection, nor will their assistance be needed in any response or information collection. Respondents for these surveys and interviews, as listed in Exhibit 2, are individual employees of state departments of education or the public school system.


A6. Consequences to Federal Program if the Information is Not Collected or is Collected Less Frequently


In the absence of these surveys and interviews, it would be difficult for the U.S. Department of Education to provide complete information for the Biennial Report to Congress on the RLIS program, for the next OMB PART assessment, and on GPRA measures. Data collection will occur only once, in early 2009.


A7. Special Circumstances


No special circumstances apply to this information collection.


A8. Solicitation of Public Comment and Persons Consulted Outside the Agency


No public comments were received during the 60-day comment period.


Lead researchers for the evaluation have consulted on both the content and form of data collection with in-house (BPA and LPA) experts in survey design and sampling, education policy, and school and district support services.


A9. Payments to Respondents


No payments to respondents will be offered for this survey.


A10. Assurances of Confidentiality


BPA and LPA researchers and staff follow the confidentiality and data protection requirements of IES (the Education Sciences Reform Act of 2002, Title I, Part E, Section 183), which conform to the requirements of the Privacy Act of 1974 (5 U.S.C. 552a) covering the collection, maintenance, and disclosure of information from or about identifiable individuals. We will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented only at aggregate levels in reports. Information on respondents, if linked to their institution, will not be linked to any individually identifiable information. No individually identifiable information will be maintained by the study team. All institution-level identifiable information will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required.


BPA obtains signed Affidavits of Nondisclosure and Confidentiality Agreements from all employees, subcontractors, and consultants who may have access to these data, and we can submit them to our PPSS COR.


All communication to survey recipients (invitations to participate and follow-up reminders) and the survey itself will include the following language assuring respondents of confidentiality:


“Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district/state or individual. We will not provide information that identifies you or your district/state to anyone outside the study team, except as required by law.”


Copies of the confidentiality forms to be signed by BPA staff working on the study are provided in Appendix G.


BPA implements data security policies and programs. Below is an overview of BPA’s data security policy:


Policies for Class 1 Data (Confidential data, with identifying information) are:

(1) Can never leave BPA premises.

(2) Always kept in a secure place.

(3) Only authorized persons can access and use.

(4) Must be properly disposed of or transferred.


Exhibit 3 summarizes the procedures for handling Class 1 data.


Exhibit 3. Procedures for Handling Class 1 Data

Receipt and tracking of Class 1 materials
  Electronic data:
  • Notify office manager if expecting to receive confidential data
  • Catalogue all data received
  Paper data:
  • Notify office manager if expecting to receive confidential data
  • Catalogue all data received

Can never leave BPA premises
  Electronic data:
  • Must work on BPA premises with these data (working from home or during business trips is not permitted)
  Paper data:
  • Must work on site at BPA with these data

Create separate working analysis file
  Electronic data:
  • Strip individual-identifying information for analysis files, which can then be stored in access-limited folders on BPA’s LAN
  Paper data: Not applicable

Always kept in a secure place
  Electronic data:
  • On data server or in locked cabinet in locked server room (CD or other disk media)
  • Must not be left unattended in public view (e.g., on desk or screen)
  • May not be stored on laptop
  Paper data:
  • Store in locked cabinet in a locked room
  • Must not be left in public view (on desk or in common-use areas)

Only authorized persons can access and use
  Electronic data:
  • Limit access to the data server by use of passwords
  • Give access only to the minimum number of people who absolutely need to use the data
  Paper data:
  • Key to locked cabinet to be kept securely by authorized persons
  • Give access only to the minimum number of people who absolutely need to use the data

Must be properly disposed of or transferred
  Electronic data:
  • Update catalogue whenever data are disposed of or transferred
  • Mail data in a password-protected and/or encrypted form on an unmarked diskette or CD; require recipient and delivery verification
  • If absolutely necessary to transfer via email or the Internet, create encrypted, password-protected files and transmit the password verbally (by phone); never include the password in an email
  Paper data:
  • Update catalogue whenever data are disposed of or transferred
  • When mailing, require recipient and delivery verification
  • Shred any paper with confidential data before disposing
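The transfer rules above call for encrypted files with the secret conveyed by phone. As one illustration of this kind of step (not BPA's actual tooling; file names are hypothetical), symmetric encryption with the third-party Python cryptography package:

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Generate a key; like the password in the policy above, it is conveyed
# out of band (e.g., read over the phone), never included in an email.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("class1_extract.csv", "rb") as f:      # hypothetical Class 1 file
    encrypted = cipher.encrypt(f.read())         # authenticated symmetric encryption

with open("class1_extract.csv.enc", "wb") as f:  # only this file is transmitted
    f.write(encrypted)

# Recipient side, after receiving the key out of band:
decrypted = Fernet(key).decrypt(encrypted)
```

A password-based variant would derive the key from a passphrase (e.g., with PBKDF2), which maps more directly onto the policy's "password-protected" language.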



Policies for Class 2 (Proprietary data and documents that are not Class 1) are:

(1) Only authorized persons can access and use.

(2) Must be used and stored under a responsible person’s oversight. Must not be left in public view (e.g., sitting out on a desk or open on a computer monitor).



A11. Justifications for Questions of a Sensitive Nature


The questions on the surveys and interviews do not address sensitive topics.


A12. Estimate of Hourly Burden to Participants


As indicated earlier, the survey and interview data collection will occur only once. Exhibit 4 shows that the estimated annual/total respondent burden for this data collection is 258.75 hours.


To recruit the samples for the district interviews and survey, we will need to request contact information for the sampled districts from the state coordinators. We expect this request to take no longer than 15 minutes to complete. Most state coordinators communicate regularly with their district coordinators and should have easy access to their contact information. See Appendix H for a copy of the letter and sample form we intend to use to request the contact information.


We used the state interviews as a pilot test for the district interviews. In this pilot test, conducted with nine state staff in roles similar to those of the district staff who will be interviewed, the mean time to respond was one hour.


Based on a pilot test of the online surveys with fewer than nine people in roles similar to those who will be surveyed, the mean time to respond was 12 minutes. Because some Internet connections are slow and some surveys will be completed over the phone (which takes more time), the estimate of 20 minutes represents a reasonable amount of time within which respondents should be able to complete the survey.



Exhibit 4. Respondent Hour Burden Estimate

RLIS District coordinators, District coordinator interview: 1 hour (60 minutes) per respondent × 45 respondents = 45 hours
RLIS State coordinators, Request for district contact information: 0.25 hours (15 minutes) per respondent × 35 respondents = 8.75 hours
RLIS State coordinators, State coordinator survey: 0.33 hours (20 minutes) per respondent × 35 respondents = 12 hours (rounded)
RLIS District coordinators, District coordinator survey: 0.33 hours (20 minutes) per respondent × 586 respondents = 193 hours (rounded)
TOTAL: 701 respondents; 258.75 hours


Aside from their time to participate in the surveys, interviews, and request for contact information, there are no direct costs to respondents.


The estimated annual/total cost burden for all data collection is presented in Exhibit 5 below.


Exhibit 5. Respondent Cost Burden Estimate

All rows use an hourly rate of $33.32 (see footnote 2).

RLIS District coordinators, District coordinator interview: 45 respondents; 45 hours; $1,499.40
RLIS State coordinators, Request for district contact information: 35 respondents; 8.75 hours; $291.55
RLIS State coordinators, State coordinator survey: 35 respondents; 12 hours; $399.84
RLIS District coordinators, District coordinator survey: 586 respondents; 193 hours; $6,430.76
TOTAL: 701 respondents; 258.75 hours; $8,621.55
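The row and total figures in Exhibits 4 and 5 can be checked with a few lines of arithmetic. The sketch below uses the exhibits' own conventions: 20 minutes is carried as 0.33 hours, the per-row hours for the two surveys are rounded to whole hours, and every row is costed at the $33.32 hourly rate.

```python
RATE = 33.32  # hourly rate from the May 2007 BLS wage estimates (footnote 2)

# (activity, respondents, hours per respondent, row hours as reported)
rows = [
    ("District coordinator interview",           45,  1.00, 45),
    ("Request for district contact information", 35,  0.25, 8.75),
    ("State coordinator survey",                 35,  0.33, 12),    # 11.55, rounded up
    ("District coordinator survey",              586, 0.33, 193),   # 193.38, rounded down
]

for name, n, per, hours in rows:
    print(f"{name}: {n} x {per:.2f} h = {n * per:.2f} h -> "
          f"reported {hours} h, ${hours * RATE:,.2f}")

total_hours = sum(hours for *_, hours in rows)
total_cost = sum(hours * RATE for *_, hours in rows)
print(f"TOTAL: {total_hours} hours, ${total_cost:,.2f}")  # 258.75 hours, $8,621.55
```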



A13. Estimate of Total Annual Cost Burden to Participants or Record-Keepers


There are no start-up costs for this collection.


A14. Estimate of Annualized Cost to the Federal Government


The total budget for the Evaluation of the Implementation of the Rural and Low-Income Schools (RLIS) Program, to be carried out under the contract with BPA, is $211,344, which covers all project activities, including interviewing, survey design, programming, survey implementation, data processing, analysis, and reporting. The annual cost to the Federal Government is approximately $105,672 (approximately half of the total budget will be incurred in each of the two years of the contract). Exhibit 6 provides a more detailed breakdown of the budget.

Exhibit 6. Cost and Number of Allocated Staff Hours by Study Activity

Task 1, Kickoff Meeting and Quarterly Meetings: $17,398; 130 hours
Task 2, Monthly Reports and Performance Management: $16,227; 152 hours
Task 3, Revise Evaluation and Analysis Plan: $9,409; 96 hours
Task 4, Interview Protocol and OMB Clearance Package: $10,037; 100 hours
Task 5, Data Collection:
  Telephone interviews: $5,500; 62 hours
  Online survey (including programming and phone follow-up): $14,222; 182 hours
Task 6, Analyze Data and Prepare Reports:
  Telephone interviews: $30,517; 232 hours
  Online survey (including data processing): $15,700; 120 hours
Task 7, ED Briefing and Grantee Conference Presentations: $8,124; 32 hours
Subcontract 1, Subcontractor (LPA): $70,210 (Task 1: $7,033; Task 2: $2,864; Task 3: $2,404; Task 4: $1,543; Task 5: $10,975; Task 6: $45,391)
Subcontract 2, Rural education consultant: $14,000 (Task 3: $2,000; Task 4: $2,000; Task 5: $2,000; Task 6: $7,000; Task 7: $1,000)
TOTAL: $211,344; 1,106 hours


Total BPA costs for direct labor and fringe over the course of the project are approximately $80,283. The proposed rates are based on current staff rates and 2,080 hours, or 260 days, per year. The standard workday is eight hours. Based on leave accrual policies as stated in the employee manual, BPA calculates an average combined sick/vacation leave of twenty days, plus eleven paid holidays per year. The cost of this leave time is included in the fringe benefit rate (38%). BPA pays the following fringe benefits: F.I.C.A., Workers’ Compensation, Health and Welfare Insurance, Retirement, Holiday/sick/vacation, and Federal and State taxes.


Other direct costs, each estimated per direct labor hour based on the firm’s historical usage patterns, include:

  • Telephone: $0.40
  • Supplies, subscriptions, books, and other similar direct charge materials: $0.25
  • Copier: $0.60
  • Postage/mailing: $0.40
  • Computer: $1.00


As a long-term contractor to the federal government, BPA’s indirect rates are as follows:


Labor Overhead: 49% of personnel and fringe costs


General and Administrative: 14% of all contract costs including labor overhead with the following exception: General and Administrative is charged on only the first $25,000 per year for each subcontract under $200,000, and on the first $50,000 per year for each subcontract over $200,000.
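To show how these rates compose, here is a simplified sketch using the figures stated above: 38 percent fringe, the per-hour other-direct-cost rates, 49 percent labor overhead, and 14 percent G&A with the per-subcontract base caps. All inputs are illustrative, and the output will not reproduce Exhibit 6's total, which reflects negotiated budget details not restated here.

```python
ODC_PER_HOUR = 0.40 + 0.25 + 0.60 + 0.40 + 1.00  # phone, supplies, copier, postage, computer

def loaded_cost(direct_labor, labor_hours, subcontracts, years=2):
    """Compose BPA's stated rate structure into a total; illustrative only."""
    fringe = 0.38 * direct_labor
    overhead = 0.49 * (direct_labor + fringe)   # 49% of personnel and fringe costs
    odc = labor_hours * ODC_PER_HOUR

    # G&A base: all costs including overhead, but each subcontract enters
    # only up to $25,000/year (if under $200K) or $50,000/year (if over $200K).
    ga_base = direct_labor + fringe + overhead + odc
    for sub in subcontracts:
        cap = (25_000 if sub < 200_000 else 50_000) * years
        ga_base += min(sub, cap)
    ga = 0.14 * ga_base

    return direct_labor + fringe + overhead + odc + ga + sum(subcontracts)

# Hypothetical inputs loosely echoing this project's scale:
print(f"${loaded_cost(58_000, 1_106, [70_210, 14_000]):,.0f}")
```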


A copy of our most recent negotiated rate agreement is available upon request.


A15. Program Changes or Adjustments


This is a new study and data collection.


A16. Plans for Tabulation and Reporting of Results


Exhibit 7. Survey and District Interview General Timeline

2009
  • January-March: Conduct district interviews and collect additional district data
  • March-April: Administer online surveys of states and districts
  • May-July: Begin cleaning and preparing data from interviews and surveys for analysis
  • July 17: Outline of final report due
  • August 14: 1st draft report due
  • September 11: 2nd draft report due
  • October 9: 3rd draft report due (Executive Secretary review)
  • November 6: 4th draft report due (OCO review and copy-editing)

2010
  • January 22: Final report due


Our final report will include analysis of the data obtained from the interviews, document review, and online surveys, as well as analysis of the extant data on student achievement. Our approach for reporting results from the various methods of data collection will be to analyze the data obtained through each data collection method separately, and then to use the results of these analyses to provide support for our discussion of different aspects of implementation of the RLIS program.


By utilizing data gathered from the multiple and methodologically varied components of the study, we will be able to report findings that are empirically sound and well grounded in substantive knowledge of RLIS implementation at the state and district levels. Implementation data gathered through interviews, documents, and online surveys will be essential to interpreting the findings of the quantitative analysis of the extant student data. Analysis of implementation challenges will also inform future guidelines and oversight of the RLIS program.


The separate analyses (by method of data collection) will be:


  1. The data from the interviews with the sample of 45 school districts that received RLIS funds during the 2007-08 school year will be analyzed in conjunction with background research, interview notes, and district documents to provide a better understanding of district goals, priorities, and uses of funds. To complete the focused analysis of the interview data, we will abstract and code the qualitative data from these interviews in electronic format.


  2. For the analysis of the online surveys, we will summarize the distribution of responses to each item. Descriptive statistics, such as percentages, medians, means, ranges, and standard deviations, will be used to describe the distributions. The analysis will also examine the extent to which states or districts differ from each other.


To ensure that the evaluation findings are based on valid and reliable data, we will conduct a psychometric validation of the district and state online surveys. A psychometric validation allows evaluators to create scale scores on latent traits by evaluating all of the measurement properties of the instrument for construct and content validity. These scale scores, which are made up of multiple items that fit together from a theoretical perspective, provide a quantitative measure of the frequency and intensity of an individual’s responses. Scale scores allow the measurement of larger concepts, such as the level and quality of implementation. This process will allow the evaluation team to model the relationship between programmatic features or implementation and the relevant outcomes of interest. By creating implementation scale scores (employing Rasch measurement techniques), we can gauge whether certain programmatic or attitudinal characteristics are influential in predicting positive program outcomes. (A minimal illustrative sketch of the Rasch model follows this list.)


  3. Analysis of the extant data will include: the description of RLIS-funded districts and their use of RLIS funds; assessment of how well districts and states have achieved their goals; and comparison of student achievement in RLIS-funded districts with student achievement in non-RLIS-funded districts.
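As referenced under item 2 above, the dichotomous Rasch model places persons and items on a common logit scale, so that the probability of endorsing an item depends only on the difference between a person measure and an item difficulty. A minimal sketch follows; the item difficulties are hypothetical, and operational calibration of the survey scales would use dedicated IRT software rather than hand-rolled code.

```python
import math

def rasch_p(theta, b):
    """Probability that a respondent with measure theta endorses an item
    with difficulty b: P = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_raw_score(theta, difficulties):
    """Expected number of items endorsed by a respondent at theta."""
    return sum(rasch_p(theta, b) for b in difficulties)

# Four hypothetical survey items, calibrated from easy to hard (in logits)
difficulties = [-1.0, -0.3, 0.2, 0.9]
for theta in (-1.0, 0.0, 1.0):
    print(theta, round(expected_raw_score(theta, difficulties), 2))
```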


Qualitative data gathered in state interviews and documents will be integrated with quantitative methods in several ways:


  • Focused analysis will generate hypotheses about contextual and implementation factors that may influence student performance outcomes. For example, district interviews will address:

      • Emphases across the RLIS focus areas, and how these vary across districts
      • Difficulties faced by districts in implementing the program, and district characteristics associated with these problems
      • Use of funds for specific types of activities
      • Perceptions of progress toward goals for RLIS funds


  • Findings from the online surveys will be used to refine the specifications of district and programmatic factors to be included in the models of student achievement for the analysis of the extant data. For example, the online surveys may be used to generate categories or typologies of program approaches, as well as scale scores that can be incorporated into the model as predictive variables (see the sketch following this list).


  • Findings from the interviews, online surveys, and document reviews will be used to interpret and contextualize the findings of the analysis of student performance. Qualitative and attitudinal data can be particularly useful in helping explain how specific state and district practices in implementing the program may lead to progress or shortfalls in meeting student achievement goals, and how changes in the RLIS program guidelines might impact both practices and outcomes.
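As an illustration of the modeling idea referenced above, a survey-derived scale score could enter a district-level achievement model as a predictor. The sketch below simulates hypothetical data and fits an ordinary least squares model with statsmodels; all variable names and coefficients are invented for illustration, and the study's actual model specification is not given here.

```python
import numpy as np
import statsmodels.api as sm  # third-party: pip install statsmodels

rng = np.random.default_rng(0)
n = 200                                 # hypothetical RLIS districts
impl_scale = rng.normal(0.0, 1.0, n)    # survey-derived implementation scale score
poverty = rng.uniform(0.20, 0.60, n)    # share of children ages 5-17 in poverty
# Simulated outcome: achievement rises with implementation, falls with poverty
achievement = 0.3 * impl_scale - 1.5 * poverty + rng.normal(0.0, 1.0, n)

X = sm.add_constant(np.column_stack([impl_scale, poverty]))
fit = sm.OLS(achievement, X).fit()
print(fit.params)   # the coefficient on impl_scale is the quantity of interest
```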



A17. Display of Expiration Date for OMB Approval


No request is being made for exemption from displaying the expiration date.


A18. Exceptions


We are able to certify compliance with each of the provisions. This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.

1 Office of Elementary and Secondary Education (OESE) Website, Rural Education Achievement Program (REAP), SRSA program eligibility (http://www.ed.gov/programs/reapsrsa/eligibility.html) and RLIS program eligibility (http://www.ed.gov/programs/reaprlisp/eligibility.html).

2 Based on data from the May 2007 National Occupational Employment and Wage estimates found at http://www.bls.gov.
