OMB: 1850-0847

Regional Educational Laboratory (REL) Pacific:
Random Assignment Evaluation of Principles-Based Professional Development to Improve Reading Comprehension for English Language Learners (Task 2)

OMB Supporting Statement, Part A
Contract No: ED 06 CO 0024

Prepared for
U.S. Department of Education
Office of the Deputy Secretary
Policy and Program Studies Service
400 Maryland Avenue SW
Washington, DC 20202

Prepared by
Pacific Resources for Education and Learning
900 Fort Street Mall, Suite 1300
Honolulu, Hawai'i 96813
Berkeley Policy Associates
440 Grand Ave, Suite 500
Oakland, CA 94610

June 20, 2007
(Revised December 5, 2007)

Table of Contents
Supporting Statement for the Paperwork Reduction Act

INTRODUCTION
PART A: JUSTIFICATION
1. Circumstances that Make Data Collection Necessary
2. The Study's Purposes and Uses of the Data
3. Use of Improved Technology to Reduce Burden
4. Efforts to Identify and Reduce Duplication
5. Efforts to Minimize Burden on Small Businesses
6. Consequences of Not Collecting the Information
7. Special Circumstances
8. Federal Register Comments and Outside Consultants
9. Payments to Respondents
10. Assurance of Confidentiality to Respondents
11. Justification for Questions of a Sensitive Nature
12. Estimate of Information Collection Burden
13. Estimate of Total Annual Cost Burden
14. Estimate of Annual Cost to the Federal Government
15. Change in Annual Reporting Burden
16. Plans for Tabulation and Publication
17. OMB Expiration Date
18. Exceptions to Certification Statement
References
Appendix A: Features of Previous and Current Pacific CHILD Programs
Appendix B: Teacher Survey
Appendix C: Impact Survey
Appendix D: Principal Survey
Appendix E: Teacher Focus Group Discussion Guide
Appendix F: Teacher Interview Discussion Guide
Appendix G: PREL Staff Focus Group Guide
Appendix H: Professional Development Observation Protocol
Appendix I: SIOP Classroom Observation Protocol
Appendix J: Consent Forms/Information Sheet
Appendix K: IRB Approval Letter
Appendix L: Letters of Interest
Appendix M: Sample MOU
Appendix N: Data Security Policy and Procedures

List of Exhibits

Exhibit 1: Data Collection Instruments and Proposed Implementation Dates
Exhibit 2: Yearly Burden Estimates
Exhibit 3: Overview of Study Timeline

Introduction
This submission is a request for approval of the data collection instruments that will be used to support the Random Assignment Evaluation of Principles-Based Professional Development to Improve Reading Comprehension for English Language Learners (Pacific CHILD). The Pacific Communities with High-performance in Literacy Development (Pacific CHILD) is a professional development program for 4th and 5th grade teachers of English Language Learners (ELLs) in the Pacific region, developed by Pacific Resources for Education and Learning (PREL) with funding from the Regional Educational Laboratory Pacific (REL-P). The Pacific CHILD program of professional development uses research-based instructional strategies appropriate for schools across the Pacific region. The Pacific CHILD program focuses on:

1. Using informational text to build reading comprehension skills.
2. Building the capacity of all students to use three reading comprehension strategies (vocabulary
acquisition; question generation; and text structure) to improve reading achievement.
3. Improving pedagogy with targeted classroom organization and management practices
(differentiated instruction).
4. Creating a format of instruction for 100% student engagement across the continua of reading
skills and English language proficiency (interactive learning).
5. Refining practice in and with existing reading/language arts curriculum and texts.
6. Standards-based instruction, with an emphasis on closing the achievement gap between
language minority and language majority students.
This study has two primary objectives. The first objective is (a) to determine the impact of the Pacific CHILD professional development on teachers in terms of their content knowledge, self-efficacy, and pedagogical skills, and (b) to measure the impact of the Pacific CHILD professional development on student reading achievement. The second primary objective is to examine the extent to which
schools and teachers receive the Pacific CHILD training and support as intended and the extent to which
the Pacific CHILD model is implemented as intended. This study will also serve to inform future program
improvement and replication of the Pacific CHILD program. Pacific Resources for Education and Learning
(PREL) and its subcontractor, Berkeley Policy Associates (BPA), are conducting this study for the Institute
of Education Sciences (IES) of the US Department of Education. While PREL delivers Pacific CHILD
professional development, BPA serves as an independent evaluator. All data collection activities related
to evaluation will be carried out by BPA staff.
The study adopts a cluster random assignment research design, in which the unit of random assignment
is the school. Approximately 50 elementary schools in Hawai'i, the Commonwealth of the Northern
Mariana Islands (CNMI), and American Samoa will be recruited for this study. The 50 schools selected and their participating 4th and 5th grade teachers will be randomly assigned to treatment and control
conditions. Participating teachers in the 25 program schools will begin the two-year Pacific CHILD
professional development during the 2007-2008 school year in American Samoa and CNMI and in the
2008-2009 school year in Hawai’i. Professional development services will be available to control schools
and teachers after a 2-year embargo period in an effort to make participation in the study less of an
obvious burden for control schools, which otherwise might feel that they do not benefit from the study.
Data collection activities to support the study will begin only upon receipt of OMB approval.


Supporting Statement for the Paperwork Reduction Act
PART A: JUSTIFICATION

1. Circumstances that Make Data Collection Necessary
This information collection is being conducted as one of the Task 2 Studies (Rigorous Applied Research
and Development) of the 2005-2010 Regional Education Laboratories Program. The current
authorization for the Regional Educational Laboratories program is under the Education Sciences Reform
Act of 2002, Part D, Section 174, administered by the Institute of Education Sciences’ National Center for
Education Evaluation and Regional Assistance. In addition, Title III of the No Child Left Behind Act (NCLB) of 2001 calls for increased reform and accountability in the practice of instructing English language learners
(ELLs) and all students reading below grade level in all American schools. The legislation specifically
states that it is not acceptable for these students to continue to fall behind their peers. Closing existing
achievement gaps in reading, math, science, and other areas of study is the primary goal of federal
education policy. It is well-known that most schools need help achieving this goal, particularly in the
jurisdictions (state education agencies) of the Pacific region. Although the figures vary from one
geographic entity to the next, the challenges facing ELLs are pervasive and substantial across the region.
In most Pacific communities, classes are taught in English, which is a second or third language for many
students. In Hawai‘i, where—with the exception of the Hawai'ian immersion schools—all instruction is in
English, the number of ELL children in classrooms varies across the state, with an overall average of 7
percent. Instruction is also officially in English in the U.S. territories; however, the percentage of ELLs
there is dramatically higher. ELLs constitute 78 percent of students in CNMI. In American Samoa, all
students are ELLs—the mother tongue of students, teachers, and school administrators is generally
Samoan.
PREL will implement and evaluate the Pacific Communities with High-performance In Literacy
Development (Pacific CHILD) program, a professional development program for 4th and 5th grade
teachers who teach English language learners (ELLs) across the Pacific region. There are three reasons
PREL has selected this as a topic of investigation:
(a) Various Pacific jurisdictions state in their educational plans the need to improve the quality of
teachers in the areas of content knowledge and classroom instructional practices, especially as it
relates to the reading comprehension of ELLs;
(b) Under NCLB, schools in Hawai‘i, the Commonwealth of the Northern Mariana Islands (CNMI),
and American Samoa are held accountable for achievement in reading comprehension; and
(c) This will contribute to the research on successful strategies in professional development, building
on the previous work of PREL on the earlier version of Pacific CHILD which targeted 2nd and 3rd
grade teachers (see Appendix A for features of previous and current Pacific CHILD programs). A
study of the earlier version of Pacific CHILD has shown some promising results for both teacher
and student outcomes (Chesswas et al., 2005).
The Pacific CHILD program addresses the needs of teachers of English language learners (ELLs) by
providing them with intensive professional development designed to enhance their instructional skills for developing the reading comprehension of all students in their classrooms, particularly ELLs. This professional development is a year-round, two-year program that combines intensive training sessions with regular, ongoing demonstrations and modeling in teachers' classrooms, and weekly peer support group meetings.


The intervention’s effectiveness will be assessed in approximately 50 schools in three jurisdictions in the
Pacific. Twenty-five of these schools will be randomly assigned to a treatment group, which will be eligible
to participate in Pacific CHILD for two years, and twenty-five will be assigned to a control group, which will
be excluded from Pacific CHILD for two years. It is expected that the overall study sample will have
approximately 270 teachers and approximately 6,600 students.
PREL's evaluation of Pacific CHILD proposes to collect data under its two main components: an implementation study and an outcome study.
Implementation Study
The primary purpose of the implementation study is to examine the extent to which schools and teachers
in the study sample receive the Pacific CHILD professional development as intended and the extent to
which the Pacific CHILD model is implemented with fidelity.
Key research questions regarding the implementation of Pacific CHILD include the following:
• Is the Pacific CHILD program as implemented providing all the activities, materials, and services planned?
• Are the activities delivered in the originally planned format, sequence, and timeframe?
• Are the appropriate personnel, training, supervision, and resources available at the right time and place to achieve program objectives?
• What are the characteristics of teachers participating in the program and their students?
• What is the teachers' level of exposure to program components?
• What barriers to achieving program objectives emerge?
• What is done to overcome these barriers?
• How are emerging strengths and weaknesses addressed via midcourse corrections in program design and implementation? What measures are taken to improve the program long-term?
• What is the level of satisfaction and of impact on attitudes and intentions of participants? How do these levels vary by subgroup (i.e., do teachers in different schools and/or entities report equal levels of participation? of satisfaction? How does level of satisfaction vary by degree of program exposure? by participant characteristics?)

These research questions will be addressed with data gathered from several implementation research
activities. The Teacher Survey will be used to measure teachers’ general experiences with professional
development and, for treatment teachers only, their experience with the Pacific CHILD program. Some
teacher background information will also be collected through this survey. The Principal Survey will be used to collect information on school policies and practices in professional development and on school environment factors that might contribute to the implementation of a professional development model. In addition, data will be collected through observations and focus groups related to teachers' professional development in the Pacific CHILD program in order to monitor the fidelity of the model.
Outcome Study
The primary purpose of the outcome study is to test PREL's hypotheses that (a) Pacific CHILD improves teacher quality, in terms of content knowledge, self-efficacy, and pedagogical skills, and (b) as teacher quality improves, so will the reading comprehension of students.
Key research questions regarding the impact of Pacific CHILD include the following:
• Do teachers who participate in Pacific CHILD demonstrate significant improvement in their content knowledge of reading comprehension and self-efficacy as compared to teachers who do not participate in Pacific CHILD?
• Do teachers who participate in Pacific CHILD demonstrate significant improvement in their classroom instructional skills for reading comprehension as compared to teachers who do not participate in Pacific CHILD?
• Do the students of teachers who participate in Pacific CHILD demonstrate improved academic achievement in reading comprehension, as compared to students in schools whose teachers do not participate in Pacific CHILD?

This study will measure outcomes in three different domains: (1) teacher pedagogical knowledge and
self-efficacy, (2) teacher practice and classroom environment, and (3) student achievement. The Impact
Survey will be used to assess teachers’ understanding of using appropriate pedagogy for ELLs. The
Teacher Survey (the same survey mentioned under the implementation study) will be used to measure teachers' attitudes and sense of self-efficacy. The teacher practice and classroom environment measures
will be collected with in-class observations. Student achievement will be measured using existing
standardized test score records. We will also collect available student information to be used as control
variables.
Most of the data collected will be analyzed by estimating hierarchical linear models, taking into account
the nested nature of the data (detailed analytical procedures are discussed in Section 16 of this
document). Qualitative data such as in-depth observations of program implementation and the classroom
environment will be analyzed using special software designed to identify and systematically describe
pertinent aspects of the implementation of the Pacific CHILD program to inform subsequent replication
and refinement.
Data Collection Instruments
Exhibit 1 below lists the instruments and the proposed dates for new data collection activities under the implementation and outcome studies. These instruments are included as Appendices B through I.
Exhibit 1: Data Collection Instruments and Proposed Implementation Dates

Data Collection Instrument | Proposed Fielding Dates | OMB Approval Required?
Teacher Survey (Appendix B) | Jan.-May 2008-2009 (AS & CNMI); Jan.-May 2009-2010 (Hawai'i) | Yes
Impact Survey (Appendix C) | Jan.-May 2008-2009 (AS & CNMI); Jan.-May 2009-2010 (Hawai'i) | Yes
Principal Survey (Appendix D) | Jan.-May 2007-2009 (AS & CNMI); Jan.-May 2008-2010 (Hawai'i) | Yes
Teacher Focus Group/Interview Discussion Guide (Appendix E-F) | Jan.-May 2007-2009 (AS & CNMI); Jan.-May 2008-2010 (Hawai'i) | Yes
PREL Staff Focus Group Discussion Guide (Appendix G); Professional Development Observation Guide (Appendix H); Classroom Observation Protocol (Appendix I) | To be administered annually during the treatment years | No (instruments provided for information only)

2. The Study's Purposes and Uses of the Data
The evaluation includes a study of implementation of the Pacific CHILD professional development
program and a study of the intervention’s impact on teachers and students. IES will use information from
this study to assess the impact of Pacific CHILD on student reading comprehension in the Pacific. The
lessons from this study will inform pedagogical practices (teacher quality) that improve students' reading
achievement in ELL contexts as well as mainstream classrooms. This study will not only affect future
policy decisions about curriculum and pedagogical practices, but will also be highly relevant to similar
efforts underway in the continental U.S., especially in remote, rural, and/or indigenous areas. The
implementation data will serve to inform future program improvement and replication, and will provide
documentation of the details of Pacific CHILD implementation for use by other institutions and entities that
plan to implement a similar intervention.

3. Use of Improved Technology to Reduce Burden
Wherever possible, the study team will use current information technologies to maximize the efficiency
and completeness of the information needed for the study and to minimize the burden placed on
respondents. Web-based surveys may be considered in areas where respondents have access to
technology. If web-based surveys are used, teachers and principals will be provided with a login name
and password to protect their data. The web-based application will simplify completing the surveys, thus
reducing the burden on respondents. Paper and pencil surveys will be used in areas where respondents
are not expected to have access to technology or have difficulty utilizing available technology.

4. Efforts to Identify and Reduce Duplication
This study represents the only known effort to implement a random assignment of a professional
development model to improve reading comprehension for English language learners in the Pacific
region. Random assignment is considered a preferred study approach for measuring the impact of an
intervention, as it allows researchers to make causal inferences with far more certainty than other
methods. In non-randomized studies, no matter how well a comparison group is constructed, it is not
possible to eliminate concerns about selection bias stemming from unobservable factors. By randomly
assigning the subjects into control and treatment groups, researchers can conclude that any difference in
outcome measures between the two groups is due to the intervention and not due to other factors.
PREL's previous study of Pacific CHILD did not use a random assignment design, and it did not specifically focus on reading comprehension. As the proposed research is one of the few rigorous studies
employing randomized trials, it has the potential benefit of not only examining the impacts of the Pacific
CHILD program with more certainty but also demonstrating the feasibility of conducting a rigorous study
in the Pacific context.
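
As a purely illustrative sketch of the cluster random assignment described above (this is not the study's actual assignment procedure; the function name, the fixed seed, and the per-jurisdiction school counts shown are hypothetical), schools could be shuffled and split into treatment and control groups within each jurisdiction as follows:

    import random

    def assign_schools(schools_by_jurisdiction, seed=20070620):
        """Randomly split schools into treatment and control within each jurisdiction.

        schools_by_jurisdiction: dict mapping a jurisdiction name to a list of school IDs.
        Returns a dict mapping each school ID to "treatment" or "control".
        """
        rng = random.Random(seed)          # fixed seed so the assignment is reproducible
        assignment = {}
        for jurisdiction, schools in schools_by_jurisdiction.items():
            shuffled = schools[:]          # copy so the caller's list is not modified
            rng.shuffle(shuffled)
            half = len(shuffled) // 2      # half of the schools go to treatment
            for school in shuffled[:half]:
                assignment[school] = "treatment"
            for school in shuffled[half:]:
                assignment[school] = "control"
        return assignment

    # Hypothetical school counts per jurisdiction (roughly 50 schools in total)
    sample = {
        "Hawai'i": [f"HI-{i:02d}" for i in range(1, 25)],
        "CNMI": [f"MP-{i:02d}" for i in range(1, 15)],
        "American Samoa": [f"AS-{i:02d}" for i in range(1, 13)],
    }
    groups = assign_schools(sample)
    print(sum(1 for g in groups.values() if g == "treatment"), "treatment schools")

Stratifying the split by jurisdiction, as in this sketch, keeps the treatment and control groups balanced across Hawai'i, CNMI, and American Samoa.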

5. Efforts to Minimize Burden on Small Businesses
The primary entities for this study are teachers, principals, and students (located within schools). Burden
is reduced for all respondents by requesting only the minimum information required to meet the study
objectives.
Schools (or districts) will transmit electronic files of student achievement data to the researchers. Only
existing and necessary data (e.g., standardized test scores) will be requested from these entities, thereby
reducing the burden to schools/districts.


6. Consequences of Not Collecting the Information
This research effort is aligned with the mission of the Department of Education’s Institute of Education
Sciences (IES), which is to conduct rigorous research (based on a randomized controlled trial) that
supports the solution of educational problems in the United States. Thus, if these data are not collected, the U.S. Department of Education, Congress, and other stakeholders will not have detailed information about the effects of the Pacific CHILD professional development program on improving services to ELLs. Moreover, if these data were collected less frequently, there would be insufficient documentation of how Pacific CHILD was implemented in the schools or of the intervention's impact on teachers' instructional practices.

7. Special Circumstances
None of the special circumstances, as listed in 5 Code of Federal Regulations Section 1320.5(d)(2), apply
to this study.

8. Federal Register Comments and Outside Consultants
A notice about the study will be published in the Federal Register when the final OMB package is
submitted.
The data collection instruments were developed at Berkeley Policy Associates by a team under the
direction of Dr. Yasuyo Abe and Dr. Raquel Sanchez. Input was obtained from PREL staff members, Dr.
Roger Chesswas and Dr. Margaret Ho. During the course of this study we will draw on the experience
and expertise of a technical working group (TWG). The TWG is composed of nationally renowned research methodologists and content experts from throughout the United States. TWG members include:
• Dr. Geoffrey Borman, University of Wisconsin-Madison
• Dr. Robert Boruch, University of Pennsylvania
• Dr. Daniel Brown, University of Hawai`i
• Dr. Thomas Cook, Northwestern University
• Dr. Margo Gottlieb, Illinois Resource Center
• Ms. Rosa Salas Palomo, University of Guam
• Dr. Hiro Yoshikawa, Harvard University
• Dr. Shuquiang Zhang, University of Hawai`i

9. Payments to Respondents
Respondents will not receive payments as incentives to participate in this study.

10. Assurance of Confidentiality to Respondents
PREL and BPA will follow the confidentiality and data protection requirements of IES (the Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Specifically, Section 183 requires that all collection, maintenance, use, and wide dissemination of data by the Institute of Education Sciences "conform with the requirements of section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 USC 1232g, 1232h)." These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.

We will protect the confidentiality of all information collected for the study and will use it for research purposes only. Any information obtained in connection with this study that can be identified with any teacher, school, or student will remain confidential and will be disclosed only with the participant's written permission or as required by law. Information from participating schools and respondents will be presented at aggregate levels in reports. Information on teachers and students will be linked to their school but not to any individually identifiable information.

In addition, participant surveys include the following text regarding confidentiality: "Your responses to this survey will be used only for research purposes. The results from this survey will be reported only in an aggregated format, and your name or your school will not be revealed. We will not provide information that identifies you or your school to anyone outside the study team, except as required by law."

Personal identifying information (e.g., name, address, institutional ID numbers) will be used by the study team only to link the different sources of data to each other. We will do this by creating a new ID number that is unique to the study. All individuals will be identified by this study ID number. Once the study ID number is created, the individually identifiable information will be de-linked from the data. We will keep a separate list that links the study ID to the individual in a secure location, according to our strict data-protection protocols. The privacy of the information collected will be protected by keeping all paper data in locked files (see Appendix N for data security policy and procedures). All computer records will be kept in password-protected, secure storage under the direct control of the research team. All personal identifiers will be destroyed when they are no longer required. We will obtain signed affidavits of nondisclosure from all employees, subcontractors, and consultants who have access to data containing individual-identifying information. Participation in the program and in the research study is completely voluntary. Volunteers may withdraw at any time and without consequences of any kind. Informed consent will be obtained from teachers who will participate in the study. A copy of the consent forms is included in Appendix J.
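
To make the de-linking step above concrete, here is a minimal, illustrative Python sketch (not the study's actual tooling; the field name teacher_name and the input file name are hypothetical). Each individual identifier is mapped once to a randomly generated study ID, the identifier is removed from the analysis records, and the crosswalk is kept separately under the data-security procedures described in Appendix N.

    import csv
    import uuid

    def delink(records, pii_field="teacher_name"):
        """Replace a personally identifying field with a study-specific ID.

        Returns (deidentified_records, crosswalk); the crosswalk maps each study ID
        back to the original identifier and would be stored in a secure location,
        separately from the de-identified analysis file.
        """
        id_for = {}            # original identifier -> study ID (one ID per individual)
        crosswalk = {}         # study ID -> original identifier (kept separately)
        deidentified = []
        for record in records:
            identifier = record[pii_field]
            if identifier not in id_for:
                study_id = uuid.uuid4().hex[:12]   # random ID unique to this study
                id_for[identifier] = study_id
                crosswalk[study_id] = identifier
            clean = {k: v for k, v in record.items() if k != pii_field}
            clean["study_id"] = id_for[identifier]
            deidentified.append(clean)
        return deidentified, crosswalk

    # Hypothetical usage: read a raw survey file, keep only de-identified rows for analysis.
    with open("teacher_survey_raw.csv", newline="") as f:
        rows, id_map = delink(list(csv.DictReader(f)))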

Information Sheets and Consent Forms
PREL will distribute information sheets and consent forms to each teacher during the recruitment
process. Signed consent forms will be submitted to BPA. The information sheets, consent forms, and
IRB approval are included in Appendix J and K.
Teachers
Each participating teacher will be provided with a consent form during recruitment. The consent form and
information sheet will address all aspects of the study, including random assignment of the treatment
group, confidentiality, participation in the focus groups and/or interviews, surveys, and classroom
observations. The consent forms and information sheets clearly state that participation in the data
collection activities is voluntary. Teachers will be asked to sign the consent form once during the study
period. BPA researchers will also provide each teacher with a separate information sheet for each data
collection activity, detailing the specific procedures for that activity (see Appendix J).
Students
We will seek passive parent consent for accessing student records. The study will not directly contact
students or collect any information directly from students or their parents. We will utilize the existing data
maintained by the schools or districts. Passive parental consent will be sought for all students enrolled in
the participating schools as 4th or 5th graders during the study period.


11. Justification for Questions of a Sensitive Nature
No questions of a sensitive nature will be included in the teacher surveys or in the focus groups.

12. Estimate of Information Collection Burden
Estimates of the frequency and burden hours for each data collection activity are provided in Exhibit 2
below. Note that the number of years for data collection activities is three. During the first year of the
study, the data collection mainly takes place in American Samoa and CNMI. During the second year, it
will take place in all three jurisdictions. During the third year, it will take place only in Hawai'i. Exhibit 2
shows estimated burden hours for which OMB approval is sought for each year.
In addition to the data collection activities listed in Exhibit 2, we plan to conduct focus group interviews
with PREL staff, collect existing standardized student achievement test results, and observe training
sessions, teachers' classroom practices, coaching sessions, and teacher peer support activities. The
participation in focus groups by PREL staff is considered part of their implementation task and does
not create any additional burden. The collection of existing test data will not cause any additional burden
on students or teachers. Similarly, observations of persons during the normal course of their activities will
not cause burden to the respondents. Consequently, these additional activities planned for the proposed
evaluation are not presented in Exhibit 2.
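
Each row of Exhibit 2 is computed the same way: expected respondents are the sample size multiplied by the expected response rate, burden hours are respondents multiplied by the time per response, and the cost burden is burden hours multiplied by the hourly rate. A minimal Python illustration (the function name is ours; the values are the Year 1 Teacher Survey figures from Exhibit 2):

    def burden(sample_size, response_rate, hours_per_response, hourly_rate):
        """Expected respondents, total burden hours, and cost burden for one instrument."""
        respondents = sample_size * response_rate
        total_hours = respondents * hours_per_response
        total_cost = total_hours * hourly_rate
        return respondents, total_hours, total_cost

    # Year 1 Teacher Survey: 132 sampled, 80% response, 0.5 hours each, $20/hour
    # -> (105.6, 52.8, 1056.0), i.e., about 106 respondents, 52.8 hours, $1,056
    print(burden(132, 0.80, 0.5, 20))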

Exhibit 2: Yearly Burden Estimates

Number of Respondents for the Entire Study Period

Respondent Type | Sample Size | Number of Respondents | Frequency of Data Collection per Respondent | Instruments Administered
Teachers | 256 (124 in Hawai'i, 132 in AS/CNMI) | 205 (99 in Hawai'i, 106 in AS/CNMI, at an 80% response rate) | 3 per year | Teacher Survey; Impact Survey; Teacher Focus Group/Interview
Principals | 50 (24 in Hawai'i, 26 in AS/CNMI) | 40 (19 in Hawai'i, 21 in AS/CNMI) | 1 per year | Principal Survey
State or SEA administration | 3 entities | 3 entities | 1 per year | Administrative data transfer
TOTAL | 306 persons (148 in Hawai'i, 158 in AS/CNMI) and 3 institutions | 245 persons (118 in Hawai'i, 127 in AS/CNMI) and 3 institutions | |

Exhibit 2: Yearly Burden Estimates (Continued)

Burden Estimates: Study Year 1
Number of respondents per year: 106 teachers (80% of the 132 sampled), 21 principals (80% of the 26 sampled), and 2 institutions.

Task | Sample Size per Task | Expected Response Rate | Frequency | Total Responses | Time per Response (hours) | Total Hours | Hourly Rate* | Total Cost Burden
Teacher Survey | 132 | 80% | 1 per year | 106 | 0.50 | 52.8 | $20 | $1,056
Impact Survey | 132 | 80% | 1 per year | 106 | 0.75 | 79.2 | $20 | $1,584
Principal Survey | 26 | 80% | 1 per year | 21 | 0.33 | 6.9 | $40 | $277
Teacher Focus Group/Interview | 66 | 80% | 1 per year | 53 | 1.50 | 79.2 | $20 | $1,584
Administrative data transfer | 2 | 100% | 1 per year | 2 | 4.00 | 8.0 | $50 | $400
TOTAL (127 persons and 2 institutions) | | | | 287 | | 226.1 | | $4,901

Burden Estimates: Study Year 2
Number of respondents per year: 205 teachers (80% of the 256 sampled), 40 principals (80% of the 50 sampled), and 3 institutions.

Task | Sample Size per Task | Expected Response Rate | Frequency | Total Responses | Time per Response (hours) | Total Hours | Hourly Rate* | Total Cost Burden
Teacher Survey | 256 | 80% | 1 per year | 205 | 0.50 | 102.4 | $20 | $2,048
Impact Survey | 256 | 80% | 1 per year | 205 | 0.75 | 153.6 | $20 | $3,072
Principal Survey | 50 | 80% | 1 per year | 40 | 0.33 | 13.3 | $40 | $533
Teacher Focus Group/Interview | 128 | 80% | 1 per year | 102 | 1.50 | 153.6 | $20 | $3,072
Administrative data transfer | 3 | 100% | 1 per year | 3 | 4.00 | 12.0 | $50 | $600
TOTAL (245 persons and 3 institutions) | | | | 555 | | 434.9 | | $9,325

Burden Estimates: Study Year 3
Number of respondents per year: 99 teachers (80% of the 124 sampled), 19 principals (80% of the 24 sampled), and 3 institutions.

Task | Sample Size per Task | Expected Response Rate | Frequency | Total Responses | Time per Response (hours) | Total Hours | Hourly Rate* | Total Cost Burden
Teacher Survey | 124 | 80% | 1 per year | 99 | 0.50 | 49.6 | $20 | $992
Impact Survey | 124 | 80% | 1 per year | 99 | 0.75 | 74.4 | $20 | $1,488
Principal Survey | 24 | 80% | 1 per year | 19 | 0.33 | 6.4 | $40 | $256
Teacher Focus Group/Interview | 62 | 80% | 1 per year | 50 | 1.50 | 74.4 | $20 | $1,488
Administrative data transfer | 3 | 100% | 1 per year | 3 | 4.00 | 12.0 | $50 | $600
TOTAL (118 persons and 3 institutions) | | | | 270 | | 216.8 | | $4,824

Exhibit 2: Yearly Burden Estimates (Continued)

Burden Estimates: 2010-2011

Task | Sample Size per Task | Expected Response Rate | Frequency | Total Responses | Time per Response (hours) | Total Hours | Hourly Rate* | Total Cost Burden
Administrative data transfer | 1 | 100% | 1 per year | 1 | 4.00 | 4.0 | $50 | $200
TOTAL (1 institution) | | | | 1 | | 4.0 | | $200

13. Estimate of Total Annual Cost Burden
There are no direct start-up costs to respondents other than their time to participate in the study, as
estimated above. Estimations of the value of participation time for each task are presented in Exhibit 2
above.

14. Estimate of Annual Cost to the Federal Government
The total cost to the federal government for the study, including the implementation of the intervention
program, is expected to total approximately $7.7 million over 5 years. The cost for the proposed random
assignment study (excluding the implementation of the intervention) in Year 1 of the contract (March
2006-April 2007) was about $560,000, which included costs for the redesign of the study, development of
data collection instruments, background data collection, site selection, training of local contractors, and IRB and OMB package preparation. The cost for the study in Year 2 (April 2007-March 2008) is estimated to be about $488,000, which includes costs related to recruitment, baseline data collection, training of site visitors and observers, and the first-round implementation study data collection (surveys and site visits) in
American Samoa and CNMI. The cost for the study in Year 3 (April 2008-March 2009) is estimated to be
about $918,000, which includes costs related to baseline data collection in Hawai'i, the first-round
implementation in Hawai'i, the first-round follow-up data collection in American Samoa and CNMI, the
second-round implementation data collection in American Samoa and CNMI, and training of site visitors
and observers. The cost for the study in Year 4 (April 2009-March 2010) is estimated to be about $1,025,000, which includes costs related to the first-round follow-up data collection in Hawai'i, the second-round implementation data collection in Hawai'i, the second-round follow-up data collection in American
Samoa and CNMI, the student records collection, and the processing and analysis of the collected data.
The cost for the study in Year 5 (April 2010-March 2011) is estimated to be about $994,000, which includes costs of the second-round follow-up data collection in Hawai'i, the student data collection, data analysis, writing, and preparation of reports. (The cost will be re-estimated each year depending on the progress of the project. The cost figures presented here are only preliminary estimates and are expected to be adjusted, with a total target budget of about $3.6 million for the study over 5 years.)


15. Change in Annual Reporting Burden
This request is for a new information collection, with a reporting burden of 226.1 hours for the 2007-2008 study year. (As indicated in Exhibit 2, the estimated annual reporting burden for the proposed study in subsequent years is 434.9 hours in 2008-2009, 216.8 hours in 2009-2010, and 4 hours in 2010-2011.)

16. Plans for Tabulation and Publication
Plans for Analysis and Estimation Procedures
The study will produce four types of analysis which are intended to describe: (a) the study sample and
study conditions, (b) the programs and their implementation, (c) the program effects, and (d) the
relationship between program implementation and program outcomes. Key features of these analyses are
summarized as follows.
Description of Context and Background
Even more than in education research conducted in the continental U.S., the correct interpretation of
research findings from the study will depend on a good understanding of the study’s political and cultural
context, the varied background characteristics of participating teachers and students, and the sometimes
extraordinary conditions under which professional development services will be delivered. To provide
such understanding, the study design includes extensive baseline and background data collection even
before random assignment of schools takes place. This data will be carefully analyzed to provide a
comprehensive description of the study’s context and to identify variation in study conditions and sample
composition that can subsequently be used to analyze variation in program effects across sites and
across subgroups of students or teachers. These baseline analyses will be both quantitative and
qualitative in nature, combining statistical data collected from teachers, schools, and local education
agencies (LEAs) with data garnered from qualitative observations and document review.
Description of Program Implementation
BPA will collect detailed qualitative and quantitative data on the implementation of the program. This data
will be used to monitor the fidelity of program implementation, and will also be analyzed to provide a
comprehensive description of the treatment as it was implemented on the ground. Such analyses serve
two primary purposes: (a) to convince the study audience that any program effects reflect a fair test of the
program as conceptualized, and (b) to provide a detailed program description that others can use to
replicate the program, both within and outside the Pacific region. Questions that will be addressed in the
implementation analysis include: How intensive were the services provided to schools and teachers? How
well did teachers take to the opportunities they were given? And how much does implementing a program like this cost per school, per teacher, or per student? The data collected from observations of the intervention program activities, focus groups with teachers and reading specialists, and surveys of teachers and school administrators will provide the main input for the program implementation and fidelity
monitoring analyses.
Description of Program Impacts
The description of program impacts is the central objective of this random assignment study design. The
key outcome variables examined in the impact analysis are: teacher knowledge and self-efficacy,
classroom teaching skills, and student reading comprehension. All three outcomes will be tracked over
time to ascertain the impact of the professional development as implemented.


The random assignment study design assures that post-intervention differences between outcomes for
teachers and students in the program and control groups are unbiased estimates of the program effects.
However, there are significant benefits to using baseline covariates in the impact analysis, mostly in terms
of increased statistical power. Thus, we plan to use multiple regression models to analyze the outcome
data for this study. These models will control for student, school, and teacher background characteristics,
all collected prior to random assignment.
For covariates at the student level, PREL plans to use mostly school-level or grade-level covariates,
because individual-level student background data is not always available and may lead to observations
being dropped from the analysis due to missing data. For the purposes of increasing statistical power, it is
most important to control for student outcomes at the aggregate (school) level. School-level covariates
minimize random school-to-school variation in background characteristics between the program and
control group schools. To create these aggregate covariates, PREL will assess reading test outcomes
measured prior to random assignment. At the teacher level, PREL plans to control for teacher education
level and experience, as well as basic demographic characteristics, such as age, gender, and ethnicity. A
somewhat simplified format for a student-level impact regression model would appear as follows:

Yijk = β0 + β1Pk + Σ βzZi + Σ βxXj + Σ βsSk + γk + δj + εi     (1)

In this model, Yijk represents outcome Y (a standardized reading comprehension test score, for example),
measured for student i with teacher j in school k. Pk is a program variable, which is measured at the
school level and has a value of 1 for program schools and 0 for control schools. β1 is the program effect
associated with this variable. Zi, Xj, and Sk are three vectors of control variables, for students i, teachers j,
and schools k, respectively. Each of these vectors is accompanied by a series of regression coefficients
(βz, βx, and βs). Separate error terms for schools, teachers, and students are represented by γk, δj, and εi,
respectively. Although Equation (1) as written appears to represent a fixed effects regression model,
PREL does not plan to estimate it that way, because doing so would not be appropriate given the
hierarchical nature of the data. Instead, the models will be estimated as a series of three nested
hierarchical models (at the school, teacher, and student levels), in which the unexplained error at one
level becomes the outcome to be explained at the next level. After estimating these regression models,
PREL will use the estimated coefficients to calculate regression-adjusted mean outcomes for program
and control schools. The regression-adjusted means will be presented in tables and figures so that
readers do not have to interpret regression coefficients to learn about the impacts of the program.
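
To illustrate one common way such a nested model can be estimated in practice, the sketch below fits a mixed-effects model with a school-level random intercept and a teacher-within-school variance component using the statsmodels package in Python. This is an assumption-laden illustration rather than the estimation routine PREL will necessarily use: the column names (score, program, school, teacher) and the input file are hypothetical, and the covariate vectors Zi, Xj, and Sk described above are omitted for brevity.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical analysis file: one row per student, with the school-level
    # treatment indicator (program = 1 for program schools, 0 for control schools)
    # and identifiers for the school and teacher clusters.
    data = pd.read_csv("student_outcomes.csv")

    model = smf.mixedlm(
        "score ~ program",                         # student/teacher/school covariates would be added here
        data,
        groups=data["school"],                     # school-level random effect (gamma_k)
        re_formula="1",                            # random intercept for each school
        vc_formula={"teacher": "0 + C(teacher)"},  # teacher-within-school component (delta_j)
    )
    result = model.fit()
    print(result.summary())

    # Regression-adjusted means can then be computed from the fitted coefficients,
    # e.g., the adjusted control mean from the intercept and the adjusted program
    # mean from the intercept plus the program coefficient, holding covariates fixed.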

Reporting Plan
PREL will report findings based on the key study questions in technical and nontechnical reports. PREL
will prepare a technical report for U.S. ED/Institute of Education Sciences (IES) peer review and for
possible publication in peer reviewed journals, and make revisions based on the feedback. PREL will then
translate the technical report into a user-friendly, nontechnical version for dissemination across the region
to policymakers and educational practitioners, and to the state and federal resource centers. After the
approval of the technical report by the U.S. ED, PREL will submit the draft nontechnical report to the U.S.
ED for review and comment. Based on the U.S. ED’s comments on the draft nontechnical report, PREL
will submit a final non-technical report to the U.S. ED. Both reports will include a structured abstract and a
nontechnical stand-alone executive summary. PREL will provide the U.S. ED with an electronic copy of
the data collected for public use, along with an electronic codebook with information about the data file
structures, fields, and variable labels in each file. The reports and data to be shared with the U.S. ED will
be stripped of all names and information that could identify school, teacher or student. PREL will review
the risk of deductive disclosure and will exclude any information that would allow the identification of
school or individual. If exclusion of certain variables to protect the identify of the school or individual leads
to the loss of key information, we will provide appropriately aggregated data with respect to those
variables.


Study Timeline
The study's timetable is as follows:

Exhibit 3: Overview of Planned Study Timeline

Finalize the Revised Task 2 Research Design: January 2007 (Completed)

Instruments for Implementation Data Collection
  Draft: January-March 2007 (Completed)
  Pilot: April 2007 (Completed)
  Revised Instruments: May 2007 (Completed)

Instruments for Outcomes Data Collection
  Draft: January-March 2007 (Completed)
  Pilot: April 2007 (Completed)
  Revised Instruments: May 2007 (Completed)

OMB Approval
  Submission to ED (for 60-day posting): May 21, 2007 (Completed)
  Publish 60-day Federal Register Notice: June 28, 2007 (Completed)
  Submission to OMB (for 30-day posting): September 4, 2007 (Completed)
  Publish 30-day Federal Register Notice: September 15, 2007
  Approval: October 30, 2007

IRB Approval
  Conditional Approval: April 2007 (Completed)
  Full Approval: June 2007

Technical Working Group Meetings: August 2007 (Completed), 2008, 2009, & 2010

Implementation of the Intervention (treatment group): 2007/08-2008/09 (AS and CNMI); 2008/09-2009/10 (Hawai'i)

Site Selection and Random Assignment
  Sampling: 2007/08 (Completed)
  Recruitment: 2007/08
  Random Assignment: 2007/08

Data Collection
  Baseline Data Collection*: 2007-2008
  Implementation Study Data Collection: 2007/08-2008/09 (AS and CNMI); 2008/09-2009/10 (Hawai'i)
  Teacher Outcomes Data Collection: March-May 2008; March-May 2009
  Collection of Existing Student Test Data: As they become available

Reporting
  Draft Technical Report: July 2010
  Final Technical Report: December 2010
  Draft Non-Technical Report: January 2011
  Final Non-Technical Report: March 2011

Note: *Baseline data collection does not include any instruments that required OMB approval.


17. OMB Expiration Date
All data collection instruments will include the OMB expiration date.

18. Exceptions to Certification Statement
No exceptions are requested.

References:
Chesswas, R., Keir, S., Leung, E., & Terada, W. (2005). Evaluation of the outcomes and impact of the
Pacific CHILD professional development model. Honolulu, HI: Pacific Resources for Education and
Learning.
U.S. Bureau of Labor Statistics. (2006). May 2006 state occupational employment and wage estimates: Hawaii, education, training, and library occupations. Occupational Employment Statistics, U.S. Department of Labor. Retrieved June 12, 2007, from http://www.bls.gov/oes/current/oes_HI.htm#b25-0000
