National Implementation Study of Student Support and Academic Enrichment Grants (Title IV, Part A)

OMB: 1850-0968

Supporting Statement for

OMB Clearance Request


Part B


National Implementation Study of Student Support and Academic Enrichment Grants (Title IV, Part A)







December 2021










Table of Exhibits

Exhibit 1. Sampling Assumptions
Exhibit 2. Expected Sample Sizes and Estimated Precision for Full Sample and Subsample
Exhibit 3. Expected Minimum Detectable Differences for Subgroup Comparisons


Section B: Data Collection Procedures and Statistical Methods

Study Overview

The U.S. Department of Education’s (ED) Institute of Education Sciences (IES) requests clearance for data collection activities to inform the Student Support and Academic Enrichment grant program (Title IV, Part A of the Elementary and Secondary Education Act of 1965 (ESEA), as amended by the Every Student Succeeds Act (ESSA)).

Title IV-A resulted from a consolidation of several programs as part of a congressional effort to allow more state and local decision-making about the use of funds. Title IV-A encompasses three broad program priorities that are intended to improve students’ academic achievement by increasing districts’ capacity to (1) provide all students with a well-rounded education, (2) ensure that the school environment is conducive to learning, and (3) enhance and personalize learning through technology. The new law also requires districts to consult with stakeholders, to distribute Title IV-A funds to high-need schools, and, in certain instances, to conduct comprehensive needs assessments; it further encourages districts to use evidence from research to select the strategies they fund. This evaluation will develop a national picture of how states and districts are implementing this new program, particularly the ways in which it supports school systems as they seek to recover from the coronavirus pandemic during the 2021–2022 school year.

B.1. Respondent Universe and Sampling Methods

State Surveys. The study will administer an online survey of 52 state Title IV‑A coordinators, including representatives from Washington, DC, and Puerto Rico.[1] Using the response rates reported for the surveys conducted in Troppe et al., 2017, Implementation of Title I and Title II-A Program Initiatives: Results From 2013–14,[2] as a guide, we are assuming a 100 percent response rate for the state survey.

District Surveys. The study will administer an online survey of district Title IV-A coordinators in 2022, with a possibility of a follow-up survey in 2024. The survey will be administered to a nationally representative sample of school districts in the United States that receive Title IV-A funds. Using the Troppe et al., 2017 study as a guide, we are conservatively assuming an 85 percent response rate for the district survey. The listing of districts that received Title IV-A funds in FY 2020[3] and the allocation levels received by each district were obtained from the Study of District and School Uses of Federal Education Funds. As part of a coordinated effort across several studies, Westat (under contract #ED-IES-11-C-0063 for the study of “Implementation of Title I and II-A Program Initiatives”) drew a stratified random sample of districts from the sampling frame. Sample weights will be created based on districts’ probabilities of selection and will subsequently be adjusted to account for survey non-response, such that the weighted estimates of population statistics (e.g., means, totals) from the sample of districts will be representative of the national population of districts that receive Title IV-A funds.

Information about the universe of districts used to create the sampling frame was gathered from the following data sources:

  • School year (SY) 2019-20 (FY 2020) Common Core of Data (CCD)

  • CCD LEA Data Group 029 (LEA Directory)

  • School year (SY) 2020-21 (FY 2020) Title IV-A State Allocation Amounts

  • Study of District and School Uses of Federal Education Funds

To be included in the sampling frame, a district must:

  • Serve students in any of the grades from 1 to 12

  • Be open (i.e., must include one or more operational schools)

  • Have student counts that are greater than zero (based on enrollment in operational schools)

  • Be located in one of the 50 states, the District of Columbia, or Puerto Rico

  • Be a regular public school district or a charter district

  • Be listed in the Study of District and School Uses of Federal Education Funds as having received Title IV-A funds in FY 2020

The sampling strategy was created to balance four key objectives of the survey:

  1. to obtain a sample that would produce precise estimates of nationally representative population means and percentages of districts that received a Title IV-A allocation;

  2. to obtain a sample that would produce precise estimates of nationally representative population means and percentages from the subsample of districts that did not transfer all of their Title IV-A funds to other programs;

  3. to make comparisons between districts with Title IV-A allocation amounts at or above $30,000 and those with amounts below $30,000 (since the statute has different requirements for districts at or above this funding level);

  4. to have a wide distribution across district characteristics including urbanicity, poverty, census region, and district size.

The sampling frame included all districts that received Title IV-A funds. A stratified random sample of 1,135 districts was drawn from the sampling frame.[4] We assume that we will obtain completed surveys from 85 percent of the sample, or 965 districts.

The sampling strata were based on Title IV-A allocation amounts. Specifically, districts were sampled from the following two strata:

  1. Title IV, Part A allocation amount less than $30,000

  2. Title IV, Part A allocation amount greater than or equal to $30,000



In order to maximize the power to detect differences between districts with allocation amounts above and below $30,000, the sample allocations were selected such that, after accounting for non-response and for the assumed proportions of districts that will not transfer all of their Title IV-A funds to other programs, about half of the remaining sample will be in each of the two Title IV-A allocation amount strata (Exhibit 1).

Exhibit 1. Sampling Assumptions

| Title IV-A Stratum | Number of Districts in Sampling Frame | Percent of Districts in Sampling Frame | Number of Districts in Sample | Expected Response Rate | Expected Number with Complete Surveys | Expected Proportion Not Transferring All Funds | Expected Number of Districts Not Transferring All Funds | Design Effect |
|---|---|---|---|---|---|---|---|---|
| Low | 11,033 | 69.1% | 682 | 85% | 580 | 65% | 377 | 0.966 |
| High | 4,930 | 30.9% | 453 | 85% | 385 | 94% | 362 | 0.927 |
| TOTAL | 15,963 | 100.0% | 1,135 | 85% | 965 | | 739 | 1.041 |
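
To make the weighting approach concrete, the sketch below computes illustrative base weights (the stratum frame count divided by the stratum sample size) and a simple within-stratum nonresponse adjustment using the Exhibit 1 counts. This is only an illustration of the general approach described above; the study's actual nonresponse adjustment may condition on additional district characteristics.

```python
# Illustrative weight calculations from the Exhibit 1 counts.
# The study's production weighting may use additional adjustment variables.

strata = {
    # stratum label: (frame count N_h, sampled n_h, expected completes r_h)
    "Low  (allocation < $30,000)":  (11_033, 682, 580),
    "High (allocation >= $30,000)": (4_930, 453, 385),
}

for label, (N_h, n_h, r_h) in strata.items():
    base_weight = N_h / n_h   # inverse of the within-stratum selection probability
    adj_weight = N_h / r_h    # base weight divided by the expected stratum response rate
    print(f"{label}: base weight = {base_weight:.2f}, "
          f"nonresponse-adjusted weight = {adj_weight:.2f}")
```

With these weights, the expected completed surveys in each stratum weight back up to the stratum frame totals (for example, 580 x 19.02 is approximately 11,033), which is the property the nonresponse adjustment is intended to preserve.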



B.2. Procedures for the Collection of Information

B.2.1. Statistical methodology for stratification and sample selection.


The sampling methodology is described under item B.1 above.

Estimation procedure

Analysis weights will be constructed to account for the stratified sampling design, which produces different probabilities of selection across strata, and to account for survey non-response.

Given the sampling design and the variation in sampling weights that will result from the design and from adjusting the weights for non-response, analyses of the weighted data will require statistical analysis software with survey procedures, such as SAS or Stata, that can estimate standard errors accurately. Both software packages are designed to produce appropriate design-based parameter estimates, standard errors, confidence intervals, and design effects. The use of weights and an appropriate variance estimation method, such as linearization or a replication method, will account for the varying sampling weights and the stratification in the design and will produce accurate standard errors. Standard errors that reflect the design are necessary to indicate the precision of the national estimates and to support statistical tests.
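
As a rough illustration of the design-based estimation described above (not the study's production code, which is expected to rely on SAS or Stata survey procedures), the sketch below computes a weighted mean and a Taylor-linearization standard error for a stratified sample with unequal weights; the data and weight values are hypothetical.

```python
import numpy as np

def weighted_mean_and_se(y, w, stratum):
    """Weighted mean and Taylor-linearization standard error for a stratified
    design, using the with-replacement approximation (no finite population
    correction)."""
    y, w, stratum = map(np.asarray, (y, w, stratum))
    total_w = w.sum()
    ybar = np.sum(w * y) / total_w                 # weighted (ratio) mean
    z = w * (y - ybar) / total_w                   # linearized values
    var = 0.0
    for h in np.unique(stratum):
        z_h = z[stratum == h]
        n_h = len(z_h)
        var += n_h / (n_h - 1) * np.sum((z_h - z_h.mean()) ** 2)
    return ybar, np.sqrt(var)

# Hypothetical data sized like the expected analytic sample (580 low, 385 high).
rng = np.random.default_rng(0)
stratum = np.repeat(["low", "high"], [580, 385])
weights = np.where(stratum == "low", 19.0, 12.8)            # illustrative adjusted weights
y = rng.binomial(1, np.where(stratum == "low", 0.4, 0.6))   # a yes/no survey item
mean, se = weighted_mean_and_se(y, weights, stratum)
print(f"weighted proportion = {mean:.3f}, linearized SE = {se:.4f}")
```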

Degree of accuracy needed for the purpose described in the justification

Two types of analyses are planned. The first type of analysis will involve estimating descriptive statistics with confidence intervals for the full sample of districts. Descriptive statistics will be estimates of either population means for continuous measures or population percentages for binary or categorical measures. Precision is therefore addressed mainly in terms of 95 percent confidence interval half-widths, measured in standard deviation units for continuous variables or in percentage points for categorical variables, with the latter based on estimated percentages of P=50 percent and P=20 percent.[5]

The second type of analysis will compare population means or population percentages between subgroups of particular policy interest. The primary comparisons will be between districts that did or did not receive Title IV-A allocations of $30,000 or more. These analyses will include statistical tests for the differences between groups and will produce p-values that characterize the probability of observing a difference as extreme as the observed difference in the sample if, in fact, the population difference were zero. The expected precision for these analyses is described in terms of minimum detectable differences (MDDs) that the study will have an 80 percent chance of detecting.

The expected sample size and confidence interval half-widths for the descriptive analysis of the full sample are provided in the first row of Exhibit 2. For a continuous variable, the 95 percent confidence interval for the estimated population mean is expected to be plus or minus 0.06 standard deviation units. For a binary (yes/no) measure where 50 percent of districts have “yes” responses, the confidence interval around that percentage will be plus or minus 3.1 percentage points. For an estimated population percentage of 20 percent “yes” responses, the confidence interval will be plus or minus 2.5 percentage points.
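
The half-widths above can be approximately reproduced from the Exhibit 1 assumptions. The sketch below applies the standard half-width formula with the overall design effect of 1.041 and a finite population correction; under those assumptions it matches the Exhibit 2 values to rounding. This is a reconstruction for illustration, not necessarily the exact calculation the study team performed.

```python
import math

def ci_half_width(n, deff, N, sd=None, p=None, z=1.96):
    """Approximate 95% confidence interval half-width with a design effect and a
    finite population correction; pass sd for a continuous measure or p for a
    binary measure."""
    unit_var = sd ** 2 if sd is not None else p * (1 - p)
    fpc = math.sqrt(1 - n / N)                 # finite population correction
    return z * math.sqrt(deff * unit_var / n) * fpc

n, N, deff = 965, 15_963, 1.041                # analytic sample, frame size, design effect
print(ci_half_width(n, deff, N, sd=1))         # ~0.062 SD units -> reported as ±0.06
print(ci_half_width(n, deff, N, p=0.5))        # ~0.031 -> ±3.1 percentage points
print(ci_half_width(n, deff, N, p=0.2))        # ~0.025 -> ±2.5 percentage points
```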

Some of the analyses will be focused on the subset of districts that did not transfer all of their Title IV-A funds to other programs. Given the assumptions in Exhibit 1 about the percent of districts that will have transferred all of their funds, this subset will consist of approximately 739 districts. The second row of Exhibit 2 shows the expected half-widths of 95 percent confidence intervals for the analyses of this subset of districts.

Exhibit 2. Expected Sample Sizes and Estimated(a) Precision for Full Sample and Subsample

| Sample | Expected Number Sampled | Expected Number in Analytic Sample | 95% CI Half-Width: Continuous Measures (SD Units)(b) | 95% CI Half-Width: Binary Measures, P=50% (Percentage Points)(c) | 95% CI Half-Width: Binary Measures, P=20% (Percentage Points)(d) |
|---|---|---|---|---|---|
| Full sample | 1,135 | 965 | ±0.06 | ±3.1 | ±2.5 |
| Sub-sample: Districts that did not transfer all Title IV-A funds | 1,135 | 739 | ±0.07 | ±3.6 | ±2.9 |

(a) Precision using the sample assumptions in Exhibit 1.
(b) Measured in standard deviation units.
(c) For an estimated percentage of 50%.
(d) For an estimated percentage of 20%.


For the full sample, the expected minimum detectable differences (MDDs) for comparisons between districts with Title IV-A allocations greater than or equal to $30,000 and those with lower allocation amounts are 0.18 standard deviation units for a continuous outcome and 9 and 7 percentage points for binary outcomes with prevalence rates of 50 and 20 percent, respectively (Exhibit 3, top panel). MDDs for comparisons within the sub-sample of districts that did not transfer all of their Title IV-A funds to other programs are shown in the bottom panel of Exhibit 3.
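
These MDDs are consistent with the standard two-group formula applied to the expected numbers of completed surveys and the stratum design effects in Exhibit 1. The sketch below is a reconstruction under those assumptions (80 percent power, two-tailed alpha of 0.05), not necessarily the study team's exact computation.

```python
import math

def mdd(n1, n2, deff1=1.0, deff2=1.0, sd=1.0, z_alpha=1.96, z_power=0.84):
    """Minimum detectable difference for a two-sided test at alpha = 0.05 with
    80 percent power, allowing a separate design effect for each group."""
    se_diff = sd * math.sqrt(deff1 / n1 + deff2 / n2)
    return (z_alpha + z_power) * se_diff

# Full sample: 385 high-allocation vs. 580 low-allocation expected completes.
print(mdd(385, 580, 0.927, 0.966))                      # ~0.18 standard deviation units
print(mdd(385, 580, 0.927, 0.966, sd=0.5))              # ~0.09, i.e., 9 percentage points (P=50%)
print(mdd(385, 580, 0.927, 0.966, sd=math.sqrt(0.16)))  # ~0.07, i.e., 7 percentage points (P=20%)
```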

Exhibit 3. Expected Minimum Detectable Differences(a) for Subgroup Comparisons

| Groups Compared | Expected Number Sampled | Expected Number in Analytic Sample | MDD: Continuous Measures (SD Units)(b) | MDD: Binary Measures, P=50% (Percentage Points)(c) | MDD: Binary Measures, P=20% (Percentage Points)(d) |
|---|---|---|---|---|---|
| Full sample (all districts): Districts with Title IV-A allocations >= $30,000 versus districts with allocations < $30,000 | 1,135 | 965 | 0.18 | 9 | 7 |
| Sub-sample (districts that did not transfer all Title IV-A funds to other programs): Districts with Title IV-A allocations >= $30,000 versus districts with allocations < $30,000 | 1,135 | 739 | 0.20 | 10 | 8 |

(a) Minimum detectable differences (MDDs) were estimated for 80 percent power and a two-tailed test with an alpha level of p=0.05, using the sample assumptions in Exhibit 1.
(b) Measured in standard deviation units.
(c) For an estimated percentage of 50%.
(d) For an estimated percentage of 20%.

Unusual problems requiring specialized sampling procedures

We do not anticipate any unusual problems that require specialized sampling procedures.

Any use of periodic (less frequent than annual) data collection cycles to reduce burden

ED is considering administering the district Title IV-A coordinator survey in both 2022 and 2024 in order to understand how implementation of this relatively new program changes over time.

B.2.2. Estimation procedure

Not applicable.

B.2.3. Degree of accuracy needed for the purpose described in the justification

The expected 85 percent survey response rate will lead to survey estimates with a high degree of accuracy even if non-response bias is present.

B.2.4. Unusual problems requiring specialized sampling procedures

Not applicable.

B.2.5. Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

ED is considering administering the district Title IV-A coordinator survey in both 2022 and 2024 in order to understand how implementation of this relatively new program changes over time.

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

This is the first national implementation study of the Title IV-A program. As such, one challenge of this study will be communicating the importance of the data collection, motivating participation, and following up with initial non-respondents to achieve our target response rate of at least 85 percent. The survey will include language citing that participation is required of states under Section 8304(a)(6)(B) of the ESEA and of districts under Section 8306(a)(4) of the ESEA. Additionally, the web-based approach will allow us to easily identify non-respondents for follow-up contact to encourage participation and maximize response rates.

Specifically, we will use the following methods to encourage participation:

  • Presentation of study at state coordinator meeting. ED shared the goals, study components, benefits of participation, and timeline for the study during the December 2019 state coordinators meeting. ED and contractor staff were available to answer questions about the study.

  • Pre-study notification of state and district superintendents. ED will send letters to state and district superintendents introducing the Abt study team and informing them about the goals of the Title IV-A study. The letters will also include information on the timeline, provisions for maintaining anonymity of participants, and the benefits to be derived from the study. In addition, we will include the name of the ED contact and the Abt Study Director so that participants may ask questions about the study. The letters from ED will add credibility to the email solicitation that will come from Abt a week or so later in order to encourage participation and increase study response rates. The notification will also include a study fact sheet describing the goals of the study and the study timeline.

  • Pre-study notification from Abt to state and district Title IV-A coordinators. Abt will send letters to all state coordinators and the sample of district coordinators describing the study, providing an explanation of the data collection activities, and requesting their participation. We will also include a study fact sheet describing the goals of the study and the study timeline, as well as a contact name in case there are questions.

  • Survey request email from Abt to state and district Title IV-A coordinators. At the designated opening date, the study team will send an email message to respondents with a unique survey link, detailed instructions, the closing date, and project staff contact information. Detailed on-screen instructions will also be included in the survey itself. Throughout the data collection cycle, the study will maintain a dedicated email address so that potential respondents can quickly and easily obtain answers to questions or concerns.

  • Customized reminder emails. A customized reminder email will be sent to study participants approximately two weeks after the initial invitation to encourage anyone who has not yet participated and to thank those who completed the survey. Additional reminder emails will be sent to non-respondents weekly. A final reminder email will be sent to participants one week before the close of the survey to again remind those who have not yet participated.

Copies of the informational letters to chief state school officers, district superintendents, and state and district Title IV-A coordinators are included in Appendix A.

Abt expects a survey response rate of at least 85 percent and will implement follow-up procedures with non-respondents to increase participation. If the survey response rate is less than 85 percent, two types of analyses will be performed to assess the implications of non-response. First, the available characteristics of the districts that completed the surveys (e.g., region, allocation of $30,000 or more, urbanicity, poverty level, district size) will be compared to the characteristics of those that did not. Second, a statistical model using these characteristics will be estimated to predict the probability that a district responded to the survey request. If these analyses point to the possibility of non-response bias, sampling weights will be adjusted based on the observable baseline characteristics.
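
One common way to carry out the second analysis and the subsequent weight adjustment is a response-propensity model. The sketch below is a hypothetical illustration (the variable names and data are invented, and the study's specified procedure may differ): it fits a logistic regression of response status on available district characteristics and divides respondents' base weights by their predicted response probabilities.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data for the 1,135 sampled districts: response status plus
# characteristics known for both respondents and non-respondents.
rng = np.random.default_rng(1)
districts = pd.DataFrame({
    "responded":    rng.binomial(1, 0.85, size=1135),
    "high_alloc":   rng.binomial(1, 0.40, size=1135),   # allocation >= $30,000
    "urban":        rng.binomial(1, 0.30, size=1135),
    "poverty_rate": rng.uniform(0.05, 0.60, size=1135),
    "base_weight":  rng.uniform(10, 20, size=1135),
})

# Logistic regression predicting the probability that a district responded.
X = sm.add_constant(districts[["high_alloc", "urban", "poverty_rate"]])
fit = sm.Logit(districts["responded"], X).fit(disp=0)
districts["p_respond"] = fit.predict(X)

# Propensity-based nonresponse adjustment: inflate respondents' base weights.
respondents = districts[districts["responded"] == 1].copy()
respondents["adj_weight"] = respondents["base_weight"] / respondents["p_respond"]
print(fit.params)
print(respondents[["base_weight", "adj_weight"]].describe())
```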

B.4. Test of Procedures and Methods to be Undertaken

Survey Pilot Testing. We refined the surveys based on initial feedback from the study’s technical work group (TWG) and public comments. We then programmed the surveys and conducted pilot testing. Abt followed a pre-determined script to ensure that all skip patterns and all open-ended and multiple-choice questions worked as intended. We fixed glitches identified through this process before piloting the surveys with six Title IV-A district coordinators who will not be included in the nationally representative sample and with four Title IV-A state coordinators. To avoid losing sample from the census of state coordinators, we will enter the data of state coordinators who completed a pilot survey into the online form and ask them to review their existing responses for accuracy (so that we can make any necessary updates) and to complete any new items.

Pilot respondents were asked to provide feedback on the clarity and content of the questions, the usability of the survey, and the time it took them to complete the survey via cognitive telephone interviews directly following each respondent’s completion of the survey. As a result of pilot testing, survey items were revised as follows:

  • Questions that asked about the proportion of funds allocated or transferred were changed to dollar values to make the desired information clearer and easier to report;

  • Definitions of terms that were not clear, such as “Fiscal Year 2021” or “new activities,” were clarified;

  • Two new items were added to the state survey asking about waivers due to the coronavirus pandemic; and

  • A few new response options were added, and others were deleted to better account for programmatic concerns and areas of interest.



Pilot respondents did not express concern about the survey structure.  All surveys were completed within the estimated times.

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting or Analyzing Data

The lead contractor for collection and analysis of data in this study is Abt Associates, which is being supported by RMC. The following individuals were consulted on the statistical aspects of the study:

| Name | Company | Project Role | Email |
|---|---|---|---|
| Cristofer Price | Abt | Director of Analysis | [email protected] |
| Lou Rizzo | Westat | Sampling/Surveying | [email protected] |
| Keith Rust | Westat | Sampling/Surveying | [email protected] |
| Patty Troppe | Westat | Sampling/Surveying | [email protected] |

The following individuals will be responsible for the data collection and analysis:

| Name | Company | Project Role | Email |
|---|---|---|---|
| Ellen Bobronnikov | Abt | Project Director | [email protected] |
| Allan Porowski | Abt | Senior Advisor | [email protected] |
| Cristofer Price | Abt | Director of Analysis | [email protected] |
| Radha Roy | Abt | Data Collection Lead | [email protected] |
| Diana Sharp | RMC | Senior Advisor | [email protected] |
| Paul Smokowski | RMC | Senior Advisor | [email protected] |



[1] Since the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa pool their Title IV-A funds as part of their consolidated budgets, they will not be included in the study.

[2] https://ies.ed.gov/ncee/pubs/20174014/

[3] For a very small number of districts with an unknown allocation for FY 2020, we will use the most recent amount available (FY 2019).

[4] The sample design also had the objective of being coordinated with the sample designs of two other IES implementation studies with district data collections in the 2021-22 school year: the IES Title III Study and the Implementation of Title I/II-A Program Initiatives study. IES is coordinating the district sample designs across the three studies in an effort to minimize the extent to which districts are sampled for more than one of the three studies.



[5] The variance of a proportion is largest (and therefore confidence intervals are the widest) at P=50% and decreases as P approaches 0 or 100% for a given sample size.

