Evaluation of the ESSA Title I, Part C, Migrant Education Program (Recruitment Phase)

OMB: 1875-0285

Evaluation of the ESEA Title I, Part C Migrant Education Program Serving Children of Agricultural Workers and Fishers



Task 2.2: Draft OMB Package #1: Part B, Collections of information employing statistical methods




Contract GS-10F-0554N/BPA Order ED-PEP-16-A-0005/TO01


SRI Project P24149






Submitted to:

Joanne Bogart

Carlos Martinez

Policy and Program Studies Service

U.S. Department of Education

400 Maryland Avenue, SW

Washington, DC 20202






Prepared by:

SRI International

Deborah Jonas

Rebecca Schmidt

Jaunelle Pratt-Williams

Shari Golan


Policy Studies Associates

Leslie Anderson

Julie Meredith

Jackie MacFarlane









B. Collections of information employing statistical methods

1. Respondent universe and selection methods

The U.S. Department of Education (the Department) contracted with SRI International (SRI) and research partners Policy Studies Associates (PSA) and Arroyo Research Services (ARS) to administer two surveys and to carry out case studies at the state, regional/district, and school/project levels to evaluate the implementation of the Migrant Education Program (MEP).

As of 2014–15, 47 states received MEP grants, and those states in turn awarded subgrants to an estimated 813 regional and local service providers, which coordinate with local schools and program partners to deliver academic and support services to eligible migrant children and youth. Exhibit 1 provides the universe of state, regional, district, and school-level respondents; the number of respondents that will be selected to participate in each data collection activity; and the expected response rates.

Exhibit 1. Universe of respondents and sample selection

| Data collection activity | Universe of respondents | Sample selection | Expected response rate |
|---|---|---|---|
| Survey, State Directors of Migrant Education | 47 grantees (estimated based on grants to SEAs awarded in 2015); 7 nongrantees (SEAs in non-grantee states, U.S. Territories, and Department of Defense and Bureau of Indian Education schools) | Universe (47 grantees) | > 90 percent |
| Survey, subgrantee program coordinators | 813 subgrantees (estimated); 12,678 nonsubgrantees (estimated based on the number of school districts nationwide, 2013–14, Digest of Education Statistics Table 214.3) | Universe (813 subgrantees) | > 85 percent |
| Case study interviews, state staff | 47 grantees (estimated based on grants to SEAs awarded in 2015); 7 nongrantees (SEAs in non-grantee states, U.S. Territories, and Department of Defense and Bureau of Indian Education schools) | 10 sites for State Directors and other MEP staff (up to 4 respondents per site); 10 sites for directors of other state-level agencies collaborating with the state MEP (up to 3 per site) | 100 percent |
| Case study interviews, regional/district | 813 (estimated) | 20 sites (2 in each state, up to 3 respondents per site) | 100 percent |
| Case study interviews, school or project | 98,224 (total number of schools in the United States, 2013–14, Digest of Education Statistics Table 216.20) | 40 sites (2 in each district, up to 4 respondents per site) | 100 percent |

2. Procedures for the collection of information

Different methods will be used to sample participants for the surveys and case studies, as described below.

Survey Sampling

The survey sampling plan includes all State Directors of Migrant Education and all regional and local MEP coordinators. Surveying the population of all current subgrantees (approximately 813 respondents) will enable the evaluation to:

  • Account for variation in subgrantee strategies within a state. Program development, implementation, support activities, and the times of year when services are provided are likely to vary both across and within states. By including all subgrantees from all grantee states, the evaluation can account for variation in approaches that may be due to geographic concentrations of migrant student populations or to the times of year when migrant students are present.

  • Be efficient. Surveying the universe of subgrantees will be more efficient than designing a sampling frame that accounts for the wide variation in the number of subgrantees (and the projects they support) by state. For example, one state reported 2,886 projects in 2014–15, while another reported nine projects serving migrant students.

  • Achieve the smallest margin of error. Surveying the universe of state MEP directors and regional and local subgrant coordinators will yield the smallest margin of error for the anticipated sample sizes, given the expected 85 percent response rate (see the illustrative calculation following this list).
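
To illustrate the margin-of-error point, the sketch below computes a 95 percent confidence margin of error for a proportion under a finite population correction, using the universe sizes and expected response rates from Exhibit 1. This is an illustrative calculation only; the function name and the worst-case assumption p = 0.5 are ours, not part of the study plan.

```python
import math

def margin_of_error(population: int, respondents: int,
                    p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion estimated from a census with
    nonresponse, applying the finite population correction (FPC).

    population  -- size of the universe invited to respond
    respondents -- expected number of completed surveys
    p           -- assumed proportion (0.5 is the conservative worst case)
    z           -- critical value (1.96 for a 95 percent confidence level)
    """
    fpc = math.sqrt((population - respondents) / (population - 1))
    return z * math.sqrt(p * (1 - p) / respondents) * fpc

# Expected completes from Exhibit 1: 85 percent of 813 subgrantee
# coordinators and 90 percent of 47 State Directors.
print(margin_of_error(813, round(0.85 * 813)))  # ~0.014 (about 1.4 points)
print(margin_of_error(47, round(0.90 * 47)))    # ~0.050 (about 5 points)
```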

Case Study Sampling

The case study sample includes 10 states, two regional or local subgrantee sites in each state (20 total), and two schools or projects in each regional or local subgrantee site (40 total). The study will use a three-stage process to select case study sites. Sampling begins by selecting 10 MEP-funded states, followed by selecting two MEP subgrantees within each of those states, and finally selecting two schools or projects within each of the MEP subgrantee sites. At each step, selection will be based on strata that can impact program implementation.

State Grantee Sample. The framework for state MEP grantee selection will be based on three factors: the size of the eligible migrant student population, the percentage of eligible migrant students the state serves, and the percentage of students identified as Priority for Service (PFS) that the state serves. Once the study team identifies candidate states based on the overall framework, the final sample of 10 states will be selected to reflect variation in more specific program implementation factors.

Size of the eligible migrant population. Based on data available in ED Data Express, in 2014–15 (the most recent year available), an estimated 332,335 migrant students in the United States were eligible for MEP-funded services and support in 47 states. Although the median eligible student population by state was 1,658 students, the number of eligible students varied widely across states. Using natural breaks in the distribution of eligible students by state, the study will select states based on the number of students in the eligible population in each of the following categories (a minimal coding sketch of this binning follows the list):

  • Small eligible migrant population: N < 500

  • Mid-size eligible migrant population: N = 500–1,499

  • Large eligible migrant population: N = 1,500–9,000

  • Very large eligible migrant population: N > 9,000
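
As a minimal sketch of the binning, assuming only the category breaks listed above (the function name is illustrative):

```python
def size_stratum(eligible_students: int) -> str:
    """Assign a state to an eligible-migrant-population size stratum
    using the natural-break categories listed above."""
    if eligible_students < 500:
        return "small"
    elif eligible_students < 1_500:
        return "mid-size"
    elif eligible_students <= 9_000:
        return "large"
    return "very large"

# The median state population cited above (1,658 eligible students)
# falls in the "large" stratum.
assert size_stratum(1_658) == "large"
```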



Percentage of the eligible population receiving MEP-funded services. There is wide variation in the percentage of eligible migratory students that states serve with MEP funds, with states serving from 27 to 100 percent of eligible students. This variation may be associated with differences in implementation approaches, such as the types of services states offer, the availability of personnel, and approaches to serving PFS-eligible students versus all students. Therefore, the percentage of migrant students served is a critical state-level selection criterion.

Percentage of eligible migrant students identified as Priority for Service. Before the Every Student Succeeds Act (ESSA) became law in December 2015, migratory students and out-of-school youth were designated as Priority for Service (PFS) if their school year was interrupted by a qualifying move and they were failing academically. ESSA expanded the PFS definition to include students and out-of-school youth who made a qualifying move within the previous one-year period (not just during the school year) and who were failing, at risk of failing, or had dropped out of school. Analysis of extant data shows that the percentage of eligible migratory students identified as PFS varies across and within MEP states serving populations of all size categories; for example, it ranges across states from zero to 94 percent of eligible students. Furthermore, states with very large eligible populations, on average, identify a smaller percentage of eligible students as PFS. Therefore, the selection of MEP-funded states will account for the percentage of eligible students identified as PFS. This information will help identify factors associated with the variation, such as states' capacity to identify PFS students, the approaches states take to identify these students, and differences in the needs of the migrant student populations they serve.

Exhibit 2 displays the sampling framework for selecting states to include in the case study.

Exhibit 2. Sampling framework to select states for case study participation

| Migrant student population size | Percentage of eligible migrant student population served | Percentage of eligible migrant students identified as PFS | Total number of states |
|---|---|---|---|
| Small | 1 High | | 1 |
| Mid-size | 1 Low; 1 High | 1 High | 3 |
| Large | 1 Low; 1 High | 1 Low | 3 |
| Very large | 1 High | 1 Low; 1 High | 3 |
| Total number of states | 6 | 4 | 10 |






After identifying states that meet the sampling framework requirements displayed in Exhibit 2, state selection will proceed to capture variation in the following programmatic elements:

Type of service provided. States regularly report the number and percentage of eligible students being served with MEP-funded instructional services, nonacademic support, and referral services. The final sample will reflect states with varying distributions of services in these categories.

Service delivery model. Programs can offer MEP-funded services in year-round programs, during the school year only, or during the summer only. The final sample will include states that offer services using different models to provide information on the factors that affect state decision making regarding the selection of service delivery models.

Percentage of out-of-school youth (OSY) identified and served. Among states with migrant OSY, the percentage of these students served in 2014–15 varied across states, ranging from zero to 100 percent. In addition, some states that receive MEP funding may not identify or serve OSY as part of the MEP. The final selection of states for case study participation will include consideration of this measure to understand factors that contribute to variation in OSY being identified and served.

Percentage of students receiving high school credit accrual services. The study team will consider including the percentage of students receiving credit accrual services as a state selection factor. The percentage of students receiving high school credit accrual services varies across the 47 MEP-funded states, but it does not appear to be related to migrant population size: it varies as much among the states with the smallest migrant student populations (from 0 percent in Wyoming to 25 percent in Maryland) as it does among states with the largest migrant student populations (from 2 percent in Florida to 31 percent in Oregon). In fact, only a handful of states (Louisiana, Maryland, Montana, Oklahoma, Oregon, and Wisconsin) are delivering credit accrual services to a sizable percentage of eligible migratory youth, and only Wisconsin is serving the majority of eligible high school students and OSY.

State is a direct service provider. We anticipate that a handful of states are delivering services directly to eligible migrant students and will consider these states, once identified, among the candidates for inclusion in the case study sample.

The final step in selecting the sample of 10 case study states will be to seek nominations from the technical working group and from the Department. The purpose of this step is to identify states that have a reputation for engaging in innovative and promising practices and to increase the likelihood that the 10-state sample reflects variation in these factors.

Local-Level Site Selection Criteria. There is currently no single comprehensive data source available to document the subgrantees receiving MEP funds in all 47 states. On the basis of expert knowledge and a review of selected state evaluation reports, we know that subgrantees vary across and within the 47 MEP-funded states on several characteristics, including grant size, grantee type, and service delivery model. On receipt of OMB clearance, the study team will collect information from State Directors of Migrant Education about their subgrantees; researchers will then identify the distribution of subgrantees and define categories of subgrant size, subgrantee type, and service delivery model across the 47 MEP-funded states. The following describes the selection criteria in greater detail.



Subgrant size. Within states, there are differences in the size of grant awards to subgrantees. State decisions about the distribution of MEP funds may be driven by, for example, the size of the migrant population the local grantee serves, the capacity of the local grantee to identify and deliver services to the migrant student population, or the availability of local funding and support (e.g., through local community-based organizations and/or social service providers) to supplement local MEP-funded programming. In selecting subgrantees for case studies (two in each of the 10 case study states), the study team will consider variation in grant size among subgrantees and the number of migrant students they serve relative to other subgrantees in the state. Accordingly, for the case study sample of 20 local MEP grantees, we recommend sampling to achieve a relatively even distribution of subgrantees by grant size category, with six grantees that have small grants, seven that have medium-size grants, and seven that have large grants.

Subgrantee type. Local MEP grantee types include local education agencies (LEAs), local operating agencies (LOAs), and regional education service providers (RESPs), the latter of which may deliver services directly to eligible migrant students or award MEP subgrants to LEAs and LOAs. The study team will select two subgrantees in each of the 10 case study states to reflect the within-state distribution of grantee types. For example, if two-thirds of the grantees in a given state are LEAs and one-third are LOAs, then researchers will select one LEA and one LOA for that state. Overall, the case study team will work to achieve a case study sample that reflects the distribution of grantee types across the 47 MEP-funded states.

Service delivery model. Subgrantees deliver services at different times during the year, and the site selection criteria will account for these differences at the local level. Specifically, within each of the 10 case study states, the study team will work with state MEP directors to select grantees for case study that serve students during the regular school year, during the summer/intersession term, or year-round. To the extent that local service delivery models vary, the study team will select two subgrantees that represent the distribution of service delivery models within a given state. Ideally, the full sample will include one local grantee within each grant size category (i.e., small, medium, and large) that represents one type of service delivery model (i.e., year-round services, summer/intersession services, regular school year, extended day).

Distribution and Selection of Subgrantee Case Study Sites. The information on the distribution of subgrantees by subgrant size, grantee type, and service delivery model informs the preliminary sampling framework (Exhibit 3). For example, we anticipate selecting one subgrantee of each type (LEA, LOA, RESP) within each MEP subgrant size category. However, because regional education service providers are unlikely to receive small MEP subgrants, the small category will not include a RESP; accordingly, the study team will select two small subgrantees by type, an LEA and an LOA. Next, we will select four small subgrantees by service delivery model: one that delivers year-round services, one that delivers services during the summer or intersession period, one that delivers services during the school year, and one that delivers extended-day services to eligible migrant students. For the medium and large categories, we will select three subgrantees by type (one LEA, one LOA, and one RESP) and four by service delivery model (one year-round, one summer/intersession, one school year, and one extended-day program).



Exhibit 3. Preliminary case study sampling framework for MEP subgrantees based on subgrant size, subgrantee type, and service delivery model

| MEP subgrant size | Subgrantee type (LEA, LOA, RESP) | Service delivery model (year-round, summer/intersession, school year, extended day) | Total number of subgrantees |
|---|---|---|---|
| Small | 1 LEA; 1 LOA | 1 year-round; 1 summer/intersession; 1 school year; 1 extended day | 6 |
| Medium | 1 LEA; 1 LOA; 1 RESP | 1 year-round; 1 summer/intersession; 1 school year; 1 extended day | 7 |
| Large | 1 LEA; 1 LOA; 1 RESP | 1 year-round; 1 summer/intersession; 1 school year; 1 extended day | 7 |
| Total number of subgrantees | 8 | 12 | 20 |
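
To make the framework concrete, the following is a minimal sketch of filling the Exhibit 3 cells from a list of candidate subgrantees. The `Subgrantee` record and the selection loop are illustrative assumptions, not the study's procedure; the actual selection will also weigh the programmatic factors described below.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Subgrantee:
    name: str
    size: str   # "small", "medium", or "large" subgrant
    kind: str   # "LEA", "LOA", or "RESP"
    model: str  # "year-round", "summer/intersession", "school year", or "extended day"

# Per Exhibit 3: RESPs are not expected among small subgrants, so the
# small category draws on only two subgrantee types.
TYPE_CELLS = {
    "small": ("LEA", "LOA"),
    "medium": ("LEA", "LOA", "RESP"),
    "large": ("LEA", "LOA", "RESP"),
}
MODEL_CELLS = ("year-round", "summer/intersession", "school year", "extended day")

def fill_framework(candidates: list[Subgrantee]) -> list[Subgrantee]:
    """Pick one candidate per (size, type) cell and one per (size, model)
    cell: up to 6 small + 7 medium + 7 large = 20 sites."""
    selected: list[Subgrantee] = []
    for size, kinds in TYPE_CELLS.items():
        for kind in kinds:
            pick = next((c for c in candidates
                         if c.size == size and c.kind == kind
                         and c not in selected), None)
            if pick:
                selected.append(pick)
        for model in MODEL_CELLS:
            pick = next((c for c in candidates
                         if c.size == size and c.model == model
                         and c not in selected), None)
            if pick:
                selected.append(pick)
    return selected
```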



Select for Programmatic Variation. Once the study team has confirmed the sampling framework and identified subgrantee sites that meet each criterion, selection will proceed by considering and seeking variation based on known programmatic elements. The extent to which the study team can consider these additional selection criteria, however, will depend entirely on the data states collect on their subgrantees and their willingness to make these data available to the study team. Additional programmatic elements include:

  • Type of service provided. The study team will seek to capture the variation in the distribution of MEP-funded services—including MEP-funded instruction, nonacademic support, and referral services—provided to eligible migrant students among the local sites selected for the case study sample.

  • Percentage of out-of-school youth (OSY) identified and served. The study team will consider the percentage of OSY identified and served as a site selection factor, particularly sites that identify no OSY, sites that serve no OSY, and sites that both identify and serve large percentages of OSY compared with other subgrantees within and across states.

  • Percentage of students receiving high school credit accrual services. The study team will consider including the percentage of students receiving credit accrual services as a site selection factor, particularly sites that are serving both large and small percentages of migrant students in grades 9–12 and those who are OSY.

Similar to the selection procedures described for the case study states, a final step in selecting the sample of 20 local sites for case study will be to seek nominations from State Directors of Migrant Education. The purpose of this step is to identify subgrantees that have a reputation for engaging in promising practices and include them in the sample.

School-Level Site Selection Criteria

For efficiency in site selection, the study team will work with each of the 20 local case study sites to nominate two schools to participate in the case studies. The team will provide the local MEP coordinators with a list of selection criteria to use in nominating their schools, including (1) size of the identified migrant student population (e.g., not fewer than 25 students); (2) percentage of migrant students served (e.g., not less than 30 percent); (3) service delivery model (e.g., one school providing year-round services, one providing summer/intersession services only, one providing regular school year services only, and one providing all three); (4) school level (e.g., two elementary schools serving migrant students, one middle school, and one high school); and (5) schools that have developed promising practices for serving migrant students. With each coordinator's school nominations in hand, the study team will screen to identify two schools per site (40 schools total) that vary across key characteristics (based on the selection criteria) and are willing to participate in the study.
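
The nomination screen could be expressed as a simple filter over the example thresholds above (25 identified migrant students, 30 percent served); the `School` fields below are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class School:
    name: str
    migrant_students_identified: int
    pct_migrant_served: float  # 0-100
    level: str                 # "elementary", "middle", or "high"
    model: str                 # e.g., "year-round", "summer/intersession"

def meets_nomination_criteria(school: School) -> bool:
    """Apply the example size and service thresholds from the criteria above."""
    return (school.migrant_students_identified >= 25
            and school.pct_migrant_served >= 30.0)

# Example: a school identifying 40 migrant students and serving 55 percent
# of them passes the screen.
assert meets_nomination_criteria(
    School("Example Elementary", 40, 55.0, "elementary", "year-round"))
```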

Exhibit 4 provides a summary of the sampling framework for selecting states and local- and school-level sites for case studies.

Exhibit 4. Sampling variables used in case study selection

  1. Select 10 MEP-funded states that vary by:

    1. Size of migrant student population eligible for MEP-funded services

    2. Percentage of migrant student population receiving MEP-funded services

    3. Percentage of eligible migrant students identified as Priority for Service (PFS)

    4. Programmatic variation and promising practices for serving migratory students



  2. Within each of the 10 states, select two subgrantees (20 total) based on:

    1. Type (RESP, LOA, LEA)

    2. Service delivery model (e.g., year-round, summer)

    3. Size of subgrant award

    4. Programmatic variation and promising practices for serving migratory students



  3. Within each of the 20 MEP subgrantee sites, select two schools or projects (40 total) from subgrantee nominations based on:

    1. Size of migrant student population identified and served

    2. Service delivery model (e.g., year-round, summer)

    3. School level (elementary, middle, high school)

    4. Promising practices for serving migrant students





3. Methods to maximize response rates and to deal with issues of nonresponse

Multiple methods, both already in place and planned, will be used to maximize response rates and to address nonresponse.

The Office of Migrant Education (OME) has notified grantees that it is planning to carry out this study and is regularly providing updates about study progress. This involvement has provided OME with stakeholder input on the study focus and design, which supports buy-in and can ultimately increase response rates.

To solicit participation from MEP administrators at the state, subgrantee, and school/project levels, the study team will engage in a two-step process. The first step is to provide direct notification of the study and plans for data collection to the relevant state administrators, including the chief state school officers and the State Directors of Migrant Education in all grantee states. The study team will mail letters from the U.S. Department of Education inviting State Directors of Migrant Education to participate in the study. This first notification will include: (a) a study description with a discussion of its importance, purposes, and products; (b) information on the data collection schedule and plans; (c) provisions for maintaining anonymity of participants and data security; (d) the organizations and staff involved in the study; and (e) the benefits to be derived from the study. The state notification will also request information about subgrantees in each state to permit local sampling, and explain that the study has received OMB clearance. The letters will include names, phone numbers, and email addresses of Department staff and study team members who are available to answer questions about the study. Within one week of sending these letters, a member of the study team will follow up with each State Director to facilitate collection of subgrantee contact information and to answer any questions about the study.

On finalizing the subgrantee case study sample, the study team will send letters from the U.S. Department of Education to district superintendents, local program leaders, and directors of regional education service agencies informing them of the study. Each letter will include: (a) a study description with a discussion of its importance, purposes, and products; (b) information on the data collection schedule and plans; (c) provisions for maintaining anonymity of participants and data security; (d) the organizations and staff involved in the study; and (e) the benefits to be derived from the study. The letters will also explain that the study received OMB clearance and IRB approval. The letters to the 20 subgrantees selected for participation in case studies will include details about the timing and requirements for participating in the case study data collection. Finally, the letters will include the names, telephone numbers, and email addresses of Department staff and study team members who are available to answer questions about the study.

Survey data collection and follow-up. One week after mailing the notification letters, the study team will begin survey data collection by sending emails to State Directors of Migrant Education and local MEP coordinators, inviting them to participate in the online survey via a unique link.

A week after survey launch, the study team will begin following up with nonrespondents by email. The email will remind nonrespondents of the survey due date and invite them to contact the survey administrator with any questions or concerns. The study team will continue following up with nonrespondents via email approximately once a week for three weeks. For persistent nonrespondents, the study team will follow up by telephone. If we are unable to generate a response, lead researchers, in consultation with the Department, will identify critical items in the survey and attempt to administer the survey by telephone. The study team will complete survey data collection by December 2017, with a goal of approximately 85 percent of both State Directors of Migrant Education and local MEP coordinators responding.
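
For reference, the follow-up cadence described above can be summarized as a simple schedule; the day offsets are our approximation of the weekly intervals in the text, not fixed study parameters:

```python
# Nonresponse follow-up sequence, in days after survey launch.
FOLLOW_UP_SCHEDULE = [
    (7,  "email reminder noting the due date and survey administrator contact"),
    (14, "email reminder"),
    (21, "email reminder"),
    (28, "telephone follow-up"),
    (35, "telephone administration of critical items, per Department consultation"),
]

for day, action in FOLLOW_UP_SCHEDULE:
    print(f"Day {day:>2}: {action}")
```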

Case study collection and follow-up. During the survey data collection window, the study team will contact State Directors and local MEP coordinators to coordinate and schedule case study data collection activities. To maximize participation, the study team will work with state and local coordinators to develop a site-visiting schedule that maximizes the study team’s time with MEP staff while minimizing burden. In those instances where selected state or local staff cannot participate in in-person interviews, the study team will conduct interviews by telephone.

4. Tests of procedures or methods to be undertaken to minimize burden and improve utility

Survey Pilot Test

After refining the survey instruments based on initial feedback on the first draft from the Department and the study's technical working group (TWG), the study team will pilot test the surveys with a small number of State Directors of Migrant Education and local MEP coordinators nominated by OME and ARS (four to five State Directors of Migrant Education and a similar number of local MEP coordinators). An expert member of the study team will debrief each pilot test participant to verify that all questions are clear and are measuring the concepts the study intends. In addition, the pilot test will provide accurate information on the length of the survey and will inform decisions about fine-tuning, adding, and deleting questions.

Interview Protocol Pilot Testing

As part of the development process, all interview protocols will be piloted with fewer than 10 purposively selected individuals at the state and local MEP grantee levels and revised on the basis of the testing. Once potential pilot respondents are identified, and before the protocols are used in the field, study team members will conduct the pilot interviews using a think-aloud technique. With this strategy, researchers ask the interview questions, and pilot respondents provide answers while also commenting on any confusing, inappropriate, or leading questions. The study team will give special attention to the response options for the structured questions to help ensure they are clearly understood and reflect pilot respondents' experiences. Researchers will take detailed notes throughout this process, consolidate the findings, and make corresponding changes to the protocols as appropriate.



5. Names and telephone numbers of individuals consulted on statistical aspects of the design and the names of the contractors who will actually collect or analyze the information for the agency

SRI is the contractor with primary responsibility for the MEP evaluation, in collaboration with PSA and ARS. Ms. Leslie Anderson is the Project Director, and Dr. Deborah Jonas is the Deputy Project Director. Dr. Rebecca Schmidt will lead the survey analysis, and Mr. Derek Riley will lead the case study analysis; they will coordinate throughout to achieve accurate and comprehensive interpretation of study results. Dr. Harold Javitz will provide statistical consulting support throughout the project. Exhibit 5 lists the staff responsible for collecting and analyzing the study data.

Exhibit 5. Staff responsible for collecting and analyzing study data

| Name | Project role | Organization | Phone number |
|---|---|---|---|
| Leslie Anderson | Project Director | PSA | 202-939-5327 |
| Derek Riley | Senior Researcher | PSA | 202-689-5195 |
| Deborah Jonas | Deputy Project Director | SRI | 703-524-2053 |
| Rebecca Schmidt | Statistical Sampling and Lead Data Analyst | SRI | 703-247-8491 |
| Harold Javitz | Statistical Consultant | SRI | 650-859-4084 |
| Kirk Vandersall | Senior Researcher and MEP Advisor | ARS | 888-742-8723 |




