Evaluation of the Implementation of the Statewide Family Engagement Centers

OMB: 1850-0971


Implementation of the Statewide Family Engagement Centers Program

Part A: Supporting Statement for Paperwork Reduction Act Submission

November 16, 2021

Tiffany Waits, Alina Martinez, and Diana McCallum



Submitted to:

Submitted by:

U.S. Department of Education

Institute of Education Sciences

550 12th Street, S.W.

Washington, DC 20202

Project Officer: Andrew Abrams
Contract Number: 91990020D0006

Mathematica

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Alina Martinez
Reference Number: 51091








A. Justification

The Institute of Education Sciences (IES) within the U.S. Department of Education (the Department) requests clearance for new data collection activities to support an implementation study of the Statewide Family Engagement Centers (SFEC) program. This request covers primary data collection activities, including surveys and interviews of the directors of the 2018 SFEC grantees and of representatives from their partner state education agencies (SEAs).

Family engagement is a key priority in helping the Department achieve its mission.1 The SFEC program, funded at approximately $10 million per year through 2023, is authorized under Title IV, Part E, of the Elementary and Secondary Education Act (ESEA) of 1965, as amended.2 The program builds on an earlier federally funded program and requires grantees to help state and local education agencies, schools, community-based organizations, and primarily disadvantaged parents and families carry out their family engagement plans and take steps to close achievement gaps. State education agencies and sub-recipients are expected both to deliver services directly to families to increase their engagement and to provide technical assistance and training that helps district and school staff support families. This study will describe the work of the first 12 grantees funded under the new program and assess the extent to which key program priorities are being implemented.

A.1 Circumstances Necessitating the Collection of Information

This data collection on the SFECs is critical because it will help federal policymakers refine the goals and objectives of the SFEC program, as well as inform the work of education organizations and state and local education agencies beyond the current grantees to improve their work with families. Specifically, the results of this evaluation are intended to:

  1. Improve future SFEC grant competitions by apprising Congress and the Department of the extent to which preferred emphases for family engagement (as reflected in both changes in statute and the most recent Notice Inviting Applications) are being carried out;

  2. Help the Department guide SFECs to better align services to state education agencies’ needs;

  3. Provide lessons learned from the experience with the COVID-19 pandemic, including potential new strategies for family engagement; and

  4. Identify specific family engagement strategies that are commonly used or promising enough to rigorously test in future evaluations, in order to expand the evidence base on effective approaches.

A.2 Purpose and Use of the Information Collection

IES has contracted with Mathematica and its partner Manhattan Strategy Group to conduct a descriptive study to better understand SFEC program implementation. Data collection activities will allow the study to address key research questions of policy interest (Table A1).


Table A1. Research questions

Research question 1: To what extent do grantee activities reflect key program objectives, including direct services to local education agencies (LEAs) and families?

1.1. What common activities do grantees report providing?

1.2. Do these sets of related activities reflect an emphasis on direct service, as Congress signaled? Is this emphasis more common among certain types of grantees?

1.3. Are the common topical areas grantees emphasize in their family engagement activities aligned with the other program priorities (i.e., family literacy, educational choice, evidence-based direct services, and simultaneous capacity-building between schools and homes)? Do the priorities under which grantees applied play a large role in which topics get most attention?

1.4. What do grantees report as the major changes they made in planned activities and topic areas as a result of COVID? Do these changes appear to have moved grantees closer to or further from program priorities?

1.5. Did the SFECs increase the capacity of state education agencies (SEAs) to implement and sustain family engagement activities? Specifically, did SEAs report increased capacity across their state to implement direct services? Did SEAs report learning about family engagement activities that could be implemented across the state?

1.6. Did SEAs report that SFEC support was aligned with the state's most pressing family engagement priorities?

Research question 2: What factors do grantees find most important in deciding which activities to provide?

2.1 Which key factors do grantees find most influential? Why? What factors contributed to greater SFEC and SEA collaboration?

Research question 3: To what extent do grantees focus on serving disadvantaged populations?

3.1 Are grantees directing most of their efforts to serving districts and schools with large numbers of priority populations?

3.2 To what extent are outreach and services/activities directed toward different types of disadvantaged families and high-need schools and communities?

3.3 Has the focus on directing most services toward disadvantaged populations changed as a result of COVID? How?

Research question 4: What are grantees’ key challenges in meeting the objectives of the grants?

4.1. What challenges have grantees faced providing technical assistance?

4.2. What challenges have grantees encountered providing direct services?

4.3. How has COVID challenged grant implementation? Do grantees intend to incorporate any lessons learned into their post-COVID business as usual? Did SFECs provide support that expanded SEA capacity to reach LEAs and families in greatest need during COVID?

4.4. What strategies have grantees used to overcome identified challenges, both ongoing and those related specifically to COVID? What supports do grantees need to overcome remaining challenges?



To obtain information to address the research questions, new data will be collected primarily through surveys and follow-up interviews with the 12 SFEC grantee project directors and a representative from each of the 13 SEAs that partner with the SFECs (each SFEC serves one state, except for one SFEC that serves two). Interviews are necessary, in addition to surveys, to obtain details about processes and the factors motivating grantees' decisions, which cannot be captured through surveys. The SEA survey will be completed by one respondent at each SEA. The key SEA respondent may identify up to two other staff members to be interviewed, for a maximum of 39 total SEA interview respondents, if the state's work with the SFEC is dispersed across offices or groups within the agency such that interviewing only one respondent would provide an incomplete picture of implementation from the state agency perspective.

SFECs are in the fourth year of their five-year grants; the first four years cover the 2018-2019 through 2021-2022 school years. The data collected will focus largely on the most recent year, the period respondents will most readily recall, with some questions probing changes across the first four years of the grant. Table A2 indicates the primary and administrative data to be collected, including the intended use of the data, the respondent sample, and the mode and timing of the collection activities.




Table A2. Data sources, uses in study, respondent sample, and timing

Data source

Use in the study

Respondent

Mode and timing of data collection

Primary data




SFEC director survey

To identify:

  • Program implementation, including the specific services and activities implemented (RQ 1.1, 1.3)

  • Each SFEC’s percent of funds spent on direct services or technical assistance (RQ 1.2, 1.4)

  • Factors in SFECs’ decisions about which services and activities to provide, including program competitive preference priorities, existing evidence, stakeholder discussions, and needs assessments (RQ 2.1)

  • Populations served (RQ 3.2, 3.3)

  • Challenges grantees experienced (e.g., as a result of the COVID-19 pandemic) and their response to these (RQ 4.1, 4.2, 4.3, 4.4)

12 SFEC directors (one for each grantee)

Self-administered paper-and-pencil (PAPI) survey.

May 2022

SFEC director interview

To understand:

  • Reasons why grantees decided to emphasize direct services or technical assistance (RQ 2.1)

  • Why certain factors were or were not a strong influence on decisions about the services grantees implemented (RQ 2.1)

  • Reasons grantees' funding of activities did or did not align with their stated priorities (RQ 1.2, 2.1)

  • Barriers to serving disadvantaged schools and families (RQ 4.1, 4.2)

  • How grantees overcame challenges over the course of the grant (RQ 4.3, 4.4)

12 SFEC directors (one for each grantee)

Phone interview with open-ended questions and prompts that build on the survey responses

June 2022

SEA representative survey

To identify:

  • Whether the SFEC increased the state's capacity to implement and sustain family engagement activities (RQ 1.5)

  • Receipt of SFEC services that increased the state's knowledge of families' needs across the state and its engagement of appropriate stakeholders (RQ 1.5)

  • Responsiveness of the SFEC grantee team to support urgent challenges in family engagement across the state or to align with needs addressed by extant state family engagement activities (RQ 1.6)

13 SEA representatives (one at each of the 13 states that work with the SFECs)

Web-based survey with closed-ended questions

June 2022

SEA representative(s) interview

To understand:

  • How consultations with SFECs increased the capacity of SEAs to implement and sustain family engagement activities (RQ 1.5)

  • How SFECs increased SEAs’ ability to reach families during the pandemic (RQ 1.5)

  • Reasons for challenges in working with the SFECs, and satisfaction with SFEC services above and beyond existing family engagement activities and strategies (RQ 1.5)

Up to 39 SEA representatives (at least one and up to three for each of the 13 states that work with the SFECs)

Phone interview with open-ended questions and prompts that build on the survey responses

July 2022

Administrative data

Grantee Annual Performance Reports (APRs)a

To prepare SFEC surveys and interviews by identifying:

  • Types of activities that grantees were engaging in during the pandemic (including those that may be planned for recovery)

  • Challenges SFECs faced in response to the COVID-19 pandemic

None

Administrative data

Quarterly call notes

To prepare SFEC surveys and interviews by identifying:

  • Current activities that provide background for the SFEC director and SEA interviews

  • Challenges SFECs are facing in response to the COVID-19 pandemic and lessons learned in early recovery (fall 2021)

None

Administrative data

EDFacts (includes Common Core of Data),a 2019–2020, for a TBD number of districts and schools served by SFEC granteesb

To describe:

  • Demographic and performance characteristics of students in the districts and/or schools served by SFECs in 2019-2020 (RQ 3.1)

None (administrative data from National Center for Education Statistics, U.S. Department of Education)

Administrative data

a Because administrative data are provided directly by ED to the study team, they are not included in this clearance request. By the time data collection takes place, grant applications from 2018 will be available, as will Annual Performance Reports (APRs) for Years 1-2 (2018-2019 through 2019-2020) and quarterly call notes (notes from progress calls between the SFEC Program Director and each SFEC) for Years 1-3 (2018-2019 through 2020-2021).

b The number of districts and schools will be determined after review of districts and schools listed in the APRs.



A.3 Use of Information Technology and Burden Reduction

The data collection plan is designed to obtain information efficiently and to minimize respondent burden, including through the use of technology when appropriate. Data collection begins with existing administrative data, ensuring that the survey and interview questions are informed by information the SFECs and SEAs have already provided. Other strategies for minimizing burden are discussed under A.4, Efforts to Identify Duplication.

A.3.1 Administrative data

To minimize burden on the SFECs and SEAs, the study team will use data already collected by the program office (for example, grantee applications and the technical reviews of those applications, annual reports, and quarterly notes) or collected as part of existing IES data collection activities (EDFacts). A description of each administrative data source follows.

  • Grantee applications were submitted by grantees in 2018 in response to the program solicitation. We will extract select information from grantee applications, including competitive preference priorities and grantee discussion about the use of evidence to guide program activities.

  • Application technical reviews contain reviewer ratings and comments on the applications that were submitted. The study team will extract reviewers’ assessments of the evidence that grantees cite for their proposed activities as well as their assessment of the extent to which the applications address the Department’s competitive preference priorities.

  • Annual performance reports (APRs) are submitted by grantees each December. The study team will review the most recent round of APRs, which span SFEC activities conducted from 2020 to 2021, primarily to extract information about grantee challenges. The APRs will serve as sources of information on program implementation and will inform the development of the semi-structured grantee interview protocols that are described above. APRs for 2021 will contain information about the implementation of grantees' activities and the challenges they faced. They will also provide the names of the LEAs and schools that the SFECs are serving.

  • Quarterly call notes are provided by ED on an ongoing basis. The study team will review the call notes from 2020 and 2021 to reference current SFEC activities and have an informed sense of their work prior to conducting interviews. The quarterly call notes serve as an up-to-date source of current program implementation.

  • EDFacts contains the Common Core of Data (CCD), a national database with basic information on public elementary and secondary schools and LEAs (districts). The study team will use it to identify background characteristics of the student populations in the schools that SFECs serve. The team will obtain lists of the schools and districts that SFECs work with intensively from the APRs and match these to the CCD to determine the characteristics of the districts, students, and families targeted by each grantee, because conducting outreach to low-income students and parents is a program requirement. (A hypothetical illustration of this matching step follows this list.)
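To illustrate the matching step described above, the sketch below joins a list of SFEC-served schools to CCD records by NCES school ID. This is a minimal, hypothetical example: the IDs, column names (including frl_pct), and values are invented for illustration and do not reflect the actual file layouts.

```python
import pandas as pd

# Hypothetical sketch: match SFEC-served schools (extracted from APRs) to
# CCD records by NCES school ID to pull school characteristics.
sfec_schools = pd.DataFrame({
    "ncessch": ["360007702877", "390438601903"],  # 12-digit NCES school IDs
    "grantee": ["SFEC A", "SFEC B"],
})

# Stand-in for a CCD extract (in practice, loaded from the NCES file with
# IDs read as strings so leading zeros are preserved).
ccd = pd.DataFrame({
    "ncessch": ["360007702877", "390438601903"],
    "school_name": ["School 1", "School 2"],
    "frl_pct": [82.0, 64.5],  # percent free/reduced-price lunch (assumed column)
})

served = sfec_schools.merge(ccd, on="ncessch", how="left")

# Example summary: average FRL percentage among schools each grantee serves.
print(served.groupby("grantee")["frl_pct"].mean())
```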

A.3.2 SFEC director survey and SEA representative survey

The SFEC director survey (Instrument 1: SFEC Director Survey) will be administered as a paper instrument, a mode suited to the small sample size of 12 respondents. Paper is proposed instead of a web survey because the instrument contains complex formatting that would require extensive programming and testing to function correctly as a web survey prior to data collection. The associated costs make a paper mode, with a follow-up telephone call to clarify missing, illogical, or conflicting responses, the most cost-efficient approach for a study of this size. The data entry system for completed paper surveys will include built-in editing checks, and each survey will be assessed for completeness, accuracy of skips, and potential response errors. Any discrepancies will be resolved during the follow-up telephone interview. Respondents may also complete the surveys by phone if they prefer.

The SEA representative survey (Instrument 3: SEA Representative Survey) will be a web-based instrument. The web mode is appropriate for this survey, despite the small sample size of 13, because the instrument is short and straightforward and can easily be programmed in a web survey platform. The web-based SEA representative survey will include the necessary editing checks, such as automated skips, logic checks, and soft or hard checks for missing responses (a generic illustration follows).
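The sketch below illustrates, in generic form, the kinds of edit checks described above. The question names and thresholds are hypothetical; in the actual study these checks will be configured within the web survey platform rather than in standalone code.

```python
def validate_response(answers: dict) -> list[str]:
    """Return edit-check flags for a submitted survey response.

    Hypothetical illustration of hard checks, automated skips, and soft
    checks; question names and limits are invented for this sketch.
    """
    issues = []

    # Hard check: a required item cannot be left blank.
    if not answers.get("worked_with_sfec"):
        issues.append("HARD: Q1 (worked with SFEC) requires a response.")

    # Automated skip: Q2 applies only when Q1 is 'yes'.
    if answers.get("worked_with_sfec") == "no" and "services_received" in answers:
        issues.append("HARD: Q2 must be skipped when Q1 is 'no'.")

    # Soft check: flag an unusual but possible value for confirmation.
    if answers.get("hours_of_ta_received", 0) > 500:
        issues.append("SOFT: Q5 hours of TA looks high; please confirm.")

    return issues


# Example: this response violates the skip logic and would be flagged.
print(validate_response({"worked_with_sfec": "no", "services_received": ["training"]}))
```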

A.3.3 SFEC director interviews and SEA representative(s) interviews

The SFEC director interviews and the SEA representative interviews will be conducted at times most convenient for the respondents. The interview protocols are designed to reduce burden by building questions on the responses to the respective surveys, which respondents will complete before the interviews. This strategy uses the information already collected to probe deeper into specific topics of interest, allowing participants to add in-depth information, context, and explanation to their survey responses. The interviews will be conducted over a video-conferencing platform and recorded, provided the interviewees consent. Recording will ensure that notes are accurate and ready for analysis and will reduce the likelihood that the study team will need to follow up with respondents.

Interview Procedures. The SFEC director interviews and SEA respondent interviews will be conducted by a researcher as the lead interviewer, with an analyst or associate as the note taker. Ahead of the SFEC interviews, the interview team will review the survey data from the SFEC, information from the pre-administration SFEC funding questions, and the applicable SFEC APR, quarterly call notes, and grant application. This review, together with interviewer training, will provide the study team with insights into the SFEC's work with the SEA and its collaborations with districts, schools, community-based organizations, and parents/families. The study team will pre-populate the interview protocol with relevant information from the survey to reduce the burden on the respondent and to ensure interviewers focus on information that is not already available. The interviewer will also assess the completed paper survey for missing data, inconsistent responses, or other potential data errors, and follow up with the respondent to clarify and correct those during the telephone interview.

Pretesting the Interview. The study team will pretest the four instruments (SFEC Director Survey and Interview, and SEA Survey and Interview) with two SFEC directors and a primary respondent from one SEA (not necessarily an SEA partnered with a piloted SFEC, at Mathematica's discretion), including additional questions at the end to collect feedback about length, clarity, and relevance. Based on this feedback, the study team will adjust the instruments to ensure questions are clear, relevant, and non-duplicative, thereby reducing burden on respondents and improving data quality. Respondents who pilot the interview protocols will not be interviewed again during the study data collection period.

A.4 Efforts to Identify Duplication

Wherever possible, the study team plans to use administrative data to address the study research questions, including the grantee applications, annual performance reports, quarterly call notes provided to ED, and EDFacts data.

Information obtained from the surveys and interviews is highly specific to the SFEC grant program and is neither found in the administrative data used for this study nor available elsewhere. The study team will identify, survey, and interview a single respondent from each SFEC. This respondent will identify the person most knowledgeable about the SFEC's work at the SEA. If multiple contacts at the SEA are identified, the study team will request that a primary respondent be designated for the survey. Up to three SEA representatives would be interviewed if necessary. These methods will help minimize duplication in data collection activities.

No similar evaluations are being conducted, and there is no equivalent source for the information to be collected.

A.5 Efforts to Minimize Burden on Small Businesses

There are no small businesses among the respondents.

A.6 Consequences of Not Collecting the Information

This study and the data collection plan are key to informing the Department about both specific improvements to the SFEC program and key efforts to engage disadvantaged families during the pandemic recovery. In addition, the program may lead to the discovery of potentially promising practices in improving family engagement that could be rigorously tested for efficacy in a future evaluation. Without the data collection effort, the Department could not meet its obligation to provide information on the Congressionally authorized program (Title IV, Part E, of ESEA) to help improve families’ engagement in education.

A.7 Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

There are no special circumstances. All survey and interview respondents will be given more than 30 days to respond to the data collection requests.

A.8 Federal Register and Consultation Outside the Agency

A.8.1 Federal Register announcement

A 60-day notice to solicit public comments was published in the Federal Register, Volume 86, No. 225, page 67455, on November 26, 2021. During the notice and comment period, three comments were received, which are attached.

The 30-day notice will be published to solicit additional public comments.

A.8.2 Consultations outside the agency

In formulating the design for this evaluation, the study team sought input from a technical working group on June 14, 2021, comprising individuals with expertise in family–school partnerships, early literacy and family literacy, educational choice, family backgrounds and populations, and implementation research. This input will help ensure the study is of the highest quality and that findings are useful to federal policymakers working to refine the goals and objectives of the SFEC program. Table A3 lists the individuals who are serving on the technical working group, their affiliation, and their relevant expertise.


Table A3. Technical working group members, their affiliation, and relevant expertise

Name | Affiliation | Family–school partnershipsa | Early literacy and family literacy | Educational choice | Family backgrounds and populations | Implementation research
Meg Benner | The Learning Agency Lab | E | | | |
Vito Borello | National Association for Family, School, and Community Engagement | EC, E | X | | |
Judy Carson | Connecticut Department of Education | EC, E, S | X | | |
Kristin Duppong Hurley | University of Nebraska-Lincoln | E, S | | | Families of children with disabilities | X
Jane Groff | Former director of the Kansas Parent Information Resource Center (PIRC) | EC, E, S | X | | |
Nancy Hill | Harvard University | E, S, P | X | X | Black and Latino families |
Tracy Hill | Cleveland Metropolitan School District | E, S | X | X | |
Patricia Manz | Lehigh University | EC, E | X | | Black and Latino families, English learners |
a Indicates educational levels of partnerships: EC = early childhood; E = elementary; S = secondary; P = postsecondary.

A.9 Payments or Gifts to Respondents

No payments or gifts will be offered for this study.

A.10 Assurances of Confidentiality

Primary data collected for this study are at the grantee level and will not involve collecting or storing personally identifiable information. The Department makes no pledge about confidentiality of the data and no assurance of confidentiality is provided to respondents. Mathematica will conduct all data collection activities for this study in accordance with relevant regulations and requirements, which are:

  • The Privacy Act of 1974, P.L. 93-579 (5 U.S.C. 552a)

  • The Education Sciences Reform Act of 2002, Title I, Part E, Section 183

The research team will protect the confidentiality of all data collected for the study and will use it for research purposes only. All electronic data will be stored in Mathematica's secure restricted folders, to which only approved project team members have access. Hard copy survey data will be entered into the Viking data system; web survey data collected on this project will be stored in the QuestionPro platform and transferred to Mathematica's secure restricted folders for analysis. All hard copy surveys will be kept in secured locations and will be destroyed as soon as they are no longer required. The only PII available to the study team will be the contact names, emails, mailing addresses, and work telephone numbers of the selected study participants, which will be used for contacting and communicating with the participants during the data collection period. The surveys will not contain PII, other than the contact name for the SEA representative, which will be collected on the SFEC director survey.

All respondent materials, including contact emails, letters, and the data collection instruments, contain a notice of confidentiality. All members of the study team with access to the data will be trained and certified on the importance of confidentiality and data security. Mathematica routinely uses the following safeguards to maintain data confidentiality and will apply them consistently throughout this study:

  • All Mathematica employees must sign a confidentiality pledge that emphasizes the importance of confidentiality and describes employees' obligations to maintain it.

  • Personally identifiable information is maintained on separate forms and files, which are linked only by random, study-specific identification numbers.

  • Access to hard-copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Access to computer data files is protected by secure usernames and passwords, which are only available to specific users who have a need to access the data and who have the appropriate security clearances.

  • Sensitive data are encrypted and stored on removable storage devices that are kept physically secure when not in use.

Mathematica’s standard for maintaining confidentiality includes training staff on the meaning of confidentiality, particularly as it relates to handling requests for information, and providing assurance to respondents about the protection of their responses. It also includes built-in safeguards on status monitoring and receipt control systems. In addition, all study staff who have access to confidential data must obtain security clearance from the Department, which requires completing personnel security forms, providing fingerprints, and undergoing a background check.

During data analysis, all names are replaced with identification numbers. All study reports will present data in aggregate form only; no survey or interview participant will be identifiable from the data they provided. Any quotations used in public reporting will be edited to ensure the identity of the respondent cannot be ascertained.

A.11 Questions of a Sensitive Nature

No questions of a sensitive nature will be included in this study.

A.12 Estimate of Response Burden

Tables A4a and A4b provide an estimate of burden for the data collections included in the current request, broken down by instrument and respondent. These estimates are based on our prior experience collecting data from grantee directors and state offices and actual time recorded during pretesting of each instrument.

The total annual respondent burden for the data collection effort covered by this clearance request is approximately 29 hours (total burden of 72.25 hours divided by the 2.5-year study period). The following assumptions informed the primary data collection burden estimates (an illustrative cross-check of the arithmetic follows the list):

  • SFEC director survey

  • One respondent for each of the 12 grantees

  • The survey will take approximately 130 minutes. The cost to the SFEC director is based on a national average hourly wage of $77.49 per hour in 2020 for general and operations managers (Bureau of Labor Statistics 2021a).

  • SFEC director interview

  • One respondent for each of the 12 grantees

  • The interview will take approximately 1.5 hours. The cost to the SFEC director is based on an average hourly wage of $77.49 per hour in 2020 for general and operations managers (Bureau of Labor Statistics 2021a).

  • SEA representative survey

  • One primary respondent at the SEA for each of the 13 states

  • The survey will take approximately 30 minutes. The cost to the SEA is based on an average hourly wage of $45.11 per hour in 2020 for education administrators (Bureau of Labor Statistics 2021b).

  • SEA representative interview

  • One primary respondent with up to two additional respondents at the SEA for each of the 13 states

  • The interview will take approximately 45 minutes. The cost to the SEA is based on an average hourly wage of $45.11 per hour in 2020 for education administrators (Bureau of Labor Statistics 2021b).
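The totals in Tables A4a and A4b follow directly from these assumptions. The short Python sketch below is an illustrative cross-check only, not part of the information collection itself; all respondent counts, response times, and wage rates are the figures stated above.

```python
# Illustrative cross-check of the burden arithmetic in Tables A4a and A4b.
# All figures come from the assumptions above.

STUDY_YEARS = 2.5  # length of the clearance request period

# (expected respondents, minutes per response, 2020 average hourly wage)
collections = {
    "SFEC director survey":         (12, 130, 77.49),
    "SFEC director interview":      (12,  90, 77.49),
    "SEA representative survey":    (10,  30, 45.11),
    "SEA representative interview": (31,  45, 45.11),
}

total_hours = sum(n * minutes / 60 for n, minutes, _ in collections.values())
total_cost = sum(n * minutes / 60 * wage for n, minutes, wage in collections.values())

print(f"Total burden hours:  {total_hours:.2f}")                    # 72.25
print(f"Annual burden hours: {total_hours / STUDY_YEARS:.1f}")      # 28.9, reported as 29
print(f"Total respondent cost:  ${total_cost:,.2f}")                # $4,683.92
print(f"Annual respondent cost: ${total_cost / STUDY_YEARS:,.2f}")  # $1,873.57
```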




Table A4a. Estimate of respondent time burden by year

Information activity | Time per response (hours) | Maximum number of responses | Maximum expected number of respondents | Maximum total time burden (hours)
SFEC director survey (spring 2022) | 2.17 | 1 | 12 | 26
SFEC director interviews (spring 2022) | 1.5 | 1 | 12 | 18
SEA representative survey (spring 2022) | 0.5 | 1 | 10 | 5
SEA representative interviews (spring 2022) | 0.75 | 1 | 31 | 23.25
Total hours | | | 65 | 72.25


Table A4b. Estimate of respondent cost burden by year

Information activity | Sample size | Response rate (%) | Expected number of respondents | Expected number of responses | Average burden hours per response | Total annual burden hours | Average hourly wage | Total cost for responses (hourly wage × total burden hours)
SFEC director survey (May 2022) | 12 | 100 | 12 | 12 | 2.17 | 26 | $77.49a | $2,014.74
SFEC director interviews (June 2022) | 12 | 100 | 12 | 12 | 1.5 | 18 | $77.49a | $1,394.82
SEA representative survey (June 2022) | 13 | 80 | 10 | 10 | 0.5 | 5 | $45.11b | $225.55
SEA representative interviews (July 2022) | 39 | 80 | 31 | 31 | 0.75 | 23.25 | $45.11b | $1,048.81
Total across all years | | | 65 | 65 | | 72.25 | | $4,683.92
Average cost per year | | | | | | | | $1,873.57

a The cost to the SFEC director is based on the national average hourly wage of $77.49 in 2020 for General and Operations Managers (BLS 2021a). https://www.bls.gov/oes/current/oes111021.htm

b The cost to the SEA representative is based on the national average hourly wage of $45.11 in 2020 for education administrators (BLS 2021b).



A.13 Estimate of Total Capital and Startup Costs/Operation and Maintenance Costs to Respondents or Record-Keepers

There are no direct or start-up costs to respondents associated with the proposed primary data collection.

A.14 Estimates of Costs to the Federal Government

The total cost to the federal government for this study is $172,940 (Table A5). This includes the costs of designing and administering all collection instruments, processing and analyzing the data, and preparing reports. The average annual cost is $69,176 (the total cost divided by the 2.5 years of the study).


Table A5. Estimated total cost by category

Cost category | Estimated costs
Instrument development and OMB clearance | $34,913
Data collection | $85,993
Analysis/Reporting | $52,034
Total costs over the request period | $172,940
Annual costs | $69,176

A.15 Changes in Burden

This is a new collection.

A.16 Plans for Analysis, Publication, and Schedule

A.16.1 Analysis plans

The study team will use descriptive, comparative, and qualitative analyses to address the study's research questions. The main focus of each type of analysis is described below.

  • Descriptive analyses will be used to summarize information such as SFEC activities, services, topics, populations served, and challenges. The study will generate basic descriptive statistics (frequencies, percentages, means, minimums, and maximums), but because of the small number of cases in the study, the results will primarily report frequencies. For measures using continuous scales, the study team will calculate means and standard deviations to describe central tendency and variation. For categorical scales, the study team will use frequency distributions and percentages (a minimal illustration follows this list).

  • Comparative analysis will be used to create cross-tabulations to illustrate differences between groups (for example, the emphasis on direct services for grantees that were also prior PIRC grantees and those who were not).

  • Qualitative analyses will be used minimally, to provide common descriptions and explanations across multiple SFECs or SEAs and provide illustrative examples of how some SFECs implemented their grants. For qualitative data from responses to open-ended questions, the study will characterize whether responses include specific features, concepts, or types of information. A coding process will create a set of response categories, making it possible to describe information from open-ended items across grantees.
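For example, the short sketch below illustrates the kinds of descriptive tabulations described above. The item names and response values are hypothetical, not actual study data.

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical closed-ended item: each grantee's primary service emphasis.
emphasis = ["direct services", "technical assistance", "direct services",
            "both equally", "direct services", "technical assistance"]

# Frequencies and percentages for a categorical scale.
for category, n in Counter(emphasis).items():
    print(f"{category}: {n} ({n / len(emphasis):.0%})")

# Hypothetical continuous-scale item: percent of funds spent on direct services.
pct_direct = [40, 55, 60, 35, 50, 45]
print(f"mean = {mean(pct_direct):.1f}, sd = {stdev(pct_direct):.1f}, "
      f"min = {min(pct_direct)}, max = {max(pct_direct)}")
```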

The study team will summarize findings and present them in tables and exhibits in the final report. All text describing findings will be presented in plain language.

A.16.2 Time schedule and publication plans

This study contract period began on October 1, 2020, and will be completed on March 31, 2023. Data collection and reporting will be completed according to the following schedule (pending OMB approval):

  • May-June 2022: Conduct SFEC director survey and interviews

  • June-July 2022: Conduct SEA representative surveys and interviews

  • July 2022: Conduct analyses

  • July-October 2022: Prepare the draft report

  • October 2022: Conduct technical work group meeting

  • March 2023: Submit final report

A.17 Approval to Not Display Expiration Date

IES is not requesting a waiver for the display of the OMB approval number and expiration date. The instruments will display the OMB expiration date.

A.18 Exceptions to the Certification Statement

No exceptions to the certification statement are requested or required.





1 Objective 1.1. in the Department’s 2018–22 Strategic Plan is to “Increase high-quality educational options and empower students and parents to choose an education that meets their needs” (see https://www2.ed.gov/about/reports/strat/plan2018-22/strategic-plan.pdf).

2 https://www2.ed.gov/documents/essa-act-of-1965.pdf
