
OMB: 1875-0269


Strategies for Preparing At-Risk Youth for Postsecondary Success:


OMB Information Clearance Request – Supporting Statement Part B


November 14, 2012










Submitted to:


Michael Fong

Policy and Program Studies Service

Office of Planning, Evaluation and Policy Development

U.S. Department of Education

400 Maryland Avenue, SW

Washington, DC 20202



Prepared by:

Viki Young, Kaeli Keating, Kyle Goss, and Nyema Mitchell

SRI International


Contract Number GS-10F-0554N; Order Number ED-PEP-11-O-0090, Task Order 6

SRI Project P20791






Contents

Exhibits

Exhibit 1. Information to be Gathered or Confirmed in Screening Interview

Exhibit 2. Expected Number of Respondents in Initial Phone Screening Calls

Exhibit 3. Expected Number of Respondents in Each Case Study, by Role

Exhibit 4. Potential Sources for Gathering Background Information

Exhibit 5. Question Topics, by Respondent Type

Exhibit 6. SRI Researchers Responsible for Qualitative Data Analysis





Supporting Statement, Part B
Paperwork Reduction Act Submission

B. Collection of Information Employing Statistical Methods

B.1. Sampling Design

This study aims to describe practices and strategies for which there is some evidence suggesting they may be succeeding in preventing at-risk youth from dropping out and in preparing them for postsecondary education or training. The case study sample will include programs implementing such strategies and practices. A purposive sample of up to 15 cases, as determined by the scope of work, is well suited to exploring the strategies practitioners are implementing to meet the relatively new policy goal of preparing students at risk of dropping out for postsecondary education. The sample will be drawn from a list developed through the literature review and conversations with national experts.

Sampling Criteria

The sampled programs will meet three basic criteria to ensure relevance to practitioners and policymakers:

  1. Involve local public schools or public school districts. The local school or district can be in partnership with colleges, local businesses, and/or community-based organizations, but the local school or district must have substantive involvement in the program. Such involvement can include fiscal arrangements, joint decision-making/stewardship, or direct program implementation.

  2. Target the specific student subpopulations of concern. Students served by the program must be identified as at risk of dropping out, as having already dropped out, or as at risk of not continuing to postsecondary education, and they must be under age 21.

  3. Collect data on program implementation, intermediate outcomes, or final outcomes that suggest that the strategies or practices are likely to improve the target population’s high school and postsecondary outcomes.

Furthermore, the literature review findings had clear implications for defining the categories of programs and strategies that should be represented in the sample:

  • Relatively few programs in the literature both target students clearly at risk of dropping out and explicitly prepare them for postsecondary education. Because research usually lags policy imperatives and new practices, the types of programs the Department is interested in are relatively new, not yet well represented in the research literature, and likely to have only preliminary or limited data pointing to their effectiveness. For this reason, the site selection process will include a post-OMB-clearance screening (discussed below).

  • The continuum from dropout prevention to college enrollment entails multiple outcomes. Traditional dropout prevention programs and college-readiness programs that target students at risk of not continuing to postsecondary education tend to address different outcomes along that continuum but may nonetheless be useful to include in the case study sample. Dropout prevention programs can shed light on strategies that engage disaffected students in school and help them achieve a high school diploma. College readiness programs can provide insights on how to support students who do not necessarily see themselves as college-goers in successfully completing a college-ready curriculum, gaining “college knowledge” (Conley 2008), and applying and transitioning to postsecondary education.

  • The literature on dropout recovery programs—those attempting to reengage youth who have already dropped out—rarely describes programs with school district participation. Again, dropout recovery programs in partnership with local school districts are relatively new and pragmatically constitute a different program category for sampling purposes.

These findings underpin distinct program categories relevant to defining the case study sample. In addition to meeting the initial three criteria, nominated programs must specifically represent one of the following types of programs to be included in the sample (in order of selection priority).

  1. Targets students at risk of dropping out and prepares them for postsecondary education or training.

  2. Targets students at risk of dropping out and increases their chances of graduating from high school.

  3. Targets students at risk of not going to college and prepares them for postsecondary education or training.

  4. Targets youth who have dropped out and reengages them in high school and/or community college.

In addition to these four program types, strategies examined in the literature are characterized by whether they serve specific, identified individuals (targeted) or operate more broadly at the school level (schoolwide or comprehensive). For example, targeted case management models typically use a needs assessment process to determine the different services each student in the program requires, whereas a schoolwide strategy might be reorganizing a school to increase the level of personalization students experience.

Because schools and districts often implement schoolwide strategies as part of whole-school reform and do not necessarily use those strategies specifically to prevent dropping out, the goals and target population of each program will determine its salience to this study. To the extent that strategies on the ground can reasonably be categorized as targeted or schoolwide, and that schoolwide strategies can be further classified as focusing on dropout prevention, the case study sample should represent both types of strategies. Some researchers have argued that, to be effective, programs need to offer both targeted and schoolwide strategies to meet diverse student needs (Balfanz, Herzog, and Mac Iver 2007), so specific cases may include both.

Finally, because programs serving at-risk youth often include partnerships with institutions of higher education, businesses, and/or community-based organizations to meet a wider range of needs, the final case study sample will include some partnership-based programs.

Sample Selection Process

Sample selection for this project will occur in two steps. The first step entails building a shortlist of potential case study sites. The research team has been identifying potential sites cited in the literature, meeting with relevant Department staff, and talking with seven national experts about potential sites. These sources will yield a shortlist of sites.

The second step will be a post-OMB screening interview conducted with each site on the shortlist. The screening interview with the program manager will confirm that the case meets the sampling criteria listed in the previous section and will gather contextual information that will help prioritize the shortlist in the event that more than 15 sites meet the sampling criteria. Contextual information will include program longevity, which will indicate sustainability and program maturity; geography, which will indicate certain state policy factors (e.g., accountability for dropouts, availability of K–16 data); student recruitment practices and admission policies, which will indicate whether self-selection into the program (e.g., through an application process) needs to be considered in interpreting outcomes data; and the nature and extent of public school or district involvement, with a higher priority on more involvement. The screening interviews will also gather or confirm basic data on the programs, including the grades served and the number of students enrolled in the most recent year.

Exhibit 1 shows the data fields that will be populated during the calls; an illustrative sketch of how these fields might be stored as a structured record follows the exhibit. As many cells as possible will be populated before the calls using publicly available sources, but because many of the programs and schools are organized in nontraditional ways, the screening interview will be essential to ensuring consistent data across sites. These screening interviews will take place immediately after OMB approval.

Exhibit 1. Information to be Gathered or Confirmed in Screening Interview

  • Program strategies (e.g., adult advocate, academic enrichment, behavior intervention, increased personalization)

  • Program goals

  • Number of students enrolled in 2011-2012

  • Grades served

  • Student recruitment/admissions policies

  • Availability and quality of data on outcomes

  • Short summary of recent outcomes

  • Nature of partnership with a district

  • Program longevity

  • Targeted student population

  • Size of budget

  • Funding source
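Although the format of the screening database is left to the research team, the Exhibit 1 fields amount to one fixed record per site. The following is a minimal illustrative sketch of such a record in Python; all field names and types are hypothetical and are not part of the approved instrument.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ScreeningRecord:
    """One hypothetical screening-database row; fields mirror Exhibit 1."""
    site_name: str
    program_strategies: list[str] = field(default_factory=list)  # e.g., adult advocate
    program_goals: list[str] = field(default_factory=list)
    students_enrolled_2011_12: Optional[int] = None
    grades_served: list[int] = field(default_factory=list)
    recruitment_admissions_policy: Optional[str] = None
    outcomes_data_availability: Optional[str] = None  # availability/quality of outcomes data
    recent_outcomes_summary: Optional[str] = None
    district_partnership: Optional[str] = None        # nature of partnership with a district
    years_in_operation: Optional[int] = None          # program longevity
    target_population: Optional[str] = None
    budget_size_usd: Optional[float] = None
    funding_source: Optional[str] = None
```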

The final site selection will occur in consultation between the Department and SRI. First, all shortlisted sites that do not meet the selection criteria based on the screening interview will be eliminated from consideration. Of the remaining sites, the research team will prioritize the first program type listed in the sampling criteria (targets students at risk of dropping out and prepares them for postsecondary education). If an insufficient number of cases meeting the sampling criteria falls into this program type, the team will then examine the second (targets students at risk of dropping out and retains them in high school) and third (targets students at risk of not going to college and prepares them for postsecondary education) program types to round out the sample. Because the Department has a particular interest in understanding how school districts undertake dropout recovery and because the literature review indicates that few school districts have dropout recovery programs for out-of-school youth, a small number of visits (approximately two to three) will be reserved for any such programs that meet the sampling criteria. Department staff will review the list of sites meeting the selection criteria and make a final determination based on their knowledge of the field.
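The prioritization just described is essentially a filter-and-rank procedure over the screening database. The sketch below illustrates one way that logic could be expressed, reusing the hypothetical record above with two assumed additional fields (meets_criteria and program_type, numbered 1–4 as in the sampling criteria); it is illustrative only, since the final determination rests with Department staff.

```python
def prioritize_sites(sites, max_cases=15, recovery_slots=3):
    """Illustrative filter-and-rank pass over screened sites.

    Assumes each site record carries two hypothetical fields:
      meets_criteria -- True if the screening interview confirmed the sampling criteria
      program_type   -- 1: at risk + postsecondary prep; 2: at risk + HS graduation;
                        3: not college-bound + postsecondary prep; 4: dropout recovery
    """
    eligible = [s for s in sites if s.meets_criteria]

    # Reserve approximately two to three visits for district dropout recovery programs.
    recovery = [s for s in eligible if s.program_type == 4][:recovery_slots]

    # Fill the remaining slots in priority order: type 1 first, then types 2 and 3.
    ranked = sorted((s for s in eligible if s.program_type != 4),
                    key=lambda s: s.program_type)
    return recovery + ranked[: max_cases - len(recovery)]
```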

The universe of programs aimed at both preventing dropout among at-risk students and preparing them for postsecondary success is unknown; to our knowledge, no census of such programs exists. However, 58% of public school districts reported that “some” and 3% reported that “most” at-risk students in their district took “dual enrollment courses with a career/technical focus,” and 34% reported that “some” and 1% reported that “most” at-risk students took “dual enrollment courses with an academic focus” (Carver and Lewis 2011, p. 10). These data on at-risk students’ participation in academic dual enrollment courses—one common potential strategy for preparing at-risk students for postsecondary success—suggest that only a small proportion of public high schools are engaging the majority of their at-risk students in preparation for postsecondary education.

B.2. Procedures for Collection of Information

As described previously, this request relates to two different data collection activities. The first is an initial round of screening calls with program managers to determine the final sample of case study sites. The second is the actual site visits to conduct the case studies.

Initial Phone Screen

Once the data collection is fully approved by OMB, the research team will conduct screening calls with all potential case study sites. These calls will last no more than 45 minutes and will be conducted with the program manager at each potential site. The relevant contact person could hold one of many positions in a school or district; for example, he or she may be a district administrator responsible for dropout prevention programs, a principal at an alternative school in the district, a school-based social worker, or a guidance counselor at a school that runs a program. The interviews will be conducted using the protocol included in Attachment 6 and will collect information on the topics listed in Exhibit 1.

In cases where the nominated “site” is actually a large program operating at multiple locations, an initial screen will be conducted with a national program representative to identify a potential site to visit. In those cases, an additional phone screen will be conducted with the site-based program manager.

Exhibit 2 shows the time burden expected from the phone screen portion of the data collection.

Exhibit 2. Expected Number of Respondents in Initial Phone Screening Calls

Role | Number of Respondents | Time per Person | Total Time Burden
Program Manager | 50 | 45 minutes | 37.5 hours
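As an arithmetic check, the Exhibit 2 total follows directly from the per-respondent estimate:

$$50 \text{ respondents} \times \frac{45 \text{ minutes}}{60 \text{ minutes per hour}} = 37.5 \text{ hours}$$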

Site Visits

Once up to 15 sites have been selected based on the sampling plan, the selected sites will receive a letter from the Department (Attachment 5) explaining the purpose of the study. Following that contact, the research team will schedule each site visit for dates within the data collection period that are convenient for the program. As discussed above, the sample will include a diverse group of programs, requiring a tailored design for each visit, but all visits will share basic characteristics. Exhibit 3 shows the variety of stakeholder groups that will be included in each site visit.

Exhibit 3. Expected Number of Respondents in Each Case Study, by Role

Role | Number of Respondents | Time per Person | Total Time Burden
Program Manager | 1 | 5 hours (3.75 hours preparation, 1.25 hours interview) | 5 hours
Principal | 1 | 60 minutes | 1 hour
Assistant Principal or Other School-Level Administrator | 1 | 60 minutes | 1 hour
Teachers/Program Staff | 6 | 60 minutes | 6 hours
Counselors/Social Workers | 2 | 60 minutes | 2 hours
District Administrators | 3 | 60 minutes | 3 hours
Partner Organization Staff | 4 | 60 minutes | 4 hours
Parent Focus Groups | 5 (1 group of 5) | 30 minutes | 2.5 hours
TOTAL | 23 | - | 24.5 hours
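The Exhibit 3 totals can be verified by summing the rows:

$$1 + 1 + 1 + 6 + 2 + 3 + 4 + 5 = 23 \text{ respondents}$$

$$5 + 1 + 1 + 6 + 2 + 3 + 4 + 2.5 = 24.5 \text{ hours}$$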

The program manager will be asked to help schedule the visit, including selecting the best respondents in each role type. We will provide the program manager with guidelines about whom to select, for example, a cross-section of teachers serving in the program if they span grade levels and/or content areas. Where possible, the manager will also schedule interviews with partner organizations in sites involving such partnerships. The program manager’s time to help plan and coordinate the site visit is included in the preparation time in Exhibit 3.

A two-person research team will conduct each site visit. Prior to the visits, researchers will participate in a training session and will collect and review background information on their respective sites. To reduce burden on the sites, site visitors will gather background information from publicly available sources wherever possible. In addition to the database generated through the initial screening interview, background data will be gathered from the sources listed in Exhibit 4. Applicable background data that are not publicly available will be requested from program managers prior to the visit.

Exhibit 4. Potential Sources for Gathering Background Information

  • School report cards for the last five years, including disaggregated data on:

    • AYP status

    • Enrollment

    • Attendance rates

    • Behavior issues

    • Dropout rates

    • Graduation rates

  • School improvement plan

  • School or district policies on graduation requirements

  • District strategic plan

  • Most recent annual report or report from an evaluator

  • List of project partners

  • Course catalog or curriculum outlines

  • Documents outlining program recruitment policies (e.g., program applications, brochures)

Due to the likely variation in data quality across programs, districts, and states, all data will be confirmed while on site. At the same time, such thorough preparation will ensure that each site visit team can use its time on site efficiently, interpret respondents’ answers appropriately, and probe more deeply on topics for which the case may be particularly instructive.

The goal of all the interviews is to gather enough evidence for the researchers to effectively document the nature of the practices and strategies that the program uses to serve at-risk youth, the outcome data collected by the program, the contexts within which the program operates, program successes, and challenges faced by the intervention and potential solutions to these problems. Given the embedded nature of dropout prevention programs, the contextual information will be critical in allowing the practitioner audience of this study to understand the complexities of operating such programs.

Furthermore, as shown in Exhibit 5, the case study protocols ask multiple stakeholders about a common core of topics that reflect the conceptual framework described in the introduction to Supporting Statement A (e.g., target student identification; descriptions of dropout prevention, college readiness, and college knowledge supports and strategies; data use in identifying and monitoring students and evaluating program effectiveness; partnerships; implementation successes and challenges; and data on outcomes of interest).1 Different stakeholders will have different perspectives grounded in their respective roles, responsibilities, goals, and experiences. Asking them about a common core of topics enables the study team to triangulate across the information they provide to identify consistencies, inconsistencies, and explanations for the implementation successes, challenges, and outcomes that respondents discuss.

Exhibit 5. Question Topics, by Respondent Type

Constructs | District Administrator | Principal/Asst. Prin. | Program Manager* | Guidance Counselor/Social Worker | Teacher/Program Staff | Parent Focus Group | Partner Organization Staff (IHE, CBO, business, social services)

Program

Program goals (re dropout prevention, college readiness, and/or dropout recovery) |  | X | X | X | X |  | X
Target population as defined and served by program | Vis-à-vis district priorities | X | X | X | X |  |
Strategies and practices to prevent dropouts |  | X | X | Specific to role | Specific to role | X | X
Challenges in implementing dropout prevention strategies |  | X | X | Specific to role | Specific to role |  | X
Strategies and practices to promote college readiness | Vis-à-vis district priorities | X | X | Specific to role | Specific to role | X | X
Challenges in implementing college readiness strategies | Vis-à-vis district priorities | Specific to role | X | Specific to role | Specific to role |  | X
Strategies and practices to recover out-of-school youth |  | X | X | Specific to role | Specific to role | X | X
Challenges in implementing dropout recovery strategies |  | Specific to role | X | Specific to role | Specific to role |  | X
Types of data used for identification and needs assessment | X | Specific to role | X | X | X |  |
Supports for/challenges to using data for program implementation | X | Specific to role | X | X | X |  |
Capacity and sustainability (financial, human capital, other) | Financial | X | X | Specific to role | Professional development |  | X

Outcomes

Perceived outcomes and potential improvements resulting from programs | X | X | X | X | X | X | X
Use of data to track outcomes | X | X | X | X | X |  | X

Policy Contexts

District and state policy as barriers/facilitators, needed changes | X | X | X |  |  |  | X
State data system policy and provisions | X | X |  |  |  |  |

Partnerships

Partner role and responsibilities | X | X | X | If applicable |  |  | X
Benefits and challenges of partnership | X | X | X | If applicable |  |  | X

*If no Program Manager, questions are divided between the Principal/Assistant Principal and the Guidance Counselor.





Statistical Methodology

This study involves collection of qualitative data. Statistical methods are not applicable to this study.

Analysis Methods

The research team will follow an iterative approach to analyzing the case study data—one that begins before each site visit, continues while on site, and proceeds through the drafting of internal case study reports to cross-site analysis. Before they conduct the site visits, researchers will review relevant documents to better understand the local context and to tailor appropriate follow-up questions during interviews. During the visits, researchers will discuss the consistency of answers across respondents for the common core topics and, if necessary, fill in any gaps with subsequent interviews. Researchers also will discuss emerging themes that they may not have anticipated. These emerging themes may lead the researchers to probe more specifically in subsequent interviews to ensure that their interpretations are accurate. Engaging in this analytic process while on site will help refine data collection to capture the most important features of local dropout prevention and college-readiness strategies.

Once each visit is completed, researchers will draft their case study reports. They will systematically compare data from respondents and documents on each common core topic on the protocols to distill a comprehensive description of program details, evidence on outcomes, and relevant contexts. During this process, researchers will refine case-specific analytic themes to share at project-wide debriefings after all case studies are completed.

At the project-wide debriefing meetings, the team will discuss emerging themes from each site, compare the salience of those themes across the sites, and chart evidence that either confirms or disconfirms each theme. Because the case study sample will likely include programs across the four program types described in the sampling criteria, the team will analyze the findings by program type. The goal of the analysis will be to compare, contrast, and synthesize findings and propositions from the single cases to arrive at initial lessons that apply to each program type and possibly across all programs. While it is possible that common lessons may emerge from multiple cases, the lessons will not be generalizable beyond those sites because the sample is purposive and not intended to be representative of all programs that serve students at risk of dropping out.

Degree of Accuracy Needed

The research team will do everything possible to maximize the accuracy of the data collected for each of the case studies. All interviews (subject to the permission of the respondent) will be recorded to improve the accuracy of reporting. Furthermore, site visitors will attend detailed training and will review background information prior to planning their visit to ensure efficient, consistent, and accurate data collection. Finally, the program manager will have the opportunity to review a draft case study profile for factual accuracy.


Use of Periodic Data Collection

The team will visit each site only once.

B.3. Methods for Maximizing Response Rate and Dealing with Nonresponse

Response Rate

Because the literature review and nominations by national experts suggest that a relatively small number of programs explicitly aim to move students at risk of dropping out toward postsecondary education or training, it is essential that a high percentage of nominated programs agree to participate in the study.

SRI has extensive experience in gaining access to schools and districts for research purposes. Prior to site visits, the Department will provide each selected site with a letter describing the study (Attachment 5) and its importance for the field. This letter will also include the purpose of the case studies, information on the major topics addressed in the interviews, and how to learn more about the study. Additional key access strategies that the research team will use include having one researcher be the primary contact; using multiple methods (phone, email, mail if necessary) to communicate with the program manager; providing ample opportunities for the program manager to ask questions about the study; building in flexibility in working with multiple coordinators for scheduling if necessary; selecting mutually convenient dates; and providing easy-to-use tools such as scheduling templates to minimize the burden on the site.

To ensure that each relevant respondent group is represented in each case study, the research team will conduct interviews by phone at a later date in any case where respondents are unable to schedule a meeting during the site visit or become unavailable on short notice. Because the research team will work closely with the program manager to select respondents based on their role and will be flexible in scheduling the time and location of the interviews, a 100 percent response rate is anticipated.

Generalizability of the Sample

The research design for this project relies on a purposive sample intended to capture descriptive information on programs with evidence suggesting they may be having success in preventing at-risk youth from dropping out and preparing them for postsecondary education or training. As such, the findings will not be generalizable to any group of schools or districts. However, the study aims to build basic descriptive knowledge of programs and strategies at the nexus of two important policy areas—dropout prevention and attainment of postsecondary education or training—that is currently missing in the literature and that will be useful to practitioners and policymakers grappling with the urgent problem of reducing dropouts.

B.4. Test of Procedures and Methods

The research team has conducted internal pretesting of protocol items to ensure clarity. Pretesting also verified that all protocols align with the constructs detailed in the conceptual framework (described in the introduction of Supporting Statement A) and tie back to the research questions, so that the protocols will capture all the information needed. The research team also constructed a matrix (Exhibit 5) to ensure that the protocol for each respondent type addresses all constructs relevant to that role.


Many of the protocol questions have been adapted from relevant questions used in other SRI studies. For example, questions related to providing student supports have their roots in questions developed and used as part of the recently completed four-year Texas High School Project evaluation. Several related studies discovered during the literature review also used relevant questions and constructs that contributed to the development of the protocols.

B.5. Consultations on Statistical Aspects of the Design

The nature of the study did not require consultation on statistical design issues.

Members of the research team who will be responsible for data collection and analysis are listed in Exhibit 6. These staff will also be responsible for the qualitative data analysis.

Exhibit 6. SRI Researchers Responsible for Qualitative Data Analysis

Name | E-mail Address | Phone Number
Viki Young, Project Leader | [email protected] | (650) 859-2751
Kaeli Keating, Deputy Project Leader | [email protected] | (703) 247-8554
Samantha Astudillo | [email protected] | (650) 859-4526
Kyle Goss | [email protected] | (703) 247-8547
Ann House | [email protected] | (650) 859-2426
Marianna Lyulchenko | [email protected] | (703) 247-8580
Nyema Mitchell | [email protected] | (703) 247-8606
Chris Padilla | [email protected] | (650) 859-3908
CJ Park | [email protected] | (703) 247-8522
Victoria Tse | [email protected] | (650) 859-5478

References

Alliance for Excellent Education. 2010. High school dropouts in America. Washington, DC: Alliance for Excellent Education.

American Institutes for Research (AIR)/SRI International (SRI). 2009. Six years and counting: The ECHSI matures. Washington, DC: AIR.

Amos, Jason. 2008. Dropouts, diplomas, and dollars: U.S. high schools and the nation’s economy. Washington, DC: Alliance for Excellent Education.

Bailey, Thomas, and Melinda Mechur Karp. 2003. Promoting college access and success: A review of credit-based transition programs. Washington, DC: U.S. Department of Education, Office of Vocational and Adult Education.

Balfanz, Robert, Liza Herzog, and Douglas J. Mac Iver. 2007. Preventing student disengagement and keeping students on the graduation path in urban middle-grades schools: Early identification and effective interventions. Educational Psychologist 42 (4): 223–235.

Balfanz, Robert, and Nettie Legters. 2004. Locating the dropout crisis: Which high schools produce the nation’s dropouts? Where are they located? Who attends them? Baltimore, MD: Center for Research on the Education of Students Placed at Risk at Johns Hopkins University.

Buschmann, Rob, and Joshua Haimson. 2008. Bring them back, move them forward: Case studies of programs preparing out-of-school youths for further education and careers. Washington, DC: Mathematica Policy Research, Inc.

Carnevale, Anthony P., and Stephen J. Rose. 2011. The undereducated American. Washington, DC: Center on Education and the Workforce, Georgetown University.

Carnevale, Anthony P., Nicole Smith, and Jeff Strohl. 2010. Help wanted: Projections of jobs and education requirements through 2018. Washington, DC: Center on Education and the Workforce, Georgetown University.

Carver, Priscilla Rouse, and Laurie Lewis. 2011. Dropout prevention services and programs in public school districts: 2010–11 (NCES 2011-037). U.S. Department of Education, National Center for Education Statistics. Washington, DC: Government Printing Office. Available at http://nces.ed.gov/pubs2011/2011037.pdf

Conley, David T. 2008. College knowledge: What it really takes for students to succeed and what we can do to get them ready. San Francisco, CA: Jossey-Bass.

Dynarski, Mark, Linda Clarke, Brian Cobb, Jeremy Finn, Russell Rumberger, and Jay Smink. 2008. Dropout prevention: A practice guide (NCEE 2008–4025). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Mac Iver, Martha Abele. 2009. Beginning with the end in mind: The school district office leadership role in closing the graduation gap for at-risk students. Educational Considerations 38 (1): 8–16.

Martin, Nancy, and Samuel Halperin. 2006. Whatever it takes: How twelve communities are reconnecting out-of-school youth. Washington, DC: American Youth Policy Forum.

Obama, Barack. 2009. Remarks by the President to the Hispanic Chamber of Commerce on a complete and competitive American education. Washington, DC: White House, Office of the Press Secretary.

Spillane, James P. 1996. School districts matter: Local educational authorities and state instructional policy. Educational Policy 10 (1): 63–87.

Spillane, James P. 1998. State policy and the non-monolithic nature of the local school district: Organizational and professional considerations. American Educational Research Journal 35 (1): 33–63.

Steinberg, Adria, and Cheryl Almeida. 2011. Pathway to recovery: Implementing a back on track college model. Boston, MA: Jobs for the Future.

Swanson, Christopher B. 2004. Who graduates? Who doesn’t? A statistical portrait of public high school graduation, class of 2001. Washington, DC: The Urban Institute.

Swanson, Christopher B. 2011. “Nation turns a corner: Strong signs of improvement on graduation.” Diplomas Count. Education Week, June 9.

Tierney, William G., Thomas Bailey, Jill Constantine, Neal Finkelstein, and Nicole Farmer Hurd. 2009. Helping students navigate the path to college: What high schools can do: A practice guide (NCEE #2009-4066). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

1 The initial phone screen protocol is not represented in Exhibit 5 because any information collected during the call will be reconfirmed and further explored during the site visits. Attachment 14 details how the common core of topics, or constructs, corresponds with the research questions.

