
Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes






The Study of Disability Services Coordinators and Inclusion in Head Start, 2019-2024


OMB Information Collection Request

OMB #: 0970-0585

New Data Collection





Supporting Statement

Part B



June 2021








Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Laura Hoard (ACF) and Wendy DeCourcey (ACF)



Part B


B1. Objectives

Study Objectives

The Study of Disability Services Coordinators and Inclusion in Head Start will provide the Office of Planning, Research, and Evaluation (OPRE) and the Office of Head Start (OHS) in the Administration for Children and Families (ACF) with a national picture of the Disability Services Coordinator (DSC) workforce and of the services provided to children with disabilities and their families within Early Head Start (EHS) and Head Start (HS) programs.

The study objectives are to:

1: Describe the characteristics and work of DSCs and related staff in Early Head Start (EHS)/Head Start (HS) programs.

2: Identify how EHS/HS serves children with disabilities and their families, including recruitment and selection; screening and ongoing assessment; evaluation; and the Individualized Family Service Plan (IFSP) and Individualized Education Program (IEP) process and implementation.

3: Identify how EHS/HS programs engage in capacity building with families and provide supportive services to families as they understand and advocate for their children with potential or identified disabilities, delays, or other issues, such as early childhood mental health concerns or chronic health impairments.

4: Identify what EHS/HS programs do when services are not available and/or when children do not meet Individuals with Disabilities Education Act (IDEA) eligibility requirements.

5: Identify how EHS/HS programs engage with services in the community, including Local Education Agencies (LEAs), IDEA Parts B and C providers, early intervention services, mental health providers, and community programs.

6: Identify the training teachers receive as well as how they individualize practice and work to fully integrate children with disabilities into the classroom.

7: Identify how EHS/HS programs work with children with disabilities and their families on transitions to HS or kindergarten.


The study will support ACF in better understanding the implementation of EHS/HS policies and practices for delivering disability services. The study will report on inclusive practices, staffing, professional development, and collaboration with local education agencies (LEAs), early intervention (EI) programs, health providers, and other community stakeholders who serve young children at risk for or with disabilities and their families. ACF's goals for the study are consistent with its commitment to enroll and provide high-quality, inclusive education to children with the greatest needs, including the legislative requirement that programs use at least 10 percent of their funded slots to serve children with disabilities.


Generalizability of Results

This study is intended to produce a national picture of the characteristics and practices of the DSC workforce in EHS/HS programs, including American Indian/Alaska Native (AIAN; Region XI) and Migrant and Seasonal Head Start (MSHS; Region XII) grantees, during the data collection period.


Appropriateness of Study Design and Methods for Planned Uses

As described in SSA Section A.2 Study Design, the study comprises three phases of data collection: a survey of all EHS/HS program directors1 (PDs; Phase 1); a survey of all EHS/HS DSCs (Phase 2); and interviews (Phase 3) with a subset of EHS/HS DSCs who completed the Phase 2 survey. These descriptive surveys and interviews will provide the information needed to inform Head Start on the objectives above. Phase 1 surveys will include questions for PDs regarding contact information for the grantees' DSCs and program-level information that will place the DSC information (Phase 2) in context. There is no existing list of all EHS/HS DSCs, so Phase 1 must include all PDs as respondents to establish a contact list for Phase 2. DSCs invited to participate in Phase 3 interviews will be from programs that differ in terms of key factors such as program type and program size. The purpose of the Phase 3 interviews is to better understand particular practices, experiences, and approaches observed in the Phase 2 surveys.


To meet the goals listed in SSA Section A.2, the research team will identify the universe of DSCs by first contacting EHS/HS PDs using information available in the HS Program Information Report (PIR). The research team will then ask PDs to provide contact information for all DSCs in their program, as well as contact information for other staff who fulfill at least some of the functions of a DSC but have a different title. All of the EHS/HS DSCs and other such staff identified by PDs in Phase 1 will be invited to participate in the Phase 2 survey. This two-phase survey approach, though necessary in this context, may limit the representativeness of the findings regarding EHS/HS DSCs. Although data will be weighted to account for potential PD nonresponse in Phase 1 and potential DSC nonresponse in Phase 2, it is possible that unaccounted-for characteristics associated with both nonresponse and DSC characteristics and practices may bias results. We will make the data's potential limitations explicit in all products (e.g., briefs) that synthesize study findings. The Phase 3 interviews will be conducted with a subset of Phase 2 survey respondents, purposively selected so that participants work in EHS/HS programs that differ in terms of community urbanicity and program size.


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.








B2. Methods and Design

Target Population

This study is intended to produce a national picture of the characteristics and work of the EHS/HS DSC workforce. Our target population is EHS/HS PDs and DSCs.

Based on guidance from a Tribal Research workgroup (consisting of academic Tribal researchers and federal staff with extensive Tribal program knowledge), the study team has developed an initial outreach approach for AIAN programs that are led by Tribal communities. Prior to data collection at Tribal AIAN Region XI programs, contact will be made with Tribal leadership to ascertain if there are research review requirements for program participation. If Tribal review requirements are identified, the study team will work to address those requirements prior to data collection at those programs. (See Attachment C).

If a Head Start or Early Head Start program is under the purview of a local school district, the district superintendent will be contacted to check on district research review requirements prior to data collection. The study team will work to address any requirements prior to data collection. (See Attachment C).

We will identify and contact PDs using data from the Head Start Enterprise System (HSES) and the Program Information Report (PIR). According to PIR data from the 2018-2019 academic year, there are approximately 1,600 EHS/HS grantees and delegates. We cannot currently estimate the number of DSCs; however, the Head Start Program Performance Standards require at least one DSC per grantee.

Sampling

As no contact list is available for DSCs, we will collect data from all EHS/HS program directors to obtain contact information for DSCs and program-level information relevant to the provision of services to children with disabilities and their families. The responses from the EHS/HS program directors will serve as the sampling frame for the DSC survey.

At this time, it is not possible to determine which attributes of EHS/HS programs (e.g., size, location, demographics of the children and families served, staffing model, DSC background, relationships with community providers) contribute to differences in DSC roles, implementation, and processes. Given this limited information about potential sampling factors, we will invite all identified DSCs to respond to the survey.


Respondent Recruitment for Phase 3

In the Phase 2 survey, we will ask DSCs if they would be interested in participating in an interview (Phase 3). Thirty-five (35) respondents will be purposively chosen from those who agreed to participate and whose programs meet certain characteristic criteria (e.g., large program, small program, EHS only, HS only, EHS/HS combination, Region XI and XII). Each respondent will meet at least one of these criteria.


B3. Design of Data Collection Instruments

Development of Data Collection Instrument(s)

This is the first national study of HS DSCs. To develop the survey, we engaged research, program, and training and technical assistance (T/TA) stakeholders, including representatives of the Head Start Region XI (American Indian and Alaska Native [AIAN]) Stakeholder Workgroup and other people familiar with HS policies and programs. We also reviewed existing data sources, surveys, T/TA materials2, and practice tools, and worked with federal staff to identify research questions aligned with constructs and study goals. Stakeholders provided feedback on study constructs, data collection methods, and data collection instruments.



We reviewed 18 quantitative data sources (e.g., the PIR, the Head Start Health Managers Study, the Head Start Family and Child Experiences Survey, the EHS Baby Family and Child Experiences Survey, the AIAN Head Start Family and Child Experiences Survey, and the Head Start Impact Study) to identify areas where constructs of interest could be measured with previously fielded items. Next, the research team identified existing items from self-assessments and checklists developed for professional development and adapted those items for the survey. We developed original items where no existing items were available. The team then shared an initial pool of items with stakeholders, who provided feedback on relevance to the identified constructs and completeness of response options. Cognitive interviews were conducted with members of the target population to identify questions that were difficult to understand (e.g., too complex, poorly phrased, using uncommon terms) and questions that were difficult to answer (e.g., difficulty with recall, inability to look up needed data, incomplete list of response options). The final survey was mapped onto the objectives described in Section B1 (see Attachment A for a table showing this mapping). A few sources of measurement error were identified through cognitive testing, and we revised the instrument to strengthen the clarity and interpretability of both the questions and the response options.



For the interview development, we conferred with the Office of Head Start and stakeholders to identify key areas in the survey that could be made more valuable if fleshed out with in-depth information from the interviews. We conducted cognitive testing of the interview items with people familiar with EHS/HS and DSC responsibilities. Based on the cognitive testing, questions and response options were refined for clarity and meaning.



B4. Collection of Data and Quality Control

The research team will implement the surveys and conduct the qualitative interviews.



PHASE 1: Survey of EHS/HS Program Directors

For Phase 1, all EHS/HS grantee- and delegate-level directors will receive an emailed invitation to participate in the survey (see Attachment C Phase 1 Recruitment Materials). The email will include a clear rationale for the study and explain how the director can contribute to the survey effort. The research team will provide contact information to those who have questions prior to agreeing to participate and will follow up as needed. Respondents will be provided with a unique URL to complete the survey online. The initial emails will also coincide with a letter of support from OHS to further encourage participation (see Attachment C Phase 1 Recruitment Materials).

Because DSC contact information is essential to the study, we will follow up with EHS/HS program directors who do not respond to the email invitation. We will send follow-up emails on a biweekly basis for a six-week period after the initial invitation. In addition, we will use follow-up calls to address gaps in response from key subgroups (e.g., program types, smaller programs, larger programs, each region, and programs with ethnic and language diversity) (see Attachment C Phase 1 Recruitment Materials). In addition to email outreach, OPRE, OHS, and the study team will use strategies to raise awareness of the study (see Attachment F EXAMPLE Study Awareness Documentation). Overviews and updates regarding the study will be sent through the social media channels of OPRE, OHS, T/TA providers, and other community partners. These materials will be customized for the audience and the platform (e.g., an ACF leader's speech at a DSC event, a regional office newsletter).

The research team will use a multimode data collection approach, beginning with web-based data collection and transitioning to telephone collection as needed. Those who are not able to complete the survey using one of the available electronic methods (e.g., internet, personal digital assistant, smartphone, WebTV) will have the opportunity to complete the survey over the telephone with a trained interviewer.

PHASE 2: Survey of EHS/HS DSCs

Information from the Phase 1 survey establishes the sampling frame of EHS/HS DSCs for the Phase 2 online survey. For Phase 2, we will send an initial invitation email in waves to all DSCs as they are identified (see Attachment D Phase 2 Recruitment). The email will include a clear rationale for the study and explain how the DSC can contribute to the survey effort. The invitation email will coincide with a letter of support from OHS endorsing the study and encouraging DSC participation. We will send follow-up emails on a biweekly basis for a six-week period after the initial invitation. Follow-up emails may also be targeted to address gaps in response from key subgroups (e.g., program types, smaller programs, larger programs, each region, and programs with ethnic and language diversity) (see Attachment D Phase 2 Recruitment). Phone recruitment may also be used to address gaps in responses from key subgroups. In addition to email outreach, OPRE, OHS, and the study team will use strategies to raise awareness of the study (see Attachment F EXAMPLE Study Awareness Documentation). Overviews and updates regarding the study will be sent through the social media channels of OPRE, OHS, T/TA providers, and other community partners. These materials will be customized for the audience and the platform (e.g., an ACF leader's speech at a DSC event, a regional office newsletter).

For the Phase 2 DSC survey, skip patterns will ensure that questions about infant/toddler services are presented to EHS DSCs and questions about older preschooler services are presented to HS DSCs. DSC respondents who indicate they are a DSC for both EHS and HS programs will be randomly assigned to either the HS or the EHS version of the survey.
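A minimal sketch of this routing logic is below (the function and field names are our own illustrations, not the actual survey software specification); seeding the random number generator per respondent keeps the assignment reproducible for documentation purposes:

import random

def assign_survey_version(dsc_record, seed=20210601):
    """Route a DSC to the EHS or HS version of the Phase 2 survey.

    DSCs serving a single program type receive that version; DSCs serving
    both EHS and HS are randomly assigned to one version so that neither
    set of age-specific questions is systematically over-represented.
    """
    if dsc_record["serves_ehs"] and dsc_record["serves_hs"]:
        rng = random.Random(seed + dsc_record["dsc_id"])  # reproducible per respondent
        return rng.choice(["EHS", "HS"])
    return "EHS" if dsc_record["serves_ehs"] else "HS"

# Example: a DSC working across a combined EHS/HS program
print(assign_survey_version({"dsc_id": 101, "serves_ehs": True, "serves_hs": True}))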



The research team will use a multimode data collection approach, beginning with web-based data collection and transitioning to telephone collection as requested. Those who are not able to complete the survey using one of the available electronic methods (e.g., internet, personal digital assistant, smartphone, WebTV) will have the opportunity to complete the survey over the telephone with a trained interviewer.

PHASE 3: DSC Interview

The interviews will gather data from DSCs. Respondents will indicate at the end of the Phase 2 survey if they are interested in participating in interviews. (See Attachment E Phase 3 Recruitment). Follow-up emails and phone recruitment will be used to encourage participation.



Interviews will occur by phone through the Conference Now platform or by videoconference via Zoom, depending on the respondent’s preference and access. A note-taker will join each interview to capture the participant’s responses. With the respondent’s permission, we will also audio-record the interview to reference, as needed, to ensure that the notes are accurate and comprehensive.



Data Collection Quality and Consistency for Phase 1, Phase 2, and Phase 3

For Phase 1 and Phase 2, a series of consistency and range checks will be built into the programming of the questionnaires to prevent invalid responses from being recorded. The data collection team will thoroughly test the programmed questionnaires prior to collecting data, including a review of the data to confirm that responses are being recorded as expected. Throughout data collection, we will monitor the functioning of the questionnaire to detect potential technical issues and possible misinterpretation of questions by respondents. We will produce regular data quality assurance reports that collate questionnaire data across variables of high analytic value to help identify such issues and allow us to take corrective action.
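The range and consistency checks could take a form along these lines (a sketch only; the item names and plausible ranges are hypothetical, not the actual questionnaire specification):

def check_range(item, value):
    """Return None if the value is plausible, else a message shown to the respondent."""
    ranges = {
        "years_in_role": (0, 50),            # plausible years working as a DSC
        "pct_children_with_iep": (0, 100),   # percentages must fall between 0 and 100
    }
    low, high = ranges[item]
    return None if low <= value <= high else f"{item} must be between {low} and {high}."

def check_consistency(responses):
    """Cross-item check: children with IFSPs/IEPs cannot exceed total enrollment."""
    if responses["n_children_ifsp_iep"] > responses["n_children_enrolled"]:
        return "Children with IFSPs/IEPs cannot exceed the number of children enrolled."
    return None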

In addition to monitoring questionnaire functioning, the research team will monitor data collection progress carefully throughout the field period to achieve good response rates and representative data. Daily production reports will show how data collection is progressing, enabling us to identify problem areas and take remedial action quickly when needed. These reports will also allow us to monitor completion rates by sample subgroups in order to detect potential bias in response rates and pursue remedial action as needed. When we detect that a subgroup is completing surveys at lower rates, we will adjust our field procedures to boost completion among that group.

For Phase 3, to improve data collection quality, the research team will conduct a half-day training course for all interviewers that addresses general and respondent-specific concerns and ensures consistent, efficient, and culturally responsive data collection.

Topics will include:

  • Study purpose, research questions, and conceptual framework

  • Primary data collection measures and instruments (i.e., Phase 3 interview guide)

  • Respondent privacy and informed consent procedures

  • Note cleaning, coding, and analysis procedures


To monitor field interviewer performance specifically, the team will develop a series of performance metrics built from multiple data sources (e.g., item nonresponse rates, questionnaire completion rates, case prompting rates). These metrics, which will be produced weekly, will show where interviewers are deviating from the norm on key performance measures, thereby revealing areas where retraining or special coaching may be needed.
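One simple way to operationalize "deviating from the norm" (our sketch; the metric and column names are illustrative assumptions) is to standardize each weekly metric across interviewers and flag values beyond a cutoff:

import pandas as pd

def flag_outlier_interviewers(weekly, metrics, z_cutoff=2.0):
    """Flag interviewers whose weekly metrics fall more than z_cutoff SDs from the mean.

    `weekly` is a DataFrame with one row per interviewer and columns such as
    "item_nonresponse_rate" and "completion_rate".
    """
    flags = {}
    for metric in metrics:
        z = (weekly[metric] - weekly[metric].mean()) / weekly[metric].std(ddof=1)
        flags[metric] = weekly.loc[z.abs() > z_cutoff, "interviewer_id"].tolist()
    return flags

# Example: flag_outlier_interviewers(weekly, ["item_nonresponse_rate", "completion_rate"])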



B5. Response Rates and Potential Nonresponse Bias

Response Rates

We aim to achieve a 78 percent response rate for the Phase 1 survey and a 75 percent response rate for the Phase 2 survey. As shown in Exhibit 1, these response rates are justified by those achieved in other recent surveys of HS program administrators and managers. These studies used the same design we are proposing – first surveying the universe of HS program directors to obtain contact information on the target population and then fielding a second survey of HS managers using the information provided by directors. The 2016 Head Start Health Managers Survey (HSHM) and the 2019 Head Start Training and Technical Assistance (T/TA) Survey obtained response rates of more than 80 percent for both their Phase 1 director surveys and their Phase 2 surveys of other HS personnel identified by directors in Phase 1.



Exhibit 1: Expected Response Rates for Phase 1 and Phase 2 Surveys based on Similar Studies of Head Start Managers






Nonresponse

Although we will encourage participation through clear and attractive materials as well as multiple modes of data collection (i.e., online or phone), we anticipate some survey nonresponse in Phase 1 and Phase 2. Each HS program director and DSC invited to participate in the online survey will be assigned a unique ID that will be used to track, in real time, who has responded to the survey. We will establish subgroups of interest based on a priori information available through the PIR about EHS/HS programs (for Phase 1 and Phase 2 responses) and background information about DSCs collected during Phase 1 (for Phase 2 responses). Potential subgroups of interest include program size; geographic location (i.e., urban/rural); ACF Region; and program type (i.e., EHS, HS, or combined EHS/HS). For Phase 2 only, additional subgroups may include employment characteristics, such as full-time/part-time status and whether the DSC is an HS employee or a contractor. We will regularly monitor response rates by these subgroups to identify where additional outreach may be needed to maintain representativeness. In reporting our results, we will calculate response rates according to the standards promulgated by the American Association for Public Opinion Research (AAPOR), which define the response rate as the ratio of the number of eligible completed cases to the number of eligible cases.
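Expressed as a formula (our rendering of the AAPOR definition stated above):

\[ \text{Response rate} = \frac{\text{number of eligible completed cases}}{\text{number of eligible cases}} \]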


Survey nonresponse

To address survey nonresponse, we will apply nonresponse weights to the data. The nonresponse weights will account for known characteristics of the missing cases based on HS program information available in the PIR. These weights will be designed based on program characteristics, allowing the sample of survey respondents to be reweighted to resemble the population of HS programs invited to participate. We will use logistic regression models to predict the propensity of an HS PD or a DSC to participate in the survey and will use the inverse of the predicted propensity as the nonresponse weight. Extremely large weights will be trimmed to avoid outliers and influential observations. We will use the resulting weights throughout the analyses for inference. If nonresponse adjustments during data collection are not adequate, multiple imputation may be undertaken for key variables.
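A minimal sketch of this propensity step, assuming a case-level data frame with a 0/1 "responded" indicator and PIR-based predictors (the trimming quantile and all variable names are illustrative, not the final model specification):

import pandas as pd
import statsmodels.api as sm

def nonresponse_weights(cases, predictors, trim_quantile=0.99):
    """Model response propensity and return inverse-propensity nonresponse weights.

    `cases` has one row per invited PD or DSC, a 0/1 `responded` column, and
    PIR-based predictors (e.g., program size, ACF region, program type).
    """
    X = sm.add_constant(pd.get_dummies(cases[predictors], drop_first=True).astype(float))
    propensity = sm.Logit(cases["responded"], X).fit(disp=0).predict(X)
    weights = 1.0 / propensity
    weights = weights.clip(upper=weights.quantile(trim_quantile))  # trim extreme weights
    return weights.where(cases["responded"] == 1)  # weights apply to respondents only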


Item nonresponse

Questionnaires are designed to minimize item nonresponse based on design work the research team has conducted on other questionnaires, such as the National Survey of Early Care and Education and the Early Childhood Training and Technical Assistance surveys. For example, we reduce the complexity of questions and narrow their focus to reduce the likelihood of respondents skipping questions. In addition, stakeholder input on survey drafts and cognitive testing prior to administration will help identify questions that are difficult to complete and provide opportunities to reformat them in ways that will increase item response.

The study team will examine item nonresponse to identify whether there are patterns of missingness, such as increased nonresponse on individual items based on the size of the program. We will document any such patterns clearly as a potential source of bias in the analyses. We will also discuss these patterns with stakeholders to gain perspective on why bias may exist.



B6. Production of Estimates and Projections

We will produce estimates for official external release by OPRE that are intended to be generalizable to the population of EHS/HS DSCs described in Section B1. Because the surveys are intended as censuses of directors and DSCs, we will use weights to adjust for nonresponse but not to account for non-selected sample members. As discussed in Section B5, we will create calibrated weights to increase the precision of our estimates and account for nonresponse. Weights for Phase 1 responses will incorporate information on grantee characteristics provided through the HS PIR and the HS Enterprise System. We will select characteristics that are associated both with nonresponse and with participants' responses to the survey questions. We anticipate that these will include the number of children served by a grantee; grantee region (as defined by OHS); and the types of programs provided by the grantee (HS, EHS, home-based). Phase 2 responses will receive an additional set of weights to account for nonresponse among HS DSCs who were sent the survey. Because the full sampling frame for Phase 2 is not available in advance, these nonresponse weights will be based on characteristics of the individual HS DSCs reported by respondents to the Phase 1 survey. We anticipate that these will include employment status (full time versus part time), number of other responsibilities, and estimated number of years working in their current HS program.


The weighting adjustment factor is then computed as the inverse of the weighted response rate in each cell. Use of the sampling weights will enable unbiased estimation of descriptive statistics. Selected data from the information collection will be made available to the public for secondary analysis. Datasets will include sampling weights as well as sample design variables to allow analysts to produce design-unbiased standard errors for their analyses. Study documentation will describe how these variables can be used with commonly available statistical software to produce valid population estimates.
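In notation (ours, restating the cell-based adjustment above in symbols): for respondent i in adjustment cell c(i),

\[
w_i = \frac{d_i}{\hat{r}_{c(i)}}, \qquad
\hat{r}_c = \frac{\sum_{j \in R_c} d_j}{\sum_{j \in S_c} d_j},
\]

where d_j is the base weight, R_c is the set of respondents in cell c, and S_c is the set of all eligible invited cases in cell c. Because the surveys invite the full universe of cases, the base weights equal 1 and w_i reduces to the inverse of the cell's response rate.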


Data Archiving

Survey data collected via this study will be archived and made available to the public for secondary data analysis. Interview data will also be archived if the data can be sufficiently treated to protect PII. Selected program-level data from the Head Start Enterprise System and Head Start PIR administrative data systems will be incorporated into the archived data (e.g., number of children referred and evaluated for disabilities). In addition, publicly available state policy data will be incorporated into the dataset to allow other researchers to examine DSC practice within the state context. The research team will implement masking strategies to protect the confidentiality of survey and interview participants. We will prepare documentation for each data file, including codebooks and user manuals, which will describe each variable on each data file, methods for accessing each data file, guidance for using the weights, and any editing strategies employed. Datasets will include sampling weights to allow secondary analysts to produce nationally representative estimates. Study documentation will describe how these variables can be used with commonly available statistical software to produce valid population estimates.


We do not plan to base policy decisions on data that are not representative or to publish biased population estimates.



B7. Data Handling and Analysis

Data Handling

We will minimize errors in the data with skip patterns that avoid presenting questions that are inappropriate to a respondent's characteristics (e.g., type of program). Questions will use techniques such as establishing ranges for numeric items, presenting in words any numbers entered, and preventing entry of invalid codes for fixed-coded items (such as number of years). Coding errors will be mitigated by implementing double-blind coding, with reconciliation, for 10 to 20 percent of the responses; error rates exceeding 3 percent will trigger a second round of more sophisticated coding. This will apply both to coding open-ended survey responses and to the interview data from Phase 3. We will implement strict quality assurance protocols throughout the development and implementation of the analysis plan, including reviews of newly created composite and analytic variables, statistical programming code, and analytic output (including checks against comparison data when available).
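The double-coding check could be computed along these lines (a sketch; the data layout is our assumption):

def coding_error_rate(codes_a, codes_b):
    """Share of double-coded responses on which two independent coders disagree.

    `codes_a` and `codes_b` are parallel lists of codes assigned by two
    coders to the same 10 to 20 percent subsample of responses.
    """
    assert len(codes_a) == len(codes_b), "Coders must code the same subsample."
    disagreements = sum(a != b for a, b in zip(codes_a, codes_b))
    return disagreements / len(codes_a)

# A rate above 0.03 would trigger the second, more sophisticated coding round
# described above, after the coders reconcile their disagreements.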



Data Analysis

Quantitative Analyses. Phase 1 and Phase 2 surveys will yield data that we will analyze using quantitative methods. These approaches will enable us to make nationally representative estimates about EHS/HS programs. We will analyze the data collected to inform our research questions (see Section A2), which are linked to study objectives and survey items in Attachment A. We will address descriptive questions about who makes up the DSC workforce and the key practices they support. Univariate statistics will examine the distribution of individual items. For example, frequency distributions will be calculated to summarize survey items and to examine variability in the data. For quantitative items, parameter estimates, such as means and variances, will be computed. We will also run cross-tabulations to examine the relationships among items and conduct significance tests (e.g., t-tests, F-statistics) to determine whether there are meaningful differences between subgroups (e.g., program size, region, program type, rurality). If appropriate for the data, we may also run ANOVA (analysis of variance) or regression models to understand how variables relate to each other.
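For example, weighted frequency distributions and subgroup means might be computed along these lines (a sketch; column names such as "w" for the final weight are hypothetical):

import numpy as np
import pandas as pd

def weighted_frequency(df, item, weight="w"):
    """Weighted distribution of a categorical survey item (proportions sum to 1)."""
    totals = df.groupby(item)[weight].sum()
    return totals / totals.sum()

def weighted_mean_by_group(df, item, group, weight="w"):
    """Weighted mean of a numeric item within each subgroup (e.g., program type)."""
    return df.groupby(group).apply(lambda g: np.average(g[item], weights=g[weight]))

# Example: weighted_frequency(phase2, "dsc_highest_degree")
#          weighted_mean_by_group(phase2, "years_in_role", "program_type")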


Our content analysis of open-ended survey items will entail systematic coding, creation of a hierarchy of codes, and cross-case and cross-source thematic analysis (see the Data Handling discussion above).


Qualitative Analyses. Prior to beginning qualitative data analysis, we will ensure that the interview notes are formatted to allow for efficient review and coding. We will begin with a set of a priori codes related to each interview topic that reflect hypotheses, areas of interest, and subgroup characteristics (e.g., individual, grantee, and site characteristics such as race/ethnicity, age, size, type of organization, and geographic location), and we will use a grounded theory approach to identify additional codes that emerge from the data. Initial a priori codes and subgroups of interest are presented in Exhibit 2.


Exhibit 2. A Priori Codes for Analyzing Qualitative Data

Section or Topic (Guide #): A Priori Codes

Introduction (all interview guides): Populations served; DSC Role-Fulfilling; DSC Role-Challenging

Partnerships with Community-based Disability Service Providers (Guide 1): Collaboration-Early Intervention; Collaboration-LEA; Established Partners; Established Partners-Supports; DSC Role-Identify Partners; Identify Partners-Tools; Needed Partners; Needed Partners-Supports; Needed Partners-Challenges; Partner Services-Barriers; Partner Services-Facilitators; MOUs/IAs-Content; MOUs/IAs-Negotiations; Collaboration-Successful Practices; Responsiveness-External partners

Teacher Training and Other Professional Development (Guide 2): Training Needs Assessment; Trainings Needed; Providing Trainings-Process; Providing Trainings-Frequency; Providing Trainings-Methods; Responsiveness-Teacher Trainings; Evaluation-Measures; Evaluation-Most Successful; Other Needed Supports

Recruitment and Enrollment of Children with Disabilities and Suspected Delays (Guide 3): Recruitment strategies-Successful; Recruitment strategies-Local Tailoring; Recruitment strategies-Unsuccessful; Recruitment strategies-Mitigate Challenges; Representation of Local Diversity; Enrollment-Successful strategies; Enrollment-Challenges; Recruitment and Enrollment-Best Practices; 10% Requirement-Facilitators; 10% Requirement-Barriers; 10% Requirement-Recruit after met; 10% Requirement-Waivers

Family Collaboration (Guide 4): Communicate with Families-Methods; Communicate with Families-Best Methods; Communicate with Families-Challenges; Communicate with Families-Mitigate Challenges; Family Supports-New Diagnosis; Family Resistance; Family Supports-Advocacy; Family Supports-Training; Foster Relationships-Strategies; Foster Relationships-Best Practices; Family Supports-Underserved Populations; Engage Diverse Families-Strategies; Engage Diverse Families-Staff Support; Family Collaboration-Challenges; Family Collaboration-Mitigate Challenges

Transitioning Children with Disabilities (Guide 5): Transition Planning-Start Date; Transition Planning-Challenges; Transition Planning-Mitigate Challenges; Transition Collaboration-Internal Staff; Transition Collaboration-Receiving Setting; Transition Collaboration-Partners; Transition-Prepare Families; Responsiveness-Transition Process; Transitions-Best Practices; Transitions-Challenges; Transitions-Mitigate Challenges; Transitions-Lessons Learned; Program Transitions-Level of Satisfaction

Racial, ethnic, cultural, and linguistic responsiveness (all interview guides): Responsiveness-External partners; Responsiveness-Teacher Trainings; Representation of Local Diversity; Engage Diverse Families-Strategies; Engage Diverse Families-Staff Support; Responsiveness-Transition Process


Following initial data collection, the research team will analyze the qualitative data. Since multiple people will be involved in data coding and analysis, we will take steps to ensure analytic rigor and the systematization necessary to reduce the effects of subjectivity and selection bias. These steps include developing a codebook, ensuring inter-coder reliability, and using a qualitative data analysis software program (i.e., NVivo). The team will meet on a regular basis to discuss code application and general findings, creating an opportunity to ensure that the codes are being applied consistently across analysts. Coders may also use the memo and comment functions in NVivo to flag segments of text for discussion during internal team meetings. Any changes to the codebook or coding process resulting from these discussions will be applied to previously coded interview notes.


First, the team will produce a codebook reflecting the initial set of codes that will guide the coders in reviewing and coding the qualitative data. These codes were developed based on the draft interview guides, which reflect our study goals and research questions; they are broad enough to facilitate consistent code application across coders. Then, we will import the interview notes and a priori codes into NVivo, and a team of trained coders will begin to review the notes, apply the a priori codes, and identify additional codes that emerge from unanticipated patterns in the data. The coders will discuss these new codes during regular internal meetings and, upon agreement, add them to the codebook and apply them to any previously coded data. These codes will allow evaluators to identify and group common themes as well as assess their overall strength and implications.


The interview guides reference the DSCs’ responses to the Phase 2 survey and ask follow-up questions, which allow us to delve deeper into the nuances of different program policies, practices, successes, and challenges. As such, the qualitative analysis will supplement the nationally-representative quantitative data to produce a comprehensive picture of the DSC workforce.


Data Use

We will archive the data with supporting materials (e.g., codebooks, instruments) so that a wide variety of researchers and stakeholders can access, use, and replicate any analyses conducted by the project. The codebooks will include data variables, data labels, and response options for each question. The accompanying User Guide will describe each dataset (from Phase 1, Phase 2, and Phase 3), explain the weights, and detail the processes for linking datasets. The User Guide will also include a description of the study design and the methods used to collect and analyze data; documentation of study approval (i.e., OMB and IRB) and consent forms; the Phase 1 and Phase 2 questionnaires; and the Phase 3 interview guides. No national data currently exist regarding DSCs' roles and the ways they address HS Performance Standards. Other researchers will be able to conduct secondary analyses of the study data to extend the usefulness of our findings for EHS/HS services for children with disabilities and their families.


The Data Tables Report will serve as the primary reference for the information collected. The report will provide the summary data and subgroup analyses from the Phase 1 and Phase 2 surveys, along with a description of the study design, methods, analytic approaches, and sampling information, as well as descriptive information about the sample. We will highlight findings through five study briefs, which OPRE will publish on its website and the research team will disseminate to various audiences. The topics were selected based on the research objectives (Section B1) and stakeholder feedback. The briefs will be targeted to T/TA providers, EHS/HS program directors, and DSCs, and will focus on: 1) partnerships with community-based disability service providers; 2) teacher training and other professional development; 3) recruitment and enrollment of children with disabilities and suspected delays; 4) family collaboration; and 5) transitioning children with disabilities. Racial, ethnic, cultural, and linguistic responsiveness will be addressed across the briefs. Additional briefings, webinars, conference presentations, and materials will be developed for the use of training and technical assistance providers and program office leadership.


B8. Contact Person(s)

Name | Affiliation | Email Address

Michael Lopez, PhD | NORC at the University of Chicago | [email protected]

Shannon TenBroeck, MA | NORC at the University of Chicago | [email protected]

Todd Grindal, EdD* | SRI International | [email protected]

Stacy Ehrlich, PhD* | NORC at the University of Chicago | [email protected]

Cornelia Taylor, PhD | SRI International | [email protected]

Rebecca Berger, PhD* | NORC at the University of Chicago | [email protected]

*Can answer questions about the statistical/methodological aspects of the design and analyses.



Attachments

Instrument 1: Survey of EHS/HS Program Directors (Phase 1)


Instrument 2: Survey of EHS/HS DSCs (Phase 2)


Instrument 3: DSC Interview (Phase 3)


Attachment A: Crosswalk Mapping Study Goals to Research Questions, Constructs, Instruments, and Item numbers 


Attachment B: Informed Consent Forms  


Attachment C: Phase 1 Director Survey Recruitment

Attachment D: Phase 2 Disability Service Coordinator Survey Recruitment

Attachment E: Phase 3 Disability Service Coordinator Interviews Recruitment

Attachment F: EXAMPLES Study Awareness

Attachment G: OPRE Response to Comments





1 "Program" is used to refer to a grantee or delegate agency.

2 Head Start Early Learning and Knowledge Center, the Head Start Center for Inclusion, and the Early Childhood Technical Assistance Center


