
OMB: 0970-0420


Measurement Development:

Family-Provider Relationship Quality (FPRQ)


Pilot and Field Tests Data Collection


Request for OMB Review



Part B

Statistical Methodology


August 2012 – Revised December 2012


B. STATISTICAL METHODS (USED FOR COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS)

B.1. Respondent Universe and Sampling Methods

Both the pilot test and the field test will use convenience samples. That is, the samples will not be drawn from formal sampling frames and therefore will not be nationally representative of early care and education (ECE) providers or of parents with young children in ECE programs. Because neither the pilot test nor the field test will be used for population estimates, no power analyses were conducted. Instead, sample sizes for the pilot test were selected to allow analyses that identify weak items to be eliminated or replaced before the field test. For the field test, sample sizes were selected to support Item Response Theory (IRT) analysis, a model-based approach to measurement and reliability that requires a sample of a few hundred participants, and Differential Item Functioning (DIF) analysis to determine whether items function differently for different subgroups.
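To illustrate the type of analysis the field test sample sizes are intended to support (this is a sketch with simulated data, not the project’s analysis code; all names and values are hypothetical), the Python fragment below shows the two-parameter logistic (2PL) item response function used in IRT and a Mantel-Haenszel common odds ratio, a standard screening statistic for DIF between a reference and a focal group matched on total score:

# Minimal sketch with simulated data: the 2PL item response function and a
# Mantel-Haenszel DIF check for one item, matching respondents on total score.
import numpy as np

def p_endorse_2pl(theta, a, b):
    # 2PL IRT model: probability of endorsing an item given latent trait
    # level theta, discrimination a, and difficulty b.
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def mantel_haenszel_or(item, group, total_score):
    # Common odds ratio for one dichotomous item across total-score strata.
    # group: 1 = reference, 0 = focal. Values far from 1 suggest DIF.
    num, den = 0.0, 0.0
    for s in np.unique(total_score):
        in_stratum = total_score == s
        ref = in_stratum & (group == 1)
        foc = in_stratum & (group == 0)
        A = np.sum(item[ref] == 1)   # reference group, endorsed
        B = np.sum(item[ref] == 0)   # reference group, not endorsed
        C = np.sum(item[foc] == 1)   # focal group, endorsed
        D = np.sum(item[foc] == 0)   # focal group, not endorsed
        N = A + B + C + D
        num += A * D / N
        den += B * C / N
    return num / den if den > 0 else float("nan")

# Simulated responses for a pilot-sized sample of 300 respondents and 10 items.
rng = np.random.default_rng(0)
theta = rng.normal(size=300)
group = rng.integers(0, 2, size=300)
responses = (rng.random((300, 10)) < p_endorse_2pl(theta[:, None], 1.2, 0.0)).astype(int)
total = responses.sum(axis=1)
print(mantel_haenszel_or(responses[:, 0], group, total))   # should be near 1: no DIF was simulated

Because no DIF is built into the simulated data, the printed odds ratio should be close to 1; in the actual field test, items flagged by a screen like this would be examined further with the planned IRT-based analyses.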


Based on recommendations from Office of Head Start (OHS) staff, Office of Planning, Research, and Evaluation (OPRE) staff, and members of the project’s Technical Work Group (TWG), Westat, OPRE, and OHS staff identified several candidate sites for the pilot test. To minimize any bias due to the use of sites already used in the study’s focus groups and cognitive interviews, the Washington, DC metro area and Chicago, IL were not considered as sampling sites for the pilot test. Four cities (Seattle, WA; San Francisco/San Mateo, CA; Atlanta, GA; and Minneapolis/St. Paul, MN) were initially identified based on (1) their representation of distinct regions of the country and (2) their diverse populations with respect to income, race/ethnicity, and home language. Westat then polled the TWG members to select two sites for the pilot test. The TWG members selected Atlanta, GA and Seattle, WA. These cities were selected because they represented different geographic regions of the United States and because both had populations that were racially/ethnically and economically diverse, including migrant workers.


The field test will be conducted in 8 metropolitan areas that are geographically dispersed to ensure representation from different parts of the United States. These sites have not yet been identified, but we will use a process similar to the one used for the pilot test. Specifically, sites will be selected to obtain a sample that (1) is economically and racially/ethnically diverse and (2) includes enough parents whose primary language is Spanish to test the Spanish-language version of the measure. For the same reason that Washington, DC, and Chicago, IL, were not selected for the pilot test, Seattle, WA, and Atlanta, GA (in addition to Washington, DC, and Chicago, IL) will be excluded from sampling for the field test. San Francisco/San Mateo, CA, and Minneapolis/St. Paul, MN, are likely to be among the 8 sites sampled in the field test because they meet the criteria above and were not sampled for the focus groups or the pilot test.


Pilot test sample size. As shown in Table B.1, the pilot test will include approximately 45 directors, 105 providers/teachers, and 300 parents from Atlanta, GA, and Seattle, WA. Equal numbers of centers and respondents will be sampled from each selected city. The sample sizes for centers, providers, and parents in the pilot test will allow for analyses to identify weak items that need to be eliminated or replaced before the field test.


Table B.1. Total expected number of respondents for the pilot and field tests

Respondent                     Pilot Test     Field Test
Directors                          45             240
   Head Start/EHS                  15              80
   Centers                         15              80
   Home-based                      15              80
ECE Providers                     105¹            400⁴
   Head Start/EHS²                 45             160
   Centers                         45             160
   Home-based                      15              80
Parents                           300³            800⁵
   Head Start/EHS                 135             320
   Centers                        135             320
   Home-based                      30             160
Total Respondents                 450           1,440

*Note: This table includes only those respondents completing the surveys, after screening. A relatively high refusal rate is anticipated during the initial contact stage of the study because the FPRQ study is not a mandatory survey and there are no plans to use prenotification letters. A larger number of directors will be screened than are expected to participate because we are cold-calling centers and child care programs to recruit directors; many centers/child care programs may be too busy to participate or may be determined ineligible during the screening process. For the same reasons, a similarly larger number of child care providers and teachers will be screened than are expected to participate. Low rates of parent participation are expected because we are relying on flyers and study brochures to recruit parents. If a director allows, we may also present information to parents directly through a presentation at the program. Therefore, we plan to use a recruitment matrix that includes quotas (the maximum number of participants with particular characteristics that we will accept into the sample). Once a quota is filled, no more volunteers with the characteristics of that quota will be accepted. This strategy will ensure sample diversity and will help us narrow the field of volunteers.

¹ Assumes sampling 3 providers per center (Head Start/Early Head Start and other centers) and 1 provider per home-based setting.

² Includes center-based Head Start and Early Head Start only.

³ Assumes sampling 3 parents per Head Start/Early Head Start or center classroom and 2 parents per home-based setting.

⁴ Assumes sampling 2 providers per center (Head Start/Early Head Start and other centers) and 1 provider per home-based setting.

⁵ Assumes sampling 2 parents per provider regardless of setting.
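The per-setting sampling assumptions in footnotes 1-5 imply the totals shown in Table B.1. As a quick arithmetic check (a sketch in Python, not project code; the function and argument names are ours), the assumptions reproduce each row:

# Verify the Table B.1 totals implied by footnotes 1-5.
def totals(n_programs_each, providers_per_center, parents_per_classroom, parents_per_home):
    # n_programs_each: number of programs of each type
    # (Head Start/EHS, other centers, home-based).
    directors = 3 * n_programs_each
    providers = providers_per_center * 2 * n_programs_each + 1 * n_programs_each
    parents = (parents_per_classroom * providers_per_center * 2 * n_programs_each
               + parents_per_home * n_programs_each)
    return directors, providers, parents

# Pilot test: 15 programs of each type, 3 providers per center,
# 3 parents per classroom, 2 parents per home-based setting.
print(totals(15, providers_per_center=3, parents_per_classroom=3, parents_per_home=2))   # (45, 105, 300)

# Field test: 80 programs of each type, 2 providers per center,
# 2 parents per provider in every setting.
print(totals(80, providers_per_center=2, parents_per_classroom=2, parents_per_home=2))   # (240, 400, 800)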


To be eligible to participate, providers must teach or provide care at least 15 hours per week and must care for at least one child not related to them. Parents must have a child aged 5 years or younger who is enrolled with an unrelated teacher/provider in an ECE program for at least 15 hours per week. The 15-hour threshold was chosen because it corresponds to roughly 5 hours of care on 3 days a week; it focuses the parent sample on families whose children are in relatively regular child care/early education arrangements, which maximizes the chances that the parent and provider have some kind of relationship (whether good or bad).
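These eligibility rules reduce to a simple screening check. The sketch below (hypothetical Python functions and field names, not the actual screener in Appendix B-1) illustrates the logic a recruiter applies:

# Hypothetical eligibility checks mirroring the criteria described above.
MIN_HOURS_PER_WEEK = 15
MAX_CHILD_AGE_YEARS = 5

def provider_is_eligible(hours_per_week, cares_for_unrelated_child):
    # Providers must teach or provide care at least 15 hours per week
    # to at least one child not related to them.
    return hours_per_week >= MIN_HOURS_PER_WEEK and cares_for_unrelated_child

def parent_is_eligible(child_age_years, hours_in_care_per_week, provider_is_relative):
    # Parents must have a child aged 5 or younger enrolled with an
    # unrelated provider for at least 15 hours per week.
    return (child_age_years <= MAX_CHILD_AGE_YEARS
            and hours_in_care_per_week >= MIN_HOURS_PER_WEEK
            and not provider_is_relative)

print(parent_is_eligible(4, 15, False))   # True: about 5 hours a day, 3 days a week
print(parent_is_eligible(6, 30, False))   # False: child is older than 5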


Field test sample size. As described above, we plan to conduct the field test in 8 geographically distinct regions of the continental United States, with one metropolitan area selected per region. As shown in Table B.1, the field test will consist of approximately 240 directors, 400 ECE providers/teachers, and 800 parents. The large sample size of the field test will allow for detailed psychometric analyses of the data to thoroughly test the measure and to conduct cross-group comparisons.



Every effort will be made to obtain diversity with respect to type of program (center-based, including Head Start/Early Head Start programs, and home-based), home language (English and Spanish), race and ethnicity of parents, family income (high/low), urbanicity (rural, suburban, urban), and region of the country. During recruitment, we will monitor respondents’ demographic information to ensure that we recruit a diverse sample in terms of three specific subgroups of interest: (1) type of early education/child care, (2) race/ethnicity, and (3) household income. Specifically:

  1. We will regulate our recruitment efforts such that one third of our sample of programs falls into each of the following three categories: private center; Head Start or Early Head Start program; and home-based program. Equal enrollment of each type of program will better enable us to do comparisons across types of programs.


  2. As part of the screening questions, recruiters will ask participants for their racial background and whether they are of Hispanic or Latino origin. We aim to recruit percentages of participants that align closely with the national percentages for the three largest racial/ethnic groups (with a slight oversampling of Black, Non-Hispanic and Hispanic participants), in order to mirror national representation as closely as possible while still allowing large enough samples for the planned analyses. Based on 2010 Census data showing that 63.7 percent of the population is White-only, Non-Hispanic; 12.6 percent is Black, Non-Hispanic; and 16.3 percent is Hispanic (U.S. Bureau of the Census, 2011), we aim to recruit 60 percent White, Non-Hispanic; 20 percent Black, Non-Hispanic; and 20 percent Hispanic of any race.


  3. Recruiters will also ask parents about their family’s household income during the screening process. For household income, we aim to recruit equal percentages of low-income, middle-income, and high-income families. Low income will be defined as a household income of less than $25,000 (since the 2010 poverty threshold for a family of 4 with 2 children was $22,113); middle income will include families with household incomes between $25,000 and $74,999; and families with household incomes of $75,000 or more will be considered high income. (A worked example applying the race/ethnicity and income targets to the field test parent sample follows this list.)
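To make the quota targets concrete, the short sketch below (illustrative only; it is not the project’s recruitment matrix) applies the stated race/ethnicity and income goals to the approximately 800 field test parents:

# Illustrative quota targets applying the stated recruitment goals
# to the roughly 800 field-test parents.
FIELD_TEST_PARENTS = 800

race_ethnicity_targets = {
    "White, Non-Hispanic": 0.60,
    "Black, Non-Hispanic": 0.20,
    "Hispanic (any race)": 0.20,
}
income_targets = {
    "Low (less than $25,000)": 1 / 3,
    "Middle ($25,000-$74,999)": 1 / 3,
    "High ($75,000 or more)": 1 / 3,
}

for label, share in {**race_ethnicity_targets, **income_targets}.items():
    print(f"{label}: about {round(share * FIELD_TEST_PARENTS)} parents")

Under these targets, the field test would include roughly 480 White, Non-Hispanic; 160 Black, Non-Hispanic; and 160 Hispanic parents, and roughly 267 parents in each income band.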



Sample matrix:

Program type                    Race/ethnicity               Income
Head Start/Early Head Start     Hispanic/Latino              Low (less than $25,000)
Child care center               White, not Hispanic          Middle ($25,000-$74,999)
Home-based program              Black, not Hispanic          High ($75,000 or more)
                                Other (Asian, AI/AN, etc.)

B.2. Procedures for the Collection of Information

The overall approach for both the pilot test and the field test is to recruit ECE programs first (i.e., center-based programs, Head Start/Early Head Start centers, and home-based programs), then to recruit ECE providers/teachers from within the programs that agree to participate, and finally to recruit parents from the classrooms of participating ECE providers/teachers. Recruiting respondents in this manner will allow us to match parents’ surveys to those of their children’s ECE providers/teachers.


Procedures for selecting ECE programs will be the same for both the pilot and the field tests; however, the information source from which we generate the list of programs may change. Specifically, in recruiting programs for the pilot test, we will use online resources (i.e., the Head Start Locator) that provide the names and locations of programs in Atlanta and Seattle. Because data collection for the field test will be spread across eight cities, there will be many more programs to recruit. To make the process as efficient as possible, we plan to use sample lists of early education and child care programs developed for previously conducted studies instead of generating these lists ourselves. If lists from previous studies cannot be secured, we will again create the lists using online resources.


Pilot test. Two experienced field staff (called “recruiters”) in each selected city will be hired to recruit respondents for the pilot test. These recruiters will be trained to use a recruitment protocol with guidelines and suggested wording for recruiting ECE center directors, ECE lead providers/teachers, and parents of children (see Appendix B-1). The recruiters will also screen potential participants (see Appendix B-1). At least one of the recruiters in each city will be bilingual in English and Spanish. The recruiters will use laptops loaded with a Field Management System (FMS) application and cell phones for their recruitment activities.


A list of ECE centers for the pilot test will be compiled from online listings, calls with state and district officials responsible for ECE programs, and Head Start and Early Head Start listings. Each recruiter’s case workload (i.e., a list of ECE centers with the director’s name, center name, address, and telephone number) will be pre-loaded into the FMS.


Each recruiter will be responsible for recruiting about 11 or 12 ECE programs, including community-based centers, Head Start/Early Head Start centers, and home-based programs. Once programs are recruited, the recruiters will schedule a visit with the directors at a time when ECE providers/teachers may be available for recruitment. During the visit, the recruiters will meet briefly with the directors and give them a package of survey materials. This package will include a cover letter, the study brochure, a list of answers to “frequently asked questions” (FAQs), a consent form, the center director’s survey, and a check for $50 in appreciation for their participation (see Appendices B-2 through B-10 for example recruitment materials).


During the meeting with the directors, the recruiters will review these materials with them and will respond to any questions the directors might have. The recruiters will then ask the directors to introduce them to the ECE lead providers/teachers in the program and for permission to post flyers and leave copies of the study brochure and FAQs in the participating classrooms to recruit parents.


When meeting with the ECE lead providers/teachers, recruiters will provide them with a packet of information materials about the study that includes the study brochure, the FAQs, and the provider/teacher survey along with a postage-paid envelope for returning the questionnaire. Teachers/providers will receive a $25 check in appreciation for their participation once their completed Provider/Teacher Survey is received by Westat by mail.


Appendix B-2 contains copies of the cover letters for directors, providers/teachers, and parents. Appendix B-3 contains copies of the study brochure, the FAQs, and the flyer to be posted in classrooms. A reminder postcard (Appendix B-4) will be sent if a completed survey is not returned to Westat within two weeks of distribution.


Slightly different procedures will be used for recruiting parents of children enrolled in the sampled classrooms. First, recruiters will field calls from parents who are interested in volunteering for the study after having seen the flyers, brochures, or FAQs. The recruiter’s phone number is listed on each of these materials. If parents call Westat’s main office, they will be given the telephone number for the appropriate recruiter in the local area. Second, if recruiters are not getting a sufficient number of volunteers through the first method, they will strategize with the director about the best way to recruit parents in that program, and discuss options such as making a presentation to parents and/or mailing the project materials (e.g., cover letter, project brochure, and frequently asked questions sheet) to parents.


When parents call, the recruiters will ask them a set of screening questions to determine their eligibility for participation in the study (see recruitment protocol in Appendix B-1). Parents who volunteer to participate in the study will be eligible as long as they have a child who is 5 years old or younger and the child is enrolled in the classroom with a participating ECE provider/teacher who is not related to the child. Parents from non-participating providers/teachers’ classrooms will not be eligible to participate in the study because their survey data cannot be matched to the providers/teachers’ data. In addition, the recruiters will keep track of the characteristics of the recruited parents (i.e., income level, race/ethnicity, Spanish-language speakers) to ensure that a balanced level of demographic and socio-economic diversity is obtained as much as possible. We will strive to overrepresent low-income parents and racial/ethnic groups of interest to Head Start and OPRE.


Once parents are recruited, Westat will mail them the survey material package. One-third of Head Start/Early Head Start parents will receive the parent survey designed to assess their relationship with their Family Service Worker (FSW) instead of their relationship with their child care provider/lead teacher. All other parents will receive the parent survey about providers/teachers. Other survey materials (i.e., the cover letter, study brochure, and a copy of the frequently asked questions) will also be included in the parents’ package. Parents will receive $25 in appreciation for their participation once their completed parent survey is received by Westat by mail. As with the directors and providers, a reminder postcard (Appendix B-4) will be sent if the completed survey is not returned to Westat within two weeks.


Each recruiter will be responsible for enrolling, on average, 26 providers (3 from each center-based program and 1 from each home-based program) and 75 parents (3 from each center or Head Start/Early Head Start classroom and 1 to 2 from each home-based program). One field manager will oversee the field work of the 4 recruiters. The field manager for the pilot test will be physically located at Westat. She will have access to the FMS so that she can monitor the progress of each case. She will hold weekly calls with the recruiters to review their caseloads and determine appropriate strategies if any problems are identified.


Field test. For the field test, we will sample two ECE lead teachers/providers from within each program and then two parents from within each ECE teacher/provider’s classroom. As in the pilot, one third of the Head Start/Early Head Start parents will be given a survey about their Family Service Worker instead of about their child’s ECE teacher/provider.


In general, the field test will follow the same procedures as the pilot test unless the results of the pilot indicate the need for improvement. The procedures for developing the list of ECE programs to enroll in the Field Management System for the field test may change to include the sampling frame of an existing study, such as the National Survey of Early Care and Education.


There will be two experienced recruiters in each metropolitan area. As with the pilot, at least one in each area will be bilingual in Spanish and English. Each recruiter will be responsible for recruiting approximately 15 programs, 25 ECE providers, and 50 parents. Two field managers will oversee the field work. One will be the same person who was the field manager for the pilot test, and she will be physically located at Westat. The other will be a field supervisor. The field supervisor will likely reside in one of the metropolitan areas selected for the field test.
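As a quick consistency check on the staffing plan (a sketch only, not project code), the field test recruiter workloads reproduce the Table B.1 targets:

# Check that field-test recruiter workloads match the Table B.1 targets.
areas = 8
recruiters_per_area = 2
programs_per_recruiter = 15
providers_per_recruiter = 25
parents_per_recruiter = 50

recruiters = areas * recruiters_per_area          # 16 recruiters
print(recruiters * programs_per_recruiter)        # 240 programs (and directors)
print(recruiters * providers_per_recruiter)       # 400 ECE providers/teachers
print(recruiters * parents_per_recruiter)         # 800 parents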


Training. Westat will conduct a 3-day WebEx training for the FPRQ pilot test in mid-January 2013. OPRE and OHS staff will be invited to join the training. The training will combine lecture-style sessions with practical, hands-on skill application so that trainees become familiar with the Field Management System and with project procedures for recruiting and selecting study participants.


Prior to training, recruiters will receive a binder containing the study protocol and supplemental materials that will be covered during training. Westat will also provide recruiters with laptops that will be used during training and throughout the recruitment process.


The first day of training will cover the background of the study, the data collection design and procedures, use of the Field Management System (FMS), and the daily activities recruiters will engage in. In the first session, trainees will be briefed on study details such as sponsorship, study purpose and goals, data collection activities, and special populations to be included in the study. In the second session, trainees will be briefed on the study design and processes that will be used to recruit early care and education centers, providers/teachers, and parents to participate in the study. Trainees will also become familiar with the recruitment protocol. In the third session, Westat staff will explain the design and functionality of the FMS and how recruiters will use the system to manage their workload. Trainees will also practice using the system on the laptops provided by Westat. In the final training session, Westat staff will provide an overview of the daily activities recruiters will engage in throughout the recruitment process, such as communicating with the home office, the use of cell phones and laptops provided by Westat, and administrative tasks such as travel reimbursement.


During the second day of training, trainees will be paired to practice recruiting respondents using scripted role plays provided in their training binders. Trainees will coordinate with each other to complete the six role plays over the phone on either Saturday or Sunday. The role plays will include two center director interviews, two provider/teacher interviews, and two parent interviews. Trainees will use these scripts to practice the techniques they learned during the first day of training.


On the third day, trainees will share their experiences practicing the role plays over the weekend, along with recommendations and insights from this experience. Westat staff will review the frequently asked questions with trainees to practice answering participant questions. The moderator will also ask trainees questions about the study design and processes covered earlier in training to reinforce important points. Following this practice session, Westat staff will conclude the training by reviewing previously covered topics.


B.3. Methods to Maximize Response Rates and Deal with Nonresponse


Engaging respondent interest and cooperation. The content of the cover letters, study brochures, and list of answers to “frequently asked questions” will focus on communicating the legitimacy and importance of the study. Recruiter training will likewise emphasize strategies for communicating the survey’s importance and legitimacy and for gaining cooperation during recruitment and nonresponse follow-up. In addition, respondents will be offered a token of appreciation for their participation.


Nonresponse follow-up. We will use paper copies of the surveys so that respondents can complete them at times convenient to them. Respondents will return the completed surveys directly to Westat in the postage-paid envelopes provided. A reminder postcard will be sent if a completed survey is not returned to Westat within two weeks. Westat will also collect respondents’ telephone numbers and email addresses, if available, during the initial screening; we will follow up with respondents if their surveys are not returned within 3 weeks.
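These reminder rules amount to a simple schedule keyed to the date the survey package is mailed. A minimal sketch (hypothetical dates and function name) is shown below:

# Hypothetical follow-up schedule based on the rules above: a reminder
# postcard at 2 weeks and a phone/email follow-up at 3 weeks.
from datetime import date, timedelta

def follow_up_schedule(mailed_on):
    return {
        "reminder_postcard": mailed_on + timedelta(weeks=2),
        "phone_email_follow_up": mailed_on + timedelta(weeks=3),
    }

print(follow_up_schedule(date(2013, 2, 1)))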


Spanish version. Spanish versions of the parent survey will be made available for Spanish-speaking parents. Also, one of the recruiters in each data collection area will be bilingual in English and Spanish.



B.4. Test of Procedures or Methods to be Undertaken

The proposed procedures for the pilot test have been successfully applied in other similar studies conducted by Westat. Also, as noted earlier, the pilot test will serve as the test of the data collection procedures for the field test. Any necessary revisions and improvements to the data collection procedures will be made based on findings from the pilot test.

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The persons listed below participated in the study design and are responsible for the collection and analysis of the data for the pilot and field tests.


  • Nancy Geyelin Margie, OPRE/ACF/DHHS

  • Christine Nord, Westat

  • Frank Jenkins, Westat

  • Stephanie Marken, Westat

  • Lina Guzman, Child Trends

  • Nikki Forry, Child Trends

Appendices


Appendix B-1: Recruitment Protocols/Screeners

Appendix B-2: Cover Letter for Directors

Appendix B-3: Cover Letter for Providers/Teachers

Appendix B-4: Cover Letter for Parents

Appendix B-5: Sponsor Cover Letter

Appendix B-6: Study Brochure

Appendix B-7: Frequently Asked Questions for Directors

Appendix B-8: Frequently Asked Questions for Providers/Teachers

Appendix B-9: Frequently Asked Questions for Parents

Appendix B-10: Study Flyer



