NonSub Change - Remove on-site visits and related experiment


OPRE Study: Early Care and Education Leadership Study (ExCELS) Descriptive Study


OMB: 0970-0582


DATE: February 10, 2022

TO: Jordan Cohen, Office of Information and Regulatory Affairs, Office of Management and Budget

FROM: Nina Philipsen and Bonnie Mackintosh, Office of Planning, Research, and Evaluation, Administration for Children and Families

RE: Request for a change in respondent tokens of appreciation and other nonsubstantive changes for the Early Care and Education Leadership Study (ExCELS) Descriptive Study (OMB: 0970-0582)



Background

The Office of Planning, Research, and Evaluation within the Administration for Children and Families in the U.S. Department of Health and Human Services contracted with Mathematica and its subcontractor, the Institute for Early Education Leadership and Innovation at the University of Massachusetts Boston, to conduct the Early Care and Education Leadership Study (ExCELS). The purpose of ExCELS is to learn about leadership in center-based early care and education (ECE) settings and to better understand how leadership might improve the quality of care and education that centers provide, as well as outcomes for staff, children, and families. The ExCELS descriptive study will take place in spring 2022, and we plan to recruit 120 center-based child care settings. These will include centers that have at least one primary site leader (e.g., center director) in the building, receive funding from Head Start or the Child Care and Development Fund, and serve children whose ages range from birth to age 5 (but who are not yet in kindergarten). Data collection will include interviews with each center's primary site leader and surveys of select center managers and all teaching staff.


  • Type of Request: Non-Substantive Change to Tokens of Appreciation Structure and Descriptive Study Materials


  • Study Features Salient to Request:


To support a successful data collection with high response rates, ExCELS was approved to offer a $50 gift card to respondents to a 60-minute teaching staff survey as part of an experiment testing two procedures for administering the token of appreciation: a remote structure with a $10 pre-pay gift card and a $40 post-pay gift card, or an on-site visit offering a $50 gift card upon survey completion. The experiment was designed to determine which approach was more effective and cost efficient at obtaining high response rates. Results of this experiment were to be shared with OMB and combined with the results of two experiments conducted as part of the Assessing the Implementation and Cost of High Quality Early Care and Education project (ECE-ICHQ; OMB: 0970-0499) to contribute to a body of evidence about the effectiveness and efficiency of different structures and delivery approaches for tokens of appreciation in supporting response from staff in ECE settings.


Due to the ongoing COVID-19 pandemic, we seek to remove site visits from the data collection effort (conducting them only as needed to improve response rates) and, as a result, to remove the experiment.


  • Progress to Date: The study team is in final preparations to launch recruitment (in mid-February 2022) and data collection activities.


  • Previous Terms of Clearance:

The following tokens of appreciation for center staff for ExCELS were approved by OMB under Control Number 0970-0582 (12/06/2021):

  • Center managers: $25 gift card upon survey completion

  • Teaching staff in on-site visit group: $50 gift card total upon survey completion

  • Teaching staff in pre-post gift card remote group: $50 gift card total; a $10 gift card with invitation materials and a $40 electronic gift card upon survey completion


  • Time Sensitivity: Initial recruitment mailings (already approved; no changes needed) are scheduled for February 7, 2022. Targeted outreach to select centers is scheduled to begin the week of February 14, 2022 (requires approval of nonsubstantive changes).


Request Overview

We are requesting approval to revise our data collection approach for the teaching staff survey and to eliminate the experiment related to administration procedures for tokens of appreciation. Due to the ongoing COVID-19 pandemic and its continued impact on child care centers, with particular pressures on staffing, we want to remove reliance on site visits for this data collection effort. It is difficult to predict whether all centers assigned to the site-visit experimental group would allow the study team on site. Even if their physical presence is allowed, we are concerned about the extent of interaction possible between field staff and teaching staff, given the staffing pressures many centers are experiencing; this interaction is essential to the value and efficacy of field visits in introducing the surveys and supporting response rates. Given the extent of unknowns with these two factors, we are concerned about the viability and value of relying on field visits and about the quality of an experiment conducted under these less-than-ideal conditions.


In light of these circumstances, this survey-focused data collection effort can pivot to a fully remote approach, and we are requesting approval to revise the data collection approach and remove the experiment. We considered options to adjust rather than remove the experiment. Experiments hold value when they test an enhancement beyond what the study team would typically offer and when the sample size supports detecting differences in response rates between experimental groups. The other enhancements we considered to replicate the personal contact of an on-site visit have limitations to their implementation. In addition, we suspect that any enhancements, or tests of the relative amounts of pre- and post-pay tokens of appreciation, would produce only small (3 to 5 percentage point) differences in response rates between the two experiment groups. As presented in the original Supporting Statement B, we have the power to detect differences only once they reach the 7 to 9 percentage point range.
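For context, a minimum detectable difference of this size can be illustrated with a standard two-proportion power calculation. The sketch below is illustrative only; the per-group sample size and baseline response rate are assumptions for this example, not the values from Supporting Statement B.

# Illustrative two-proportion minimum detectable difference (MDD) calculation.
# The per-group sample size and baseline response rate below are assumed values
# for this example; they are not taken from the ExCELS Supporting Statement B.
from scipy.stats import norm

def min_detectable_difference(n_per_group, p_baseline, alpha=0.05, power=0.80):
    # Approximate MDD for a two-sided test of two independent proportions.
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    se = (2 * p_baseline * (1 - p_baseline) / n_per_group) ** 0.5
    return (z_alpha + z_beta) * se

# With roughly 400 teaching staff per group and an assumed 80 percent baseline
# response rate, the MDD is about 0.08, or 8 percentage points.
print(round(min_detectable_difference(400, 0.80), 3))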


Given the removal of the experiment, we request a change in the tokens of appreciation to lower the overall amount provided to teaching staff from $50 to $40 while maintaining a pre- and post-pay structure. The token of appreciation for center managers would remain unchanged; these staff were not planned to be part of the experiment. The proposed tokens of appreciation are as follows:

  • No Change - Center managers: $25 gift card upon survey completion

  • Update - Teaching staff: $40 gift card total; a $5 gift card with invitation materials and a $35 electronic gift card upon survey completion

Mitigation to Date


We are requesting these changes to mitigate issues we might face during recruitment and data collection as a result of our original plan to include on-site visits. Prior to recruitment, centers were to be randomly assigned to the two experiment groups. If we proceed with the current experiment, it is possible that during recruitment, centers assigned to the site-visit group might decline to participate in the study because of the site visit requirement, or might be unable to host site visits altogether given local conditions and policies related to COVID-19. At that point, the study team would have to choose whether to accept the center into the study or pass on the center. Either approach would negatively impact the experiment by decreasing the number of centers in the site-visit group and/or introducing selection bias in the final classification of centers into the two experiment groups.


Plans for Future Mitigation

  • The study team will use a small pre-pay amount ($5 gift card) with all teaching staff respondents. Pre-paid tokens of appreciation have been shown to be more effective than post-paid tokens of appreciation across a variety of studies.1,2

  • The study team will conduct follow-up in the form of email and paper reminders for all survey respondents after the initial invitation packets are mailed to respondents.

  • We plan to distribute electronic gift cards to respondents immediately after they complete the survey on the web. During a site visit, a field representative would have distributed invitation packets, collected completed paper surveys, and handed out gift cards as soon as a respondent turned in a completed survey. The electronic gift card approach mimics that immediate distribution.

  • If a center is not able to achieve a 70 percent response rate on the teaching staff survey after we have exhausted respondent follow-up, we will offer the center the option of having one of our study representatives conduct a short site visit to distribute survey invitation packets to staff with outstanding surveys (see the illustrative sketch below).
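Operationally, a minimal sketch of this follow-up rule might look like the following. The data structure, field names, and threshold check are illustrative assumptions about how tracking data could be organized, not a description of the study's actual systems.

# Illustrative sketch of the per-center follow-up rule: flag centers whose teaching
# staff response rate remains below 70 percent after remote follow-up is exhausted,
# so the optional short site visit can be offered. Field names are assumed.
THRESHOLD = 0.70
center_status = [
    {"center_id": "C001", "surveys_sent": 12, "surveys_completed": 10},
    {"center_id": "C002", "surveys_sent": 15, "surveys_completed": 9},
]
for center in center_status:
    rate = center["surveys_completed"] / center["surveys_sent"]
    if rate < THRESHOLD:
        print(f"{center['center_id']}: {rate:.0%} -- offer optional short site visit")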


Proposed Intervention for OIRA Approval

We propose to remove the experiment related to tokens of appreciation and to offer all teaching staff the same total amount of $40, distributed as a physical $5 gift card with invitation materials and a $35 electronic gift card upon survey completion.


Expected Benefits and Proposed Assessment


We expect that our proposed change to a fully remote data collection effort, with tokens of appreciation as outlined in this request, can still achieve participation from at least 80 percent of the sample for the descriptive study overall and across the subgroups of interest. This level of response is essential to the analysis supporting development of the new measure of leadership in ECE settings (described in SSB, sections 2 and 7). A response rate of at least 80 percent from teaching staff will provide the statistical precision (based on power analyses presented in SSB) to detect differences between subgroups of centers, which is essential for establishing the reliability and validity of the new measure across a range of ECE settings and for testing the hypothesized associations in the ExCELS theory of change. A high response rate helps ensure the survey results are representative of staff perceptions about center leadership and reduces the likelihood of nonresponse bias. A high response among teaching staff within each center is also essential to development of this center-level measure because it supports analysis of the variation in responses across teaching staff within centers as well as across centers. Too much within-center variation in responses by teaching staff suggests that a center-level measure is not an appropriate representation of the construct being measured, which in this case is leadership.
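One common way to quantify within-center versus between-center variation is an intraclass correlation (ICC) from a one-way analysis of variance. The sketch below is illustrative only: the data, the leadership score values, and the choice of ICC(1) are assumptions for this example and do not describe the ExCELS analysis plan.

# Illustrative ICC(1) calculation: low values (high within-center variation relative
# to between-center variation) would suggest a center-level measure is not an
# appropriate representation of the construct. Data and column names are assumed.
import pandas as pd

df = pd.DataFrame({
    "center": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "leadership_score": [4.2, 4.0, 4.4, 2.9, 3.1, 3.0, 3.8, 3.6, 4.0],
})
k = df.groupby("center").size().mean()  # average number of staff per center
grand_mean = df["leadership_score"].mean()
group_means = df.groupby("center")["leadership_score"].mean()
ss_between = (df.groupby("center").size() * (group_means - grand_mean) ** 2).sum()
ms_between = ss_between / (df["center"].nunique() - 1)
ss_within = ((df["leadership_score"] - df["center"].map(group_means)) ** 2).sum()
ms_within = ss_within / (len(df) - df["center"].nunique())
icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1) = {icc1:.2f}")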


We expect that the $40 amount, combined with the strategies outlined in this request, will be sufficient to reach the 80 percent response threshold. Our experience in the ECE-ICHQ study suggests that response from ECE teaching staff is sensitive to tokens of appreciation. The recently completed ECE-ICHQ study achieved a response rate greater than 80 percent from ECE teaching staff for a 45-minute survey with a $50 token of appreciation. Although a lower amount may be somewhat less effective and we cannot know for certain the effect of a decrease from $50 to $40, a token of appreciation lower than $40 for a 60-minute survey introduces a higher risk of not reaching the 80 percent response rate, which, in turn, could have detrimental effects on the strength of the analysis. Research has shown that tokens of appreciation for respondents are effective in increasing response rates (see the meta-analysis by Singer et al. 1999).3 More recently, Goldenberg et al. (2009) found that monetary incentives (such as tokens of appreciation) increased response rates and data quality relative to no incentive, and that a higher incentive ($40 as compared to $20) led to an increased response rate.4


Nonsubstantive Changes to Materials


Site visits were referenced across various recruitment materials and the $50 total for teaching staff was described differently across our respondent materials based on the proposed experiment. In line with this request, we removed references to the site visits and revised the language describing the tokens of appreciation in the following materials and instruments:

  1. Center recruitment materials (Appendix B)

  2. Teaching staff survey respondent materials (Appendix E)

  3. Center recruitment call script (Instrument 1)

  4. Umbrella organization recruitment approval call script (Instrument 2)

  5. Engagement interview guide (Instrument 3)

  6. Staffing structure and leadership positions interview guide (Instrument 4)

  7. Teaching staff survey (Instrument 7)





Overview of Additional Requested Changes

Since receiving approval from OMB, we have begun preparing and finalizing materials for recruitment and data collection activities. As a result of these preparatory activities, we have identified some necessary nonsubstantive changes to the study's recruitment materials, respondent materials, and instruments.


  1. We added text to the center recruitment call script (Instrument 1, Section E) to ensure a center is eligible for the study before advancing to other study activities and to help the study team prepare for data collection activities with the center. Specifically:

    1. We added text to confirm a center’s main phone number, mailing address, and physical address to ensure we have accurate information to mail respondent materials.

    2. We added text to confirm that center staff can receive gift cards that we offer to respondents for their participation.

    3. We added probes to ensure that we can identify a primary site leader who works in the building before moving forward with the center's participation in other study activities. The study is intended to learn about leadership structures in centers with at least one manager located at the center's physical location; we therefore need to ensure a center we recruit meets that criterion before moving forward.

  2. We added two clarifying questions to the engagement interview (Instrument 3, Section A) to allow centers into the study if they are part of a public school system but have a primary site leader in the building who is separate from the principal or school administrator.

  3. We added a statement to our two survey instruments (Center Manager Survey, Instrument 6; Teaching Staff Survey, Instrument 7) to acknowledge that some of the activities we ask respondents about might be occurring in person or virtually due to the continuing COVID-19 pandemic.

    1. The statement reads: “Some of the questions in the survey ask about meetings, collaborations, trainings, or other types of interactions that may be occurring at your center. Please think about in-person and virtual activities when answering these questions.”

    2. We added the statement to the “How to complete the survey” section of both surveys.

  4. We added two questions to the staffing structure and leadership positions interview guide (Instrument 4) to obtain unique counts of teachers and assistant teachers who supervise teaching staff within their own classroom and across other classrooms. We also added a note to primary site leaders that we can collect a list of their teaching staff for the survey over the phone or through Box.



We have updated the supporting statements to reflect the change to the token of appreciation and all other requested changes described in this memorandum.





1 Singer, E., and C. Ye. "The Use and Effects of Incentives in Surveys." The ANNALS of the American Academy of Political and Social Science, vol. 645, no. 1, January 2013, pp. 112–141.

2 Mercer, A., A. Caporaso, D. Cantor, and R. Townsend. "How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys." Public Opinion Quarterly, vol. 79, no. 1, Spring 2015, pp. 105–129.

3 Singer, E., N. Gebler, T. Raghunathan, J. Van Hoewyk, and K. McGonagle. "The Effect of Incentives in Interviewer-Mediated Surveys." Journal of Official Statistics, vol. 15, no. 2, 1999, pp. 217–230.

4 Goldenberg, K. L., D. McGrath, and L. Tan. "The Effects of Incentives on the Consumer Expenditure Interview Survey." In Joint Statistical Meetings Proceedings, pp. 5985–5999. Alexandria, VA: American Statistical Association, 2009.
