Memorandum
United States Department of Education
Institute of Education Sciences
National Center for Education Statistics
DATE: April 10, 2017
TO: Robert Sivinski and E. Ann Carson, OMB
THROUGH: Kashka Kubzdela, OMB Liaison, NCES
FROM: Gail Mulligan and Carolyn Fidelman, NCES
Below please find NCES responses to OMB passback received on April 7, 2017.
1. What’s the purpose of including incentive recommendations for the main study?
NCES Response: It has always been the plan (per the approved MGLS:2017 Operational Field Test (OFT) and Recruitment for Main Study Base-year packages, OMB# 1850-0911 v.11-12) to decide on the incentives by this point. MGLS has already begun recruitment for the main study, and MGLS staff need to be able to communicate to schools what the incentives are.
2. We don’t see an update to the costs section of Part A to reflect that NCES is suggesting the $400 incentive.
NCES Response: When the original package describes experiments and states that we will come back to OMB with the experiment results to decide what to use moving forward, we usually do not revise the original Part A and Part B with the experiment results in the subsequent change request; we provide them in the change memo. This is how, for example, we provided the results of the various calibration experiments in HSLS and are now doing in BPS. We do this to preserve the full description of the planned experiments without having to rewrite most of Part A and Part B. Because those documents already contain the experiment description and state that experiment results will be provided at a later date, any reader will know that the follow-up change request materials provide the experiment results and the decisions on which approaches to use going forward.
When we have an unanticipated change or addition to the planned recruitment and/or data collection procedures and/or materials, we always make those changes in the pertinent, last approved package documents. If something is being put in place for less than the full duration of the approved activity, we reflect it in the pertinent documents, as we did in MGLS.
3. I am pretty concerned about the overall response rates. The OFT has been open for ~2 months - is NCES still getting responses? Does NCES have a minimum response rate below which it cannot disseminate the results? I worry that this is the first of 3 waves of data collection, and if $400 garners poor response rates, what will you need to spend in years 2 and 3 to get them to continue to participate?
I’m not convinced that the case has been made for the increased incentive. I understand that the OFT is more similar to the main study than the IVFT is, and so I appreciate having the results broken out. I also understand that the conditions under which the OFT was conducted (a shortened period in which to collect data, not a full follow-up effort) would not yield response rates as high as the main study’s. However, the comparison yielding the strongest support for increasing the incentive from $200 to $400 only “approaches significance,” and is based on a very small number of schools.
After discussing it within OMB, I’d like to suggest that NCES extend the experiment to the main study to allow for (1) a larger sample size; (2) the full fielding period; and (3) the full nonresponse follow-up. If you randomly assign the incentive amounts (I’ll leave it up to NCES whether you use straight monetary incentives, add the non-monetary $400 incentive to the experiment, or allow schools to choose between monetary and non-monetary options), this should allow you to come back with an answer that can be used to support not only the 2nd wave of MGLS, but also other secondary studies.
Does NCES have a standard for a minimum response level below which the collected data would not be considered valid?
NCES Response: [Kashka Kubzdela, Gail Mulligan (Longitudinal Surveys Branch Chief & ECLS-K:2011 Project Officer), and Carolyn Fidelman (MGLS Project Officer) spoke with Ann Carson of OMB at 11:30am on 4/10/17 to discuss NCES’s justification for offering a $400 school incentive to obtain acceptable response rates in the recruitment of schools to the base-year of the MGLS:2017. Below is a summary of that conversation.]
MGLS:2017 main study base-year recruitment at the state and district level began in February 2017, and school-level recruitment is to begin as soon as possible, upon OMB’s approval of this change request. The base-year recruitment plays a unique role in this longitudinal study in that it defines the sample for all future follow-ups. The schools selected for the initial sample will not be recruited for later rounds if they choose not to participate in the base-year, and students can only be sampled from schools that agree to participate in the base-year of the study. Thus, this year’s recruitment (April 2017-May 2018) is critical for identifying the school and student sample for the entire study. Failing to get a selected school to participate in the base-year immediately deprives the study of an average of 30 sampled students. In addition, schools will only be added in later rounds of the study if students who participate in the base-year transfer to them. Nonresponse among schools to which participating students transfer after the base-year data collection has less impact on study results than school nonresponse in the base-year, because nonresponse of transfer schools typically results in missing teacher and school information for just one or two sampled students. While a further experiment in the base-year would provide useful information for other NCES studies, it would not inform the incentive level for later rounds of MGLS:2017 data collection, and it would put the MGLS:2017 at serious risk of failure should a sufficient number of schools not agree to participate in the base-year.
It has been NCES’s intention in the two field test experiments to gather as much evidence as possible about the correct approach to take but, as was noted, the sample size was not sufficient to adequately test the three different conditions. NCES sample survey field tests typically have very small numbers of sampled schools, as funds typically do not permit samples of schools large enough to conduct fully powered experiments. The higher school response rates seen for the $400 conditions in the OFT are closer to what we’ve seen at the school level in other longitudinal studies, such as the ECLS-K:2011. The first field test, the IVFT, with its convenience sample, may in fact have suppressed results compared to the second field test, the OFT, with its random sample (see Groves, R. M., Singer, E., & Corning, A. D. (2000). Leverage-saliency theory of survey participation. Public Opinion Quarterly, 64(3), 299-309).
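[For illustration only, not part of the official response: the sketch below shows why school samples of field-test size rarely support a fully powered incentive comparison. The participation rates and per-condition sample size used here are hypothetical assumptions, not MGLS:2017 field test figures.]

```python
# Illustrative power calculation (hypothetical numbers, not MGLS:2017 data):
# with roughly 40 schools per incentive condition, even a sizable difference in
# participation rates (e.g., 60% vs. 75%) is unlikely to reach statistical significance.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_200, p_400 = 0.60, 0.75   # assumed participation rates under the two incentive conditions
n_per_arm = 40              # assumed number of sampled schools per condition

effect = proportion_effectsize(p_400, p_200)   # Cohen's h for comparing two proportions
power = NormalIndPower().solve_power(effect_size=effect,
                                     nobs1=n_per_arm,
                                     alpha=0.05,
                                     ratio=1.0,
                                     alternative='two-sided')
print(f"Approximate power with {n_per_arm} schools per arm: {power:.2f}")
# Under these assumed values the power comes out well below the conventional 0.80
# target, illustrating why field-test school samples cannot support fully powered
# incentive experiments.
```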
With regard to NCES standards for a minimum response level: while NCES standards identify a nonresponse rate below which bias analyses must be conducted, there is no minimum response rate identified in the standards. Whether a response rate is too low is decided on a case-by-case basis, and some NCES studies have been stopped due to low response rates. MGLS would be no different should it fail to achieve a school- or student-level response rate that the NCES Chief Statistician considers acceptable.
NCES anticipates higher school participation rates in the MGLS:2017 main study base-year because all schools will be offered the $400 or equivalent incentive, each school will be given a choice between receiving a monetary incentive and a non-monetary equivalent, the data collection window will be longer, the nonresponse conversion strategy will be much more intensive and elaborate, and, as described in the attached revised Part B and in the Change Memo, a number of additional efforts are being put into place to boost participation rates, such as the use of informational webinars.
NCES will look for opportunities for further recruitment experiments in future longitudinal surveys to continue to obtain evidence about the effectiveness of different incentives, including offering $400 vs. $200. The first opportunity may be the new high school longitudinal study – HS&B:2020. However, the school sample size is likely to continue to limit the statistical power in field test experiments.