
Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Recruitment for 2016 Item Validation Field Test

MGLS IV Field Test 2016 Recruitment Response to OMB Passback

OMB: 1850-0911



MEMORANDUM OMB# 1850-0911 v.3


DATE: June 30, 2015, revised July 18, 2015


TO: Shelly Martinez

Office of Information and Regulatory Affairs, Office of Management and Budget


FROM: Carolyn Fidelman

National Center for Education Statistics

THROUGH: Kashka Kubzdela

National Center for Education Statistics

SUBJECT: Response to OMB passback on Middle Grades Longitudinal Study of 2016–17 (MGLS:2017) Recruitment for 2016 Item Validation and Operational Field Tests


Thank you for your review of the Middle Grades Longitudinal Study of 2016–17 (MGLS:2017) Recruitment for 2016 Item Validation and Operational Field Tests OMB package. As you are aware, the MGLS:2017 contract has not yet been awarded. NCES has made the decision to shift the timing of the field tests and main study per the schedule below.


Data Collection                      Original Periodicity          Revised Periodicity
Item Validation Field Test           January through March 2016    January through June 2016
Operational Field Test               January through March 2016    January through March 2017
Main Study 6th Grade Collection      January through March 2017    January through March 2018
Main Study 7th Grade Follow-up       January through March 2018    January through March 2019
Main Study 8th Grade Follow-up       January through March 2019    January through March 2020

With this change in timing for the field tests and main study, we have revised the OMB package for field test recruitment to include only the Item Validation Field Test (IVFT). OMB packages for the IVFT data collection, the Operational Field Test recruitment and collection, and later all aspects of the main study will be submitted separately.

We have revised the OMB package to focus on the Item Validation Field Test, under the revised title Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Recruitment for 2016 Item Validation Field Test, and revised our response to question 4 below to reflect the time separation of the two field tests.

In response to the passback questions:

  1. Is there a maximum number of students per special ed teacher? We can envision a single teacher having responsibility for all special ed students in the school in some cases.

[Reply] A single teacher may be responsible for all special education students within the school; however, this may not occur frequently. Its frequency will likely reflect characteristics of the schools in our sample, including, but not limited to, school size and the inclusiveness of special education students in the general education classroom (i.e., mainstreaming practices). We checked whether the Common Core of Data (CCD) and/or the Schools and Staffing Survey (SASS) collect the number of special education teachers per school; however, they do not collect this information.

We were not planning to limit the number of child-level surveys a special education teacher completes during the field test, because we will use the collected data to inform main study collection procedures and to conduct several psychometric analyses. Specifically, field test data will be used to examine procedural elements of the study, such as the impact that survey-time burden has on response rates and data quality (full completion of surveys), which may in turn suggest a need for more strategic and innovative data collection procedures in the main study. Moreover, data collected during the field test will potentially be used to examine item-level functioning of teacher reports across different disability groups, to determine whether scales function similarly for all groups of children in the sample. Limiting the number of child-level surveys per special education teacher might also limit these analyses.
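One common way to carry out such a check is a logistic-regression test for differential item functioning (DIF). The sketch below is illustrative only, not the approved MGLS:2017 analysis plan; the file and variable names (teacher_reports.csv, item_1, scale_score, disability_group) are hypothetical placeholders.

```python
# Illustrative logistic-regression DIF check: does a teacher-report item
# behave the same across disability groups after conditioning on the
# overall scale score? All names below are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

items = pd.read_csv("teacher_reports.csv")  # hypothetical input file
# item_1: 0/1 response to one survey item
# scale_score: total score on the scale containing item_1
# disability_group: child's disability classification

# Uniform DIF is indicated by the group main effect; nonuniform DIF by
# the group-by-score interaction.
dif = smf.logit("item_1 ~ scale_score * C(disability_group)",
                data=items).fit()
print(dif.summary())
```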

  2. What are the hypothesized response rate differences between $0, $20, and $40 in the parent experiment? What were they in HSLS, and is that the basis here?

[Reply] In response to lower-than-desired parent response rates in the base year (9th grade collection) of the High School Longitudinal Study of 2009 (HSLS:09), an incentive experiment was implemented for the final 3 weeks of data collection. Sample members were assigned an incentive treatment of $0, $10, or $20, with parents from the same school receiving the same incentive treatment. Approximately 43 percent of parents offered $20 completed an interview, compared with approximately 38 percent of those offered $10 and 39 percent of those offered no incentive, a difference of approximately 4 percentage points between the $20 and $0 groups.

For the MGLS:2017 Field Test, we are hypothesizing a 4 percentage point difference between the $0 group and the $20 group, and exploring whether there is a greater difference between the $0 group and the $40 group.
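For context, the sample size needed to detect a difference of this magnitude can be approximated with a standard two-proportion power calculation. The sketch below is illustrative only, using the rates reported above (39 percent vs. 43 percent); it does not reflect the actual MGLS:2017 parent sample allocation.

```python
# Illustrative power calculation for the hypothesized 4-percentage-point
# difference in parent response rates (39% at $0 vs. 43% at $20).
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p0, p20 = 0.39, 0.43                     # hypothesized response rates
effect = proportion_effectsize(p20, p0)  # Cohen's h for two proportions

# Sample size per group needed to detect the difference with 80% power
# at a two-sided alpha of 0.05.
n_per_group = NormalIndPower().solve_power(effect_size=effect,
                                           power=0.80, alpha=0.05,
                                           alternative='two-sided')
print(f"Cohen's h = {effect:.3f}; n per group ~ {n_per_group:.0f}")
```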

  3. For the IVFT, since it uses a convenience sample of only 50 schools, how will NCES establish equivalence at baseline to permit valid conclusions about the incentive experiment?

[Reply] Schools within the sample will be randomly assigned to experimental groups. Random assignment should minimize any differences in group characteristics across conditions. Additionally, to support the generalizability of our findings, we will examine the characteristics of schools and parent respondents to document the diversity of the convenience sample. Generalizability should be supported because, even though the IVFT uses a convenience sample, the sample is required to be diverse in order to support the psychometric analyses.
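For example, baseline equivalence across the randomly assigned conditions could be examined with simple balance tests on observed school characteristics. The sketch below is illustrative only; the file and variable names (ivft_schools.csv, enrollment, locale, incentive_condition) are hypothetical placeholders.

```python
# Illustrative baseline-equivalence checks: compare school characteristics
# across randomly assigned incentive conditions. All names below are
# hypothetical placeholders.
import pandas as pd
from scipy import stats

schools = pd.read_csv("ivft_schools.csv")  # hypothetical input file

# Continuous characteristic (e.g., enrollment): one-way ANOVA across conditions.
groups = [g["enrollment"].dropna()
          for _, g in schools.groupby("incentive_condition")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"Enrollment across conditions: F = {f_stat:.2f}, p = {p_anova:.3f}")

# Categorical characteristic (e.g., school locale): chi-square test.
table = pd.crosstab(schools["incentive_condition"], schools["locale"])
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)
print(f"Locale across conditions: chi2 = {chi2:.2f}, p = {p_chi2:.3f}")
```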

  4. Could you provide more detail on how NCES would pool results across both tests for the school incentive experiment?

[Reply] With the change in timing of the field tests, the school incentive experiment will first be conducted on the approximately 92 schools involved in recruitment for the IVFT (to achieve a yield of 51 schools for the IVFT, we project contacting 92 schools per field test). See the response to #6 below for more detail on pooling across the two field tests.

  5. What is the hypothesized difference in response rates between $0, $200, and $400? What is the basis?

[Reply] The experiment proposed in the MGLS:2017 field test compares $200 versus $400; there is no $0 condition. Each school will be randomly assigned to one of three experimental conditions. In Condition 1, the baseline condition, we will offer schools a $200 incentive for their participation. $200 is consistent with the amount that NCES offers for participation in other studies, such as the Early Childhood Longitudinal Study, Kindergarten Class (ECLS-K and ECLS-K:2011), the Trends in International Mathematics and Science Study (TIMSS), and the Program for International Student Assessment (PISA). However, based on previous difficulties in recruiting schools for the originally approved MGLS field test and the general decline in school participation in NCES longitudinal studies over the years, we propose to also test offering one third of the sample schools $400 (Condition 2), and one third of schools a choice of one of seven non-monetary incentives equivalent to $400 (Condition 3).

As this is the first study of middle grades students of this scope and size, it is difficult to hypothesize the difference in response rates between the conditions. $200 appears to be the precedent set by other studies; however, other studies have also experienced a decline in school-level response rates over time. For example, in 1998–99 the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent, whereas 12 years later the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent. Additionally, there is evidence that response rates may be lower for schools that serve older students: the High School Longitudinal Study of 2009 had a weighted school-level response rate of 56 percent. Therefore, finding effective strategies for gaining the cooperation of schools is of paramount importance, and exploring an increased amount and a changed nature of the incentive is one such strategy.

In addition to that strategy, during the recruitment effort the MGLS:2017 recruiters will thoroughly document information gained during recruitment contacts, including audio recordings of recruitment phone calls (with the respondent's permission). MGLS:2017 will therefore have detailed paradata on the course and content of these conversations, including what information seemed most convincing and why schools pushed back and/or refused. MGLS:2017 will use this information to qualitatively evaluate motivators and disincentives, including how much difference the incentives seemed to make.

Additionally, to further strengthen recruitment strategies for the national study, MGLS:2017 plans to conduct a focus group with school administrators in the spring of 2016, in conjunction with the IVFT and the OFT, building on a similar effort in preparation for PISA 2015 (while recognizing that school administrators may view participation in a longitudinal study differently than participation in a cross-sectional study). The focus group is intended to better understand both the barriers and the benefits schools associate with participation in MGLS:2017 and to identify communication strategies that will help overcome barriers to participation (to be submitted under OMB# 1850-0803). The results of all these efforts will guide recruitment strategies and materials for the MGLS:2017 main study collection by helping us establish what factors help schools say yes, what materials schools find valuable in their decision making, and what impact nonmonetary and/or monetary incentives have on schools' decisions about whether to participate.

Second passback from OMB (July 16, 2015):

  6. Given that the power is now halved for the school incentive experiment, will the two conditions under $500 be worth testing? HSLS did an experiment, not mentioned in the MGLS materials, of $0 and $500 and found no difference. It was probably underpowered, but that led them to not offer monetary incentives to schools for the full-scale study.

[Reply] In our original reply to your questions on the Item Validation Field Test (IVFT) OMB package, we indicated that we would field the experiment in the IVFT and then apply what we learned in the Operational Field Test (OFT). Considering the sample power, we now propose to conduct the experiment in both field tests, given that both follow the same recruitment approach. When we planned to conduct the IVFT and OFT at the same time, we planned to conduct the same experiment in both field tests and to pool data across both studies for analysis. Combining data across both field tests increases the analytic sample size from 92 to 184 (92 schools will be recruited for the IVFT and 92 for the OFT, with an expected yield of 50–51 schools participating in each data collection). In the analysis, we will control for field test membership by including a variable indicating the field test to which each school belonged, along with its interaction with the incentive condition. We added this fact to the description of the experiment in Part A, section A.9.
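As an illustration of the pooled analysis described above, school participation could be modeled with a logistic regression that includes the incentive condition, a field test indicator, and their interaction. The sketch below is illustrative only; the file and variable names (pooled_recruitment.csv, participated, condition, field_test) are hypothetical placeholders.

```python
# Illustrative pooled analysis of the school incentive experiment across
# the two field tests. All names below are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("pooled_recruitment.csv")  # hypothetical input file
# participated: 1 if the school agreed to participate, 0 otherwise
# condition: incentive condition ($200, $400, or $400-equivalent goods)
# field_test: 'IVFT' or 'OFT'

# Logistic regression of participation on incentive condition and field
# test, with a condition-by-field-test interaction as described above.
model = smf.logit("participated ~ C(condition) * C(field_test)",
                  data=schools).fit()
print(model.summary())
```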

With regard to the HSLS:09 experiment: within the base-year field test, HSLS:09 attempted to recruit 90 schools, of which 41 participated in the data collection. The HSLS:09 field test experiment compared the effect of a $500 check written to the school against a no-incentive condition. RTI (the data collection contractor) determined that the $500 check did not have an impact on schools' decisions. Anecdotally, based on several refusal-conversion contacts, some schools reported that the incentive was simply too small, while others reported that no amount of money would suffice.

The MGLS:2017 will include both elementary schools (6th grade is an elementary school grade in some districts) and middle grades schools. It is not yet clear whether the schools selected for the MGLS:2017 field tests will behave like the high schools in HSLS:09, and MGLS:2017 is also testing an additional type of incentive. The control incentive amount for the MGLS:2017 field tests is based on the ECLS-K:2011 main study amount of $200. MGLS:2017 will test whether an increase from a $200 to a $400 incentive influences elementary and middle school decision making with regard to participation, and will also test whether offering $400 in monetary form versus the same amount in "goods and services" has a different impact. Anecdotally, some schools find the paperwork involved in using incentive money cumbersome; therefore, MGLS:2017 proposes to explore the influence of an increased monetary incentive versus equal-value goods and services over the baseline condition of $200.

