Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Recruitment for 2016 Item Validation Field Test


MEMORANDUM OMB# 1850-0911 v.3


DATE: June 30, 2015


TO: Shelly Martinez

Office of Information and Regulatory Affairs, Office of Management and Budget


FROM: Carolyn Fidelman

National Center for Education Statistics

THROUGH: Kashka Kubzdela

National Center for Education Statistics

SUBJECT: Response to OMB passback on Middle Grades Longitudinal Study of 2016–17 (MGLS:2017) Recruitment for 2016 Item Validation and Operational Field Tests


Thank you for your review of the Middle Grades Longitudinal Study of 2016–17 (MGLS:2017) Recruitment for 2016 Item Validation and Operational Field Tests OMB package. As you are aware, the MGLS:2017 contract has not yet been awarded. NCES has made the decision to shift the timing of the field tests and main study per the schedule below.



Data Collection                   Original Periodicity         Revised Periodicity
Item Validation Field Test        January through March 2016   January through June 2016
Operational Field Test            January through March 2016   January through March 2017
Main Study 6th Grade Collection   January through March 2017   January through March 2018
Main Study 7th Grade Follow-up    January through March 2018   January through March 2019
Main Study 8th Grade Follow-up    January through March 2019   January through March 2020


With this change in timing for the field tests and main study, we have revised the OMB package for field test recruitment to include only the Item Validation Field Test (IVFT). OMB packages for the IVFT data collection, the Operational Field Test recruitment and collection, and later all aspects of the main study will be submitted separately.


We have revised the OMB package to focus on the Item Validation Field Test, under the revised title Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Recruitment for 2016 Item Validation Field Test, and have revised our response to question 4 below to reflect the time separation of the two field tests.


In response to the passback questions:


  1. Is there a maximum number of students per special ed teacher? We can envision a single teacher having responsibility for all special ed students in the school in some cases.


[Reply] A single teacher may be responsible for all special education students within a school; however, this may not occur frequently. Its frequency will likely reflect characteristics of the schools in our sample, including, but not limited to, school size and the inclusion of special education students in the general education classroom (i.e., mainstreaming practices). We checked whether the Common Core of Data (CCD) and/or the Schools and Staffing Survey (SASS) collect the number of special education teachers per school; they do not collect this information.


We were not planning to limit the number of child-level surveys a special education teacher completes during the field test, because we will use the collected data to inform main study collection procedures and to conduct several psychometric analyses. Specifically, field test data will be used to examine procedural elements of the study, such as the impact of survey-time burden on response rates and data quality (full completion of surveys), which may in turn suggest a need for more strategic and innovative data collection procedures in the main study. Moreover, data collected during the field test may be used to examine item-level functioning of teacher reports across different disability groups, to determine whether scales are functioning similarly for all groups of children in the sample. Limiting the number of child-level surveys per special education teacher might also limit these analyses.
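
To illustrate the kind of item-level analysis described above, the sketch below shows a standard logistic-regression check for differential item functioning (DIF). All data, variable names, and model choices are illustrative assumptions, not the study's specified analysis plan:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Simulate teacher-report item responses for two hypothetical disability
    # groups. All names and values here are illustrative, not study data.
    rng = np.random.default_rng(0)
    n = 500
    group = rng.integers(0, 2, n)                   # 0/1 = two disability groups
    ability = rng.normal(0, 1, n)                   # latent trait
    total_score = ability + rng.normal(0, 0.5, n)   # observed scale total

    # One binary item; a nonzero group effect in the data-generating model
    # would constitute uniform DIF. Here it is set to zero.
    p_item = 1 / (1 + np.exp(-(1.2 * ability + 0.0 * group)))
    item = rng.binomial(1, p_item)

    # Logistic-regression DIF check: regress the item on the total score and
    # group membership; a significant group coefficient flags uniform DIF.
    X = sm.add_constant(pd.DataFrame({"total_score": total_score, "group": group}))
    result = sm.Logit(item, X).fit(disp=0)
    print(result.summary())

In practice, a DIF analysis of this kind requires an adequate number of completed teacher reports within each disability group, which is one reason for not capping the number of surveys per teacher.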


  2. What are the hypothesized response rate differences between 0, 20, and 40 dollars in the parent experiment? What were they in the HSLS, and is that the basis here?


[Reply] In response to lower-than-desired parent response rates in the base year (9th grade collection) of the HSLS:09, an incentive experiment was implemented for the final 3 weeks of data collection. Sample members were assigned an incentive treatment of $0, $10, or $20, with parents from the same school receiving the same incentive treatment. Approximately 43 percent of parents who were offered $20 completed an interview, as did approximately 38 percent of those offered $10 and 39 percent of parents offered nothing: a difference of roughly 4 percentage points between the $20 group and the $0 group.


For the MGLS:2017 Field Test, we hypothesize a 4 percentage point difference between the $0 group and the $20 group, and will explore whether there is a greater difference between the $0 group and the $40 group.
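
For context, a back-of-the-envelope sample-size calculation using the standard two-proportion normal approximation suggests what detecting a difference of this size would require. The 39 and 43 percent rates come from the HSLS:09 experiment above; the alpha and power values are illustrative assumptions:

    from math import ceil, sqrt
    from statistics import NormalDist

    def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
        """Sample size per group to detect p1 vs. p2 with a two-sided z-test."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)
        z_b = NormalDist().inv_cdf(power)
        p_bar = (p1 + p2) / 2
        numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                     + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
        return ceil(numerator / (p1 - p2) ** 2)

    # 39% ($0 group) vs. 43% ($20 group), per the HSLS:09 results above.
    print(n_per_group(0.39, 0.43))  # -> 2373 parents per group

Under these illustrative assumptions, detecting a 4 percentage point difference with 80 percent power would require on the order of 2,400 completed parent cases per incentive condition.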


  3. For the IVFT, since it uses a convenience sample of only 50 schools, how will NCES establish equivalence at baseline to permit valid conclusions about the incentive experiment?


[Reply] Schools within the sample will be randomly assigned to experimental groups. Random assignment should minimize differences in group characteristics across conditions. Additionally, to support the generalizability of our findings, we will examine the characteristics of schools and parent respondents to document the diversity of the convenience sample. Generalizability should be supported because, although the IVFT uses a convenience sample, the sample is required to be diverse in order to support the psychometric analyses.
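
As a simple illustration of random assignment followed by a baseline balance check, the sketch below randomly assigns a hypothetical set of 51 schools to three conditions and compares mean enrollment across groups. The school data, condition labels, and chosen characteristic are all assumptions for illustration:

    import random
    import statistics

    random.seed(42)  # reproducible illustration

    # 51 hypothetical schools with a made-up baseline characteristic.
    schools = [{"id": i, "enrollment": random.randint(200, 1200)} for i in range(51)]

    # Randomly assign schools to three experimental conditions.
    random.shuffle(schools)
    conditions = ["condition_1", "condition_2", "condition_3"]
    for idx, school in enumerate(schools):
        school["condition"] = conditions[idx % len(conditions)]

    # Baseline balance check: similar means across conditions suggest the
    # randomization produced comparable groups.
    for cond in conditions:
        values = [s["enrollment"] for s in schools if s["condition"] == cond]
        print(cond, len(values), round(statistics.mean(values)))

Shuffling and then assigning round-robin keeps the three groups nearly equal in size while preserving randomness in which schools land in each condition.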


  4. Could you provide more detail on how NCES would pool results across both tests for the school incentive experiment?


[Reply] With the change in timing of the field tests, the school incentive experiment will focus on the approximately 92 schools involved in recruitment for the IVFT (to reach a yield of 51 participating schools for the IVFT, we project contacting 92 schools).


  5. What is the hypothesized difference in response rates between 0, 200, 400? Basis?


[Reply] The experiment proposed in the MGLS:2017 field test compares $200 and $400, as well as a non-monetary alternative. Each school will be randomly assigned to one of three experimental conditions. In Condition 1, the baseline condition, we will offer schools a $200 incentive for their participation; $200 is consistent with the amount NCES offers for participation in other studies, such as the ECLS-K, ECLS-K:2011, TIMSS, and the Program for International Student Assessment (PISA). However, based on previous difficulties in recruiting schools for the originally approved MGLS field test and the general decline in school participation in NCES longitudinal studies over the years, we propose also to offer one third of the sample schools $400 (Condition 2) and one third of schools a choice of one of seven non-monetary incentives equivalent in value to $400 (Condition 3).


Because this is the first study of middle grades students of this scope and size, it is difficult to hypothesize the difference in response rates between the conditions. $200 appears to be the precedent set by other studies; however, other studies have also experienced a decline in school-level response rates over time. For example, in 1998–99 the Early Childhood Longitudinal Study (ECLS-K) had a weighted school-level response rate of 74 percent, whereas 12 years later the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent. Additionally, there is evidence that response rates may be lower for schools that serve older students: the High School Longitudinal Study of 2009 had a weighted school-level response rate of 56 percent. Therefore, finding effective strategies for gaining the cooperation of schools is of paramount importance. Increasing the amount and changing the nature of the incentive is one such strategy.


In addition to that strategy, the MGLS:2017 recruiters will thoroughly document information gained during recruitment contacts, including audio recordings of recruitment phone calls (with the respondent's permission). The MGLS:2017 will therefore have detailed paradata on the course and content of these conversations, including which information seemed most convincing and why schools pushed back and/or refused. The MGLS:2017 will use this information to qualitatively evaluate motivators and disincentives, including how much difference the incentives appeared to make.


Additionally, to further strengthen recruitment strategies for the national study, in the spring of 2016, in conjunction with the IVFT and the Operational Field Test (OFT), the MGLS:2017 plans to conduct a focus group with school administrators (to be submitted under OMB# 1850-0803). Building on a similar effort in preparation for PISA 2015, and recognizing that school administrators may view participation in a longitudinal study differently than participation in a cross-sectional one, the focus group will aim to better understand both the barriers and benefits schools associate with participation in MGLS:2017 and to identify communication strategies that will help overcome barriers to participation. The results of all these efforts will guide recruitment strategies and materials for the MGLS:2017 main study collection by helping us establish what factors help schools say yes, what materials schools find valuable in their decision making, and what impact non-monetary and/or monetary incentives have on schools' decisions about whether to participate.

