Update to Burden and Study Materials

Beginning Teacher Longitudinal Study (BTLS) 2009-2012

OMB: 1850-0868


December 29, 2010

MEMORANDUM



To: Shelly Martinez and Brian Harris-Kojetin, OMB

From: Freddie Cross and Kathy Chandler, NCES

Through: Kashka Kubzdela, NCES

Re: Response to 12/16, 12/18, and 12/23 OMB Passback on BTLS 2010 & 2011 Change Request & Terms Compliance Fulfillment (OMB# 1850-0868 v.2)



Responses to OMB passback questions on the BTLS 2010 & 2011 Change Request & Terms Compliance Fulfillment clearance package:


A) It doesn't look like the informed consent memo says anything about how consent regarding the longitudinal nature of studies (i.e., multiple repeated follow-ups) is handled. Rather, it talks about how other longitudinal studies handle things like confidentiality and sponsorship. It seems to miss the point of our last term of clearance. In addition, I'd prefer to see a little editing of the Duncan and Jabine reference; for an agency to say something is required may imply a legal requirement, when I think the citation suggests something more like a good practice.


NCES:  The Informed Consent Memo was revised and provided to OMB on 12/17 (see BTLS Informed Consent Memo 2010-12-17.doc). The full report from the BTLS cognitive interviews conducted in 2010 is also attached (see BTLSCogLabRept 091010.pdf).


B) On incentives, NCES provides an analysis of the simple response rate increase. Is there a fuller report on this experiment from which this is drawn? Other aspects seem quite relevant, such as whether the incentive improves rates for key subgroups of interest and how the increased incentive affects the per-case cost (i.e., the cost expended versus the cost savings).


NCES:  Although the sample size in the experiment was not large enough to allow for analysis of subgroups, the Incentive Experiment summary was revised to reflect the projected cost expended versus cost savings if a $20 incentive, as opposed to $10, were offered to all respondents in Wave 4 (see BTLS 2009 Incentive Experiment Results.doc).
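For reference, the cost comparison described above can be framed as weighing the added incentive expense across all sampled cases against the nonresponse follow-up costs avoided for each additional early complete. The Python sketch below is purely illustrative; the sample size, response rates, and follow-up cost are hypothetical placeholders, not figures from the BTLS experiment (those are in the attached summary).

```python
# Hypothetical illustration of the cost-expended vs. cost-savings comparison.
# None of these figures come from the BTLS incentive experiment; they are placeholders.

sample_size = 1000             # respondents offered the incentive (hypothetical)
rate_10, rate_20 = 0.80, 0.83  # response rates at $10 vs. $20 (hypothetical)
followup_cost_per_case = 40.0  # follow-up cost avoided per additional complete (hypothetical)

extra_incentive_cost = sample_size * (20 - 10)               # extra $10 paid to every sampled case
extra_completes = sample_size * (rate_20 - rate_10)          # additional responses gained
followup_savings = extra_completes * followup_cost_per_case  # follow-up cost avoided

net_cost = extra_incentive_cost - followup_savings
cost_per_extra_complete = (extra_incentive_cost / extra_completes
                           if extra_completes else float("inf"))

print(f"Net added cost: ${net_cost:,.0f}; "
      f"cost per additional complete: ${cost_per_extra_complete:,.0f}")
```

Under these placeholder figures the added incentive expense would far exceed the follow-up savings; the actual comparison in the attached summary is what informed the decision below.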


OMB: I reviewed the incentive memo and believe that BTLS should stick with the $10 incentive, as there appears to be very little benefit to doubling the incentive to $20.


NCES: After considering the possibility of continuing to give $20 to those who received it last and $10 to everyone else, we decided that this was not a viable option because:

1) We cannot count on sustaining the $20/$10 split over the long haul, so it is better to change now, before long-term expectations are set.

2) We can better explain the change now (should anyone ask) as a reaction to the “bad economy,” along with the fact that the $20 was part of a one-time incentive experiment.

3) Since our documentation will be released this year, our cohort may learn of the differential, and that alone could have a more negative impact on response rates.

4) We have not thought through how the differential incentives might affect the Lego experiment.

5) We can add to the research literature on the impact of lowering incentives. (Of course, our findings may not hold in different economic climates, but someone has to start somewhere.)

Because giving $20 is not acceptable to OMB, we will give $10 to all BTLS respondents.


1)    There are two items, W3OCCST and OCCST, which ask similar questions. However, OCCST has an additional response option: "10 = Working in the field of K-12 education but not in a school/district". Why was this option not included for W3OCCST?


NCES:  W3OCCST is a retrospective question, asking Wave 3 non-respondents about their occupation last year, thereby filling in the gap in Wave 3 created by their non-response. Changing it would make the answers not comparable to the Wave 3 data already collected. We will change this question next year, when we will be collecting retrospective data from Wave 4 non-respondents, to match the occupation data that will be collected in Wave 4.

 

2)    Why are you getting rid of item LVASP? We realize that you are trying to streamline the majority of these items to reduce the burden of the survey, but we think it is an important question that is not covered by any other items on this list. We recommend adding this item back unless it is included elsewhere.


NCES: We were trying to streamline the items, and felt that LVASP (Indicate the level of importance EACH of the following played in your decision to leave your K-12 teaching position. Because I was dissatisfied with the support I received for preparing my students for student assessments at last year's school.) was no longer needed, given the addition of LVACC (Indicate the level of importance EACH of the following played in your decision to leave your pre-K-12 teaching position. Because I was dissatisfied with how student assessments and school accountability measures impacted my teaching or curriculum at last year's school.) However, NCES would be happy to discuss the items, should OMB have remaining concerns.


OMB: We still don’t think that the new item LVACC will capture the nuance. To us, LVACC gets at the impact that student assessments and school accountability have on teaching behavior, not the support teachers receive for preparing their students for assessments. It can certainly be interpreted that way, but we’re not sure it would be intuitive as currently written. Perhaps you could include a note within the question about what this could refer to?


NCES: We changed the item to the following:  “Because I was dissatisfied with how student assessments and school accountability measures impacted my teaching or curriculum at last year’s school, including lack of support for preparing students”

 

3)    For items EVALI and EVALF, perhaps you should consider clarifying what is meant by "informal" and "formal". It is not clear what would qualify as either.


NCES: In the note associated with EVALF, we have included the sentence “FORMAL evaluation refers to an evaluation that becomes part of the employee record.”

 

4)    For item REPER, consider revising to be clearer. For example: "Indicate the level of importance each of the following played in your decision to return to the position of a pre-K-12 teacher. Because of other personal life reasons that no longer required me to be out of teaching."


NCES:  We’ve found that the more specific an item is, the fewer people mark it; instead, they put their answer in our catch-all category (Because of other reasons not listed). If respondents don’t consider their reasons as “requiring” them to be out of teaching (maybe they just preferred it temporarily), they may list their personal reason in our other category. We have revised the wording of the examples to read: “(e.g., change in health or pregnancy/childcare status, reduced need to care for family)”.


 

5)    Why did you drop RELOA? We assume that it is because the question is too narrow, but for postsecondary policy, it is important to know whether student loan forgiveness is a successful incentive for recruiting teachers. Is there any item not on this list that asks whether a teacher entered the profession because of student loan forgiveness? If not, we recommend adding a question like that.


NCES: Of those who answered this question in Wave 3, 85.71 percent answered that this was not at all important, 5.71 percent answered somewhat important, 2.86 percent answered very important, and 5.71 percent answered extremely important. Because we were trying to streamline these lines of questions, we felt that if this was an important reason, respondents could still write it in the other category. Also, if we really want to know the answer to this question, then we need to know whether the school offered loan forgiveness as an incentive; right now we cannot separate those who marked this as not important because it was not offered by the school from those who were offered the incentive but still thought it unimportant. However, NCES would be happy to discuss the item, should OMB have remaining concerns.


OMB: This particular item gets at whether an individual returns to teaching because the district/school offered a loan forgiveness incentive. We’re fine with deleting this question; however, do we have any items on the survey getting at whether an individual entered teaching because of federal student loan forgiveness?


NCES:  We will look into collecting more information on this topic in the 2015-16 SASS, which will be designed to include a much larger cohort of new teachers.  The 2011-12 SASS, like two earlier SASS collections, has an item on the District questionnaire concerning use of loan forgiveness as a recruitment tool.  The topic is not (and has not been) addressed in the teacher questionnaire.

 

6)    Why did you delete item REDES? We don't see a new question that gets at this concept and would recommend keeping this question.


NCES: This item (Because I obtained a position in a school with more desirable characteristics.) raises the question: more desirable than what? These respondents were not teaching the previous year, so it could mean more desirable than their 2007-08 school or than their 2008-09 school (if they were teaching then). Additionally, “desirable characteristics” can mean different things to different people. If respondents are returning to teaching at a school they find desirable, this could again be captured in the “other” option. Also, we do not have a comparable item on the “movers” list. However, NCES would be happy to discuss the item further, should OMB have remaining concerns.


OMB: We understand the rationale for deleting this item for individuals returning to teaching. We’d still like to discuss how this is measured for “movers”.


NCES:  We currently ask movers if they moved to a new school “Because I wanted the opportunity to teach at my current school.” While it is not directly comparable to “a school with more desirable characteristics,” we believe the current wording is clearer and captures cases in which a respondent is drawn to a given school for any of a multitude of reasons.

 

7)    Why are you deleting MVMST? We suggest keeping this question, or a similar question, and having a separate question that addresses English Language Learners.


NCES:  Of the respondents who answered this question in Wave 3, 91.87 percent marked “not at all important,” 3.25 percent answered slightly important, 3.25 percent answered somewhat important, 1.63 percent answered very important, and 0 percent answered extremely important. If this issue is important to a respondent’s choice in moving schools, they will be able to write it in the “other” option. However, NCES would be happy to discuss the item further, should OMB have remaining concerns.


OMB: It seems that the solution for streamlining these survey items is to use the write-in item to capture exact reasons for teacher behavior. How will write-in responses be captured in the statistics? For example, will we have data on what percentage of teachers changed schools because of a lack of preparation to mainstream students who are learning disabled?


NCES:  Currently, when reporting on the write-in responses for this item, we report them as “other.” However, data users will be able to manually code other categories they feel are important to their analysis. Also, just to note, the percentage of movers in the Wave 3 sample is only about 6%, and of those, less than 5% thought this item was more than “slightly important.” While this is an important question, the small BTLS sample size, coupled with the small percentage of respondents who mark this as an important reason, means that it has limited analytical use.
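To illustrate the kind of manual coding of “other” write-ins that NCES describes, a data user could assign write-in responses to categories of interest with a short script. The Python sketch below is purely hypothetical: the keywords, category labels, and example responses are made up for illustration and do not reflect actual BTLS variables or data.

```python
# Hypothetical sketch of recoding "other" write-in reasons into analyst-defined
# categories; keywords and example strings are illustrative only and do not
# come from actual BTLS responses.

def recode_other(text: str) -> str:
    """Assign a write-in response to a coarse, analyst-defined category."""
    if not isinstance(text, str) or not text.strip():
        return "No write-in"
    t = text.lower()
    if "english" in t or "ell" in t or "language learner" in t:
        return "English language learners"
    if "mainstream" in t or "learning disab" in t or "special ed" in t:
        return "Mainstreaming / special education"
    return "Other"

# Made-up write-ins, used only to show how the recoding would tabulate.
write_ins = [
    "Not prepared to mainstream learning disabled students",
    "More English language learners than I could support",
    "Wanted a shorter commute",
]
for response in write_ins:
    print(recode_other(response), "<-", response)
```

A coding pass like this would let a data user estimate, for example, what share of movers cited mainstreaming concerns, subject to the small-sample caveats noted above.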
