Response to OMB Comments


Teacher Follow-up Survey


OMB: 1850-0617


Responses to OMB Questions for TFS passback

SASS

 

What does NCES know about how accurately SASS schools report teachers' total experience and other items used for the SASS teacher listing operation? 

The TFS does not use the schools’ reports of teachers’ years of experience in stratifying the TFS sample. Rather, the TFS uses the SASS teachers’ own reports of their years of teaching experience. The SASS teacher listing operation is used to draw the SASS sample of teachers, not the TFS sample of teachers. The “teaching experience” variable on the SASS listing form is used to draw a representative distribution of new and more experienced teachers.

 

What was the response rate for each teacher group in SASS? 

The 2007-08 SASS weighted unit response rates were:

Public school teachers 84.0%

Private school teachers 77.5%

BIA-funded school teachers 81.8%
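For reference, we assume the weighted unit response rates above are computed in the usual way as base-weighted ratios (this formula is our gloss on “weighted,” not a statement taken from the SASS documentation):

\[
\text{weighted unit response rate} \;=\; \frac{\sum_{i \in R} w_i}{\sum_{i \in E} w_i} \times 100,
\]

where R is the set of responding teachers, E is the set of eligible sampled teachers, and w_i is teacher i’s base (inverse-probability-of-selection) weight.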


 

Please also provide the results of the nonresponse bias analysis alluded to in the supporting statement.

The nonresponse bias analysis for the 2007-08 SASS teacher surveys is expected to be sent to NCES shortly, but in the meantime, here is the link for the results from the 2003-04 SASS:


http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2007337 (download only the main report; the nonresponse bias analysis is in Chapter 6, pp. 96-112).


TFS

 

What does NCES know about the coverage of the various association membership lists used as a stratification variable?  Does NCES have information it could share about these to assist us in working with other ED research projects that sometimes propose to use these lists as survey frames?

For private schools, the association membership item in SASS is present only because SASS and PSS go out at the same time in every other PSS data collection year; thus, SASS collects the PSS items for private schools rather than sending two survey forms to the same school. None of the association membership groups is used as a stratification variable for private schools in SASS. Rather, religious affiliation is used (the 11 strata groups listed in the TFS Supporting Statement, Part B).


The various association membership lists are collected as only one part of the list frame for the Private School Universe Survey (PSS). These lists have to be screened for overcoverage of “units” that do not meet the SASS definition of a school, such as any school that terminates with kindergarten or is primarily day care; in some cases, there may also be undercoverage. Stephen Broughman is the contact for the PSS ([email protected] or 502-7315) and can provide details about how the PSS list frame is built; both state lists and membership lists are used, as well as a supplementary area frame. It is unlikely that private school association membership lists would be sufficient to serve as survey frames without some form of supplementation.

 

Please clarify whether a teacher who has completed a web survey is able to use his/her username and password to log in subsequently and see his/her previous responses.  If so, please confirm how NCES's security requirements are met by including both a username and password in the same unencrypted email.

Brian Taylor, the IES Director of Technology, who also serves as the IES computer security officer and principal office IT coordinator, has agreed that the username and password may be sent in the same e-mail, since the new data security rules have not yet taken effect. Respondents may log in as often as needed to complete the questionnaire; however, once they click the submit button, they will be locked out of the survey.


Please clarify why NCES believes that it will obtain better cooperation by sending hard copy questionnaires to callers rather than by interviewing them when they call in.

When a respondent calls in, every effort will be made to complete the questionnaire by interview over the phone. Hard copy questionnaires will be sent only to those respondents who refuse to complete the questionnaire with the interviewer.

 

Please describe the results of testing of response differences by mode.  How will these results likely affect the trend data from the last TFS administration to the next?  Please recall that last year's SASS clearance included the following explicit terms of clearance: "OMB looks forward to seeing a thorough plan to ensure that there is no break in series given anticipated data collection methodology changes."



In a February 2007 conversation about SASS clearance, NCES and a previous OMB desk officer agreed that the 2008-09 TFS would be a good candidate for an Internet-based survey instrument. This was to keep moving toward the Paperwork Reduction Act’s goal of reducing paper-based data collections. The TFS, which has only one respondent but a number of skip patterns that can even change which questionnaire is appropriate, was considered well suited to an online instrument.

In the Internet instrument, the routing based on the first few items is entirely invisible to the respondent, so the respondent may not even know which questionnaire is being administered.

We designed the paper questionnaire to be comparable to the Internet version; however, because the TFS sample is split into a cross-sectional and a longitudinal group, it takes four paper questionnaires to handle the major skip patterns that are seamless in the Internet version. We conducted a thorough staff review of the comparability of the Internet questionnaire to the paper questionnaire; staff developed numerous scenarios to test both modes, checking for conformance between the two. Because the survey has a single primary data collection mode (web-based), only a small number of respondents are expected to answer the questionnaire in a different mode (paper).

Mode effects occur more frequently for questions with vague quantifiers as answer options (e.g., “never a problem” vs. “sometimes a problem”) (Dillman 2007); these are frequently questions about subjective opinions and attitudes. The longitudinal subset of TFS teachers is asked these opinion questions (for example, about reasons for leaving teaching or moving to another school) for the first time in this administration. Hence, these questions are all administered in the same mode (web-based questionnaire), with the exception of the small number of respondents who will use the paper questionnaire in the 2008-09 TFS (future rounds will include only one mode, a web-based self-administered questionnaire). Questions that were asked in SASS on paper questionnaires and repeated in the longitudinal TFS on the web-based questionnaire are more often items that are less sensitive to mode changes, such as main assignment, grade level, class organization, and salary and other income items.

The Internet version has no long screens where respondents would need to scroll to view questions or answers.


Our budget has had to absorb the Principal Follow-up Study and the Teacher Follow-up Longitudinal Study, so we are trying to conduct the TFS as cost-effectively as possible.


Reference:

Dillman, D.A. (2007). Mail and Internet Surveys: The Tailored Design Method (2nd ed., 2007 update). Hoboken, NJ: John Wiley & Sons.


Please clarify why the letters are proposed to go out under a staff signature rather than the Commissioner's, as is done for the FAQs as well as other NCES survey letters?

There are two reasons for sending the letters out under a staff member’s name rather than the Commissioner’s. First, at the time the OMB package was submitted, it was not known who would replace Mark Schneider as Commissioner; the current Commissioner is serving in an acting capacity and will likely not be here for the next round. Second, the long-term plan for a longitudinal study of beginning teachers makes it advisable to have a consistent, approachable contact person, so that respondents feel they have someone they can contact at any time.


Since NCES does not currently have approval to follow up on any of these teachers an additional time, please clarify who will receive a "longitudinal" versus non-longitudinal questionnaire and how that determination is made.

All first-year public school teachers from the 2007-08 SASS will receive the longitudinal questionnaire. A sample of teachers who had more than one year of teaching experience, regardless of whether they taught in public or private schools, will receive the national questionnaire.


When does NCES anticipate seeking approval for the Beginning Teacher Second Follow-up or whatever follow-up is currently envisioned?

We will send out the OMB package for Department of Education review in April. The package will be submitted to OMB on or around June 15, 2009.


Please work with Marilyn Seastrom to standardize the confidentiality statements made in the various letters and FAQs. Some use language that NCES has generally determined not to use since OMB guidance on CIPSEA was issued in 2007. The language in the "initial letter to sampled teachers" would be a good model for the other documents.

We will have Marilyn review the wording in the letters as well as in the FAQs.

The letters have been modified since the OMB package was submitted. The FAQs now contain the following statement: “Your responses are protected from disclosure by federal statute (P.L. 107-279, Title I, Part E, Sec. 183). All responses that relate to or describe identifiable characteristics of individuals may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, except as provided for in the Patriot Act (P.L. 107-056, Section 508).”


Please provide justification for each item in the background section of the questionnaires.


BACKGROUND INFORMATION FOR CURRENT AND FORMER TEACHER QUESTIONNAIRES


The background information collected in this section provides needed context about respondents that could shed light on their perceptions of their previous school or job and on their decisions to move to a new school or to leave the teaching profession in the future.


Work History-Q35

This question collects the work history of teachers prior to becoming teachers: what kind of work they did and the number of years they spent in that occupational field. These items will help researchers examine whether second-career teachers are more likely to leave the teaching profession than first-career teachers. Also, because actual occupations are collected, researchers will be able to examine whether trends exist in the prior career fields of second-career teachers.




Citizenship-Q36

Citizenship status is an important addition to the TFS, since hiring foreign teachers is one way school districts resolve teacher shortages (Hutchison and Jazzar 2007). Determining citizenship may also answer questions about the unique challenges international teachers face in working in cross-cultural environments and how those challenges may hinder these individuals’ ability to become effective teachers (Hutchison and Jazzar 2007; Hutchison 2006).


Own/Rent Residence-Q37

While salary is an important component in understanding the attrition and migration of teachers, its impact can be mitigated by other factors, such as a teacher’s personal financial burden. By collecting information on teachers’ living situations and painting a fuller picture of their expenses, researchers will be better able to assess how important salary actually is in teacher turnover. It may also be easier for teachers who rent to move than for teachers who own their home.


Household Income-Q38

This question will also allow researchers to collect information on teachers’ financial burden. This question, along with teacher salary, could be used to understand how much supplemental income a teacher receives. Researchers will then be better able to determine how important salary actually is in teacher retention.


Marital status-Q39

Because marital status was not collected in the 2007-08 SASS, respondents are asked about their current status and whether there has been a change in status since the 2007-08 SASS; if there has been a change, they are asked to report the type of change. Research indicates that young women exit teaching sooner than men and older women, but also have the highest return rate following their first career interruption (Murnane, Singer, and Willett 1988). This pattern suggests that marriage and childrearing are important factors shaping the early career patterns of female teachers. However, those data are from the early 1980s, and it remains to be seen whether changes in marital status still affect female teachers’ mobility in the same way.


Dependents-Q40

This question was also designed to measure teachers’ financial burden. In combination with salary and household income, it will allow researchers to understand the personal financial burden on teachers and how it might relate to teacher attrition and retention.

 

Please also clarify which questions are new for this administration, as well as a justification for them.


QUESTIONNAIRE FOR FORMER TEACHERS – new items


Current Main Occupation-Q6

In trying to understand the reasons for teacher attrition, it is crucial to know what types of jobs teachers leave the teaching profession for, or whether they remain in the K-12 education field. This question will document whether former teachers eventually become principals or hold other positions within the K-12 education field, and whether those positions are in the public or private sector.


Contract Renewal-Q11

This question allows researchers to establish whether a teacher left teaching voluntarily. It also allows researchers to estimate how much attrition is due to teachers not being able to meet the “highly qualified” requirement in the NCLB law. Furthermore, this question (combined with the year/month began teaching question) allows researchers to examine whether new teachers, who are more susceptible to reductions in the labor force or district organizational changes because of their lack of seniority (Elfers, Plecki, and Knapp 2006), are more likely to leave the teaching profession or change schools than teachers with more experience.


Apply For Teaching Position-Q14

This question allows researchers to examine whether former teachers are trying to return to the teaching field and, if not, what the reasons are. The “reserve pool” created by teachers who have left the profession and later return to teaching has received far less attention than the attrition and entrance of new teachers. Focusing on attrition without adequately accounting for former teachers who return can overstate the loss of teachers from the supply pool and distort the picture of how teaching careers progress.


Year/Month Began Teaching-Q17

This question establishes the total length of teaching experience, which allows researchers to explore the relationship between the length of a teaching career and attrition. For example, examining teachers at the start of their careers will allow researchers to better understand how certain factors shape teaching career paths. Attrition is highest among new teachers—namely, those in their first 3 years of teaching—and older teachers reaching retirement (Ingersoll 2001; Kelly 2004; Marvel et al. 2006). Because retirement is an expected part of a career trajectory, attention has focused on the high attrition rate among new teachers. As these teachers gain 4 or 5 years of teaching experience, the likelihood that they will leave the profession decreases (Boe et al. 1997; Hanushek, Kain, and Rivkin 2004; Kirby, Berends, and Naftel 1999; Singer and Willett 1988). Because new teachers have a high rate of attrition, capturing them in their first year of teaching and tracking them for potentially 10 years may reveal motivations and career patterns that are not easily identified with cross-sectional data.




Mentoring-Q18-20

The questions on mentoring are designed to give researchers more insight into the types of induction support teachers are receiving and their perceptions of its effectiveness. Studies have shown that mentoring programs are important in retaining teachers. For example, the results from a study by Smith and Ingersoll (2004) indicate that induction and mentoring programs within a teacher’s first year of teaching do deter teachers from switching schools and leaving teaching altogether. They also found that teachers who had received more types of support, including having a mentor in their same field, were more likely to remain in teaching. On the other hand, a randomized experiment conducted in approximately 400 elementary schools in 17 states during the 2005-06 school year did not detect any differences in the impact of an intensive, structured induction program on teacher attrition (Glazerman et al. 2008). While the study found no differences in attrition rates approximately a year after the beginning of the program, a longer longitudinal study would be needed to establish whether the induction program might make a difference in later years.


Because the content of mentoring and induction programs varies widely, the inconsistent results of different studies may be due to differences in how mentoring is measured. Howard Nelson and Michael Strong, researchers who study mentoring, expressed frustration at the 2007 TFS expert meeting about the lack of detail in mentoring questions. They indicated that it is difficult to estimate whether mentoring programs affect teacher retention because most datasets on mentoring do not contain detailed enough information about the content of the program.


Alternative Certification-Q21-25

It is well known that alternative certification programs vary widely in their content and length (Humphrey and Wechsler 2007). The items on alternative certification are designed to identify key differences among the programs and their perceived effectiveness, given that research in this area indicates that this route encompasses a diverse grouping of programs (Humphrey and Wechsler 2007). These questions will allow researchers to examine whether teachers who entered teaching through an alternative certification program differ in their attrition rates from teachers who received their training in more traditional programs.



QUESTIONNAIRE FOR CURRENT TEACHERS – new items


Highly Qualified Teacher-Q6

Determining highly qualified teacher status each year will help researchers identify those teachers who leave or are eventually pushed out of teaching because they are not highly qualified. It will also help identify those who are qualified teachers but leave teaching to pursue a different school or career. Lankford, Loeb, and Wyckoff (2002) found that more qualified teachers have higher attrition and mobility rates than those who were not as qualified.


Year/Month Began Teaching-Q7

This question establishes the total length of teaching experience, which allows researchers to explore the relationship between the length of a teaching career and attrition. For example, examining teachers at the start of their careers will allow researchers to better understand how certain factors shape teaching career paths. Attrition is highest among new teachers—namely, those in their first 3 years of teaching—and older teachers reaching retirement (Ingersoll 2001; Kelly 2004; Marvel et al. 2006). Because retirement is an expected part of a career trajectory, attention has focused on the high attrition rate among new teachers. As these teachers gain 4 or 5 years of teaching experience, the likelihood that they will leave the profession decreases (Boe et al. 1997; Hanushek, Kain, and Rivkin 2004; Kirby, Berends, and Naftel 1999; Singer and Willett 1988). Because new teachers have a high rate of attrition, capturing them in their first year of teaching and tracking them for potentially 10 years may reveal motivations and career patterns that are not easily identified with cross-sectional data.


Mentoring-Q8-10; Q27-28

The questions on mentoring are designed to give researchers more insight into the types of induction support teachers are receiving and their perceptions of its effectiveness. Studies have shown that mentoring programs are important in retaining teachers. For example, the results from a study by Smith and Ingersoll (2004) indicate that induction and mentoring programs within a teacher’s first year of teaching do deter teachers from switching schools and leaving teaching altogether. They also found that teachers who had received more types of support, including having a mentor in their same field, were more likely to remain in teaching. On the other hand, a randomized experiment conducted in approximately 400 elementary schools in 17 states during the 2005-06 school year did not detect any differences in the impact of an intensive, structured induction program on teacher attrition (Glazerman et al. 2008). While the study found no differences in attrition rates approximately a year after the beginning of the program, a longer longitudinal study would be needed to establish whether the induction program might make a difference in later years.


Because the content of mentoring and induction programs varies widely, the inconsistent findings may be due to differences in how mentoring is measured. Howard Nelson and Michael Strong, researchers who study mentoring, expressed frustration at the 2007 TFS expert meeting about the lack of detail in mentoring questions. They indicated that it is difficult to estimate whether mentoring programs have an impact on teacher retention because most data on mentoring do not contain detailed enough information about the content of the program.


Alternative Certification-Q11-15

It is well known that alternative certification programs vary widely in their content and length (Humphrey and Wechsler 2007). The items on alternative certification are designed to identify key differences among the programs and their perceived effectiveness, given that research in this area indicates that this route encompasses a diverse grouping of programs (Humphrey and Wechsler 2007). These questions will allow researchers to examine whether teachers who entered teaching through an alternative certification program differ in their attrition rates from teachers who received their training in more traditional programs.


Contract Renewal-Q21

This question allows researchers to establish whether a teacher left teaching voluntarily. It also allows researchers to estimate how much attrition is due to teachers not being able to meet the “highly qualified” requirement in the NCLB law. Furthermore, this question (combined with the year/month began teaching question) allows researchers to examine whether new teachers, who are more susceptible to reductions in the labor force or district organizational changes because of their lack of seniority (Elfers, Plecki, and Knapp 2006), are more likely to leave the teaching profession or change schools than teachers with more experience.




Principal/School Head Status-Q24

Job dissatisfaction is cited by some researchers as the most important reason for teacher turnover, and the most important causes of job dissatisfaction are a lack of supportive and effective school administrators, student discipline problems, low salaries, and a lack of decision-making power in the school (Ingersoll 2001; Stockard and Lehman 2004). By determining whether there has been a change in the administration of the school, and combining this with the items in the “school factors” section of the reasons-for-leaving item, researchers will be able to capture whether changes in the school have an impact on teacher turnover.


Satisfaction-Q26

This question will determine whether teacher satisfaction has increased, decreased, or remained the same over time. Job dissatisfaction is cited as the most important reason for teacher turnover (Ingersoll 2001; Stockard and Lehman 2004).


References

Boe, E.E., Bobbitt, S.A., Cook, L., Whitener, S.D., and Weber, A.L. (1997). Why Didst Thou Go? Predictors of Retention, Transfer, and Attrition of Special and General Education Teachers from a National Perspective. The Journal of Special Education, 30: 390-411.

Elfers, A.M., Plecki, M.L., and Knapp, M.S. (2006). Teacher Mobility: Looking More Closely at “The Movers” Within a State System. Peabody Journal of Education, 81(3): 94-127.

Glazerman, S., Dolfin, S., Bleeker, M., Johnson, A., Isenberg, E., Lugo-Gil, J., Grider, M., and Britton, E. (2008). Impacts of Comprehensive Teacher Induction: Results From the First Year of a Randomized Controlled Study (NCEE 2009-4034). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Hanushek, E.A., Kain, J.F., and Rivkin, S.G. (2004). Why Public Schools Lose Teachers. The Journal of Human Resources, 39(2): 326-354.

Humphrey, D.C. and Wechsler, M.E. (2007). Insights Into Alternative Certification: Initial Findings from a National Study. Teachers College Record, 109(3): 483-530.

Hutchison, C.B. (2006). Cultural Constructivism: The Confluence of Cognition, Knowledge Creation, Multiculturalism, and Teaching. Intercultural Education, 17(3): 301-310.

Hutchison, C.B. and Jazzar, M. (2007). Mentors for Teachers From Outside the U.S. Phi Delta Kappan, 88(5): 368-373.

Ingersoll, R.M. (2001). Teacher Turnover and Teacher Shortages: An Organizational Analysis. American Educational Research Journal, 38(3): 499-534.

Kelly, S. (2004). An Event History Analysis of Teacher Attrition: Salary, Teacher Tracking, and Socially Disadvantaged Schools. The Journal of Experimental Education, 72(3): 195-220.

Kirby, S., Berends, M., and Naftel, S. (1999). Supply and Demand of Minority Teachers in Texas: Problems and Prospects. Educational Evaluation and Policy Analysis, 21(1): 47-66.

Lankford, H., Loeb, S., and Wyckoff, J. (2002). Teacher Sorting and the Plight of Urban Schools: A Descriptive Analysis. Educational Evaluation and Policy Analysis, 24(1): 37-62.

Marvel, J., Lyter, D.M., Peltola, P., Strizek, G.A., and Morton, B.A. (2006). Teacher Attrition and Mobility: Results from the 2004-05 Teacher Follow-up Survey (NCES 2007-307). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

Murnane, R.J., Singer, J.D., and Willett, J.B. (1988). The Career Paths of Teachers: Implications for Teacher Supply and Methodological Lessons for Research. Educational Researcher, 17(6): 22-30.

Singer, J.D. and Willett, J.B. (1988). Detecting Involuntary Layoffs in Teacher Survival Data: The Year of Leaving Dangerously. Educational Evaluation and Policy Analysis, 10: 212-224.

Smith, T. and Ingersoll, R.M. (2004). Reducing Teacher Turnover: What Are the Components of Effective Induction? American Educational Research Journal, 41(3): 687-714.

Stockard, J. and Lehman, M. (2004). Influences on the Satisfaction and Retention of 1st-Year Teachers: The Importance of Effective School Management. Educational Administration Quarterly, 40(5): 742-771.


Why do the questionnaires ask all respondents to keep track of the amount of time it took to complete the questionnaire?

Respondents’ estimates of the time it takes them to complete the survey aid in the calculation of future response-burden-hour estimates for OMB.
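As a rough illustration of how these reports feed into the burden calculation (the numbers below are hypothetical, not the actual TFS burden estimates), total burden is the number of respondents times the average completion time:

\[
\text{burden hours} = (\text{number of respondents}) \times (\text{average completion time in hours}), \qquad \text{e.g., } 5{,}000 \times \tfrac{25}{60} \approx 2{,}083 \text{ hours.}
\]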

 

Is it true in all cases that individuals responding to "current assignments" questions a certain way will definitely be sent a different form to complete (as the instructions indicate) or will some simply be out of scope with no further follow up?

Respondents who complete the questionnaire using the Internet version will be seamlessly directed to the appropriate questionnaire. Those who indicate by mail that they need to complete a version other than the one they received will be mailed the appropriate questionnaire at the time of request.

 

What does NCES know about the accuracy of teacher reports of "base" teaching salary?  Is this intended to be "before taxes" as the item following it is? 

The section in the 2007-08 SASS Teacher questionnaires on salaries and earnings begins with the instruction “The following questions refer to your before-tax earnings from teaching and other employment.” So, yes, base academic year teaching salary, along with all of the other earnings, is intended as the before-tax amount. In 2005-06, in preparation for the 2007-08 SASS, a pilot study was initiated to link the teachers in a methodological test to their district records for a direct comparison of teacher salary and other compensation items.


The following is an extract from the pilot study’s findings, which will be documented in an appendix to the 2007-08 SASS Documentation report:


By comparing the Teacher Compensation Pilot Study (TCPS) with SASS, this study attempts to assess the accuracy of teachers’ self-reported income and employment status data. Two data collection methods are compared: administrative records of public school teachers provided by district personnel, and public school teachers’ self-reported data. Teachers’ administrative data were collected between May and June 2006 from school district respondents using TCPS questionnaires. Teachers’ self-reported income was collected during the fall and spring of 2006 using SASS public school teacher questionnaires. The TCPS sample was purposely designed to partially overlap the SASS teacher sample. Administrative records are assumed to be the most accurate and complete because they eliminate the reporting bias that may come from individuals’ self-reports.

 

For the first component, public school teachers were asked to complete the SASS Teacher Questionnaire, providing self-reported wages from school employment. The second component was aimed at gathering income-related data from district-level administrative records.

 

Findings:

 

Overall, teachers’ base salaries as provided in self-reported and administrative records data appear to have similar distributional patterns.  However, self-reported compensation amounts are more likely to be lower than those reported in administrative records. Disparities between the two data sources are larger among forms of compensation other than base salary.  These results are consistent with previous studies showing greater consistency between self-reported and administrative records data in the reporting of teachers’ base salary than other salary components.


Table 2. Median Salary Values: 2005-06 Administrative Record and Self-Reported Salary, Matched Observations

Salary Component          Number of matched records   Administrative record salary   Self-reported salary   Percent difference
Base salary               1,246                       $42,983                        $41,107                4%
Additional compensation   363                         $2,328                         $2,000                 14%
Other school sources      123                         $2,334                         $1,410                 40%
Summer school             37                          $2,008                         $2,000                 0%
Total salary1             1,091                       $45,285                        $43,000                5%
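As a check on the table, the percent difference column appears consistent with the gap between the administrative-record and self-reported medians taken relative to the administrative-record median (our reading of the table, not wording from the pilot study). For the base salary row:

\[
\frac{42{,}983 - 41{,}107}{42{,}983} \times 100 \approx 4.4\%, \text{ which rounds to the 4\% shown.}
\]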


Why are you asking respondents to provide current address information if you just reached them by mail?

The Internet version of the questionnaire has the contact information preloaded, and respondents will be asked to verify it. The paper version of the questionnaire could have been forwarded through the mail, or the respondent may wish to submit different contact information.
