Beginning Teacher Longitudinal Study (BTLS) 2009-2012

OMB: 1850-0868


PART B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Respondent Universe


Beginning Teacher Longitudinal Study (BTLS) Sample Design.

The BTLS sample consists of a subset of the SASS 08 teacher sample (see below). All SASS 08 respondents who were first-year public school teachers in the 2007-08 school year (i.e., they reported on the SASS teacher questionnaire that their first year of teaching was 2007 or 2008) will be included in the BTLS sample. All of these teachers were also included in the TFS. There are about 2,100 such teachers.


Background on the SASS 08 sample of public schools and public school teachers.

The SASS sample was a stratified sample. The schools were sampled first, and then teachers were selected within each sampled school. Schools were first classified by school type (public/private). Public schools were stratified by the 50 states and the District of Columbia and then by three grade levels (elementary/secondary/combined). Within each of the resulting 153 public school strata, schools were selected with probability proportional to size (PPS), where the measure of size was the square root of the number of teachers in the school.
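
To make the PPS step concrete, the sketch below draws a systematic PPS sample of schools within a single stratum, using the square root of the teacher count as the measure of size. This is a simplified, hypothetical illustration rather than the Census Bureau's actual selection program; the school data and field names are invented.

import math
import random

def pps_systematic_sample(schools, n_sample):
    """Systematic PPS selection: measure of size = sqrt(teacher count)."""
    sizes = [math.sqrt(s["teachers"]) for s in schools]
    total = sum(sizes)
    interval = total / n_sample                    # sampling interval on the size scale
    start = random.uniform(0, interval)            # random start
    points = [start + k * interval for k in range(n_sample)]

    selected, cumulative, i = [], 0.0, 0
    for school, size in zip(schools, sizes):
        cumulative += size
        # A school is selected once for every selection point falling in its size range;
        # a very large school could therefore be hit more than once.
        while i < n_sample and points[i] <= cumulative:
            selected.append(school)
            i += 1
    return selected

# Hypothetical stratum (e.g., one state-by-grade-level cell):
stratum = [{"name": f"School {j}", "teachers": t}
           for j, t in enumerate([12, 45, 30, 8, 60, 22, 15, 40])]
print(pps_systematic_sample(stratum, 3))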


NCES asked the sampled schools to provide a list of all teachers teaching in the school and the following information for each teacher on the list:


  • Whether the teacher’s total teaching experience was 3 years or less, 4 – 19 years, or 20 or more years;

  • Main subject taught (general elementary, special education, math, science, English/Language arts, social studies, vocational/technical, or other subjects);

  • Full-time or part-time teaching status at the school; and

  • Whether it was likely that the teacher would be teaching at the school for the next school year (2008-09).


The above information for each teacher in a selected SASS school, collected through the teacher listing operation, constituted the teacher sampling frame.


Within each selected school, teachers were classified as belonging in one of the following five categories:


  1. New stayer (3 or fewer years in the teaching profession) and likely to be teaching at the same school;

  2. Not new stayer (4 or more years in the teaching profession) and likely to be teaching at the same school;

  3. New leaver/mover (3 or fewer years in the teaching profession) and not likely to be teaching at the same school;

  4. Mid-career leaver/mover (4 to 19 years in the teaching profession) and not likely to be teaching at the same school; and

  5. Late-career leaver/mover (20 or more years in the teaching profession) and not likely to be teaching at the same school.


Teachers not expected to be teaching at the same school (groups 3, 4, and 5) were oversampled to achieve a sufficient sample size (1,200 public school teachers) to support national estimates and to achieve the design goals for the TFS.
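
As a rough illustration of the classification step above, the sketch below assigns a listed teacher to one of the five teacher strata from the two items reported on the Teacher Listing Form (experience and whether the teacher is expected to stay). It is an assumed simplification, not the Census Bureau's actual stratification code, and the function and field names are hypothetical.

def teacher_stratum(experience_years, likely_to_stay):
    """Map a listed teacher to one of the five teacher strata described above."""
    if likely_to_stay:
        return 1 if experience_years <= 3 else 2      # new stayer / not new stayer
    if experience_years <= 3:
        return 3                                      # new leaver/mover
    return 4 if experience_years <= 19 else 5         # mid-career / late-career leaver/mover

print(teacher_stratum(2, True))     # -> 1 (new stayer)
print(teacher_stratum(25, False))   # -> 5 (late-career leaver/mover)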


Within each teacher stratum, teachers were sorted by main subject taught (as reported by the principal on the SASS Teacher Listing Form). This sorting ensured a good distribution of sampled teachers across main subjects taught.


Within each school and teacher stratum, teachers were then selected systematically with equal probability; that is, every teacher in a given stratum and school had the same chance of selection.
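
The sketch below illustrates this within-stratum step under the same caveat: it is a simplified, hypothetical example rather than the actual SASS selection program. It sorts a school's listed teachers by main subject taught and then draws a systematic equal-probability sample.

import random

def systematic_sample(teachers, n_sample):
    """Equal-probability systematic selection after an implicit sort by subject."""
    frame = sorted(teachers, key=lambda t: t["subject"])   # implicit stratification by subject
    interval = len(frame) / n_sample                       # skip interval
    start = random.uniform(0, interval)                    # random start
    return [frame[int(start + k * interval)] for k in range(n_sample)]

# Hypothetical teacher listing for one school and teacher stratum:
listing = [{"name": "T1", "subject": "math"},
           {"name": "T2", "subject": "science"},
           {"name": "T3", "subject": "general elementary"},
           {"name": "T4", "subject": "math"},
           {"name": "T5", "subject": "English/language arts"}]
print(systematic_sample(listing, 2))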


The result was 48,400 teachers in the SASS 08 sample.


2. Procedures for Collection of Information


In the SASS 08 and the TFS 09, teachers were asked to provide the following information:


  • Name

  • Spouse’s name

  • Home address

  • Home telephone number and the name in which it is listed

  • Most convenient time to reach her/him

  • Work e-mail address

  • Home e-mail address

  • Similar information for two other people who would know where to get in touch with her/him.

Beginning in October 2009, Census Bureau staff will conduct research in order to obtain contact information for teachers who provided incomplete or no contact information in either the SASS or the TFS interview.

Data collection will begin in January 2010. The first contact with sampled teachers will be a “recruitment mailing” for the survey. The mailing will include a letter asking them to be part of the survey panel and a brochure detailing the project and explaining its relation to SASS and TFS.

The first contact mailing will be followed by both a mailed and an e-mailed letter. Mailed letters will be sent to all individuals in the sample whose first mailing was not returned by the post office. Each mailed letter will be customized and will provide the sampled teacher with a username and password to access the web-based instrument, along with a $20 incentive. Each e-mail will be similar, except that two e-mails will be sent: one with the username and one with the password. The letter will also explain the purpose of the survey, include a statement of authority, and discuss NCES’ policy on protecting personal information.

About two weeks later, NCES will send a second e-mail reminding nonrespondents to complete the survey. Throughout data collection, research will be conducted, as needed, to find current addresses and e-mail addresses for sampled teachers whose letters and e-mails are returned as undeliverable.

In March 2010, Census Bureau staff will call those teachers who have not yet responded to the web-based collection. NCES will encourage self-administered web-based participation; however, telephone interviewers will be able to log into the web-based instrument and use it much like a CATI (computer-assisted telephone interviewing) instrument.


A response rate of 90 percent is expected in this round of BTLS.


Nonresponse Bias Issue.

NCES designed the Beginning Teacher Longitudinal Study from its inception to sample only from the respondents to SASS. Given the sample design for the 2007-08 SASS, which does not include a separate stratum containing only first-year teachers, it was not possible to sample teachers for BTLS from the pool of all teachers initially selected for SASS rather than only from those who responded to SASS. Because no teachers are sampled from among SASS nonrespondents, potential biases may be introduced into BTLS. An examination of unit nonresponse in the previous administration of SASS (2003-04) found no evidence of substantial bias in SASS teacher estimates. A contractor will conduct an analysis of the data gathered from beginning public school teachers in SASS 08 to look for potential nonresponse bias. This analysis is expected to be completed by the end of June 2009 and will be shared with OMB when it is available.
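
For context, a nonresponse bias analysis of this kind typically compares respondents with the full set of sampled cases on characteristics that are known for both groups (for example, frame variables), using base weights. The sketch below is a hypothetical illustration of that comparison, not the contractor's actual methodology; the variable names and data are invented.

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def estimated_nonresponse_bias(sample):
    """Respondent-based weighted mean minus the full-sample weighted mean
    for a characteristic known for respondents and nonrespondents alike."""
    respondents = [r for r in sample if r["responded"]]
    mean_resp = weighted_mean([r["x"] for r in respondents],
                              [r["weight"] for r in respondents])
    mean_full = weighted_mean([r["x"] for r in sample],
                              [r["weight"] for r in sample])
    return mean_resp - mean_full

# Hypothetical data: x could be a frame variable such as full-time status (1 = full time).
sample = [{"x": 1, "weight": 10.0, "responded": True},
          {"x": 0, "weight": 12.0, "responded": False},
          {"x": 1, "weight": 11.0, "responded": True},
          {"x": 0, "weight": 10.5, "responded": True}]
print(estimated_nonresponse_bias(sample))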


SASS nonrespondents cannot be contacted for the BTLS because, without the SASS data, it would be impossible to tell which teachers were first-year teachers in 2007-08. Logistically, trying to combine SASS interviewing with BTLS interviewing would also slow down the field data collection, because it would take longer to locate and interview the nonresponding SASS cases and to analyze the data. For these reasons, BTLS teachers will be sampled from SASS respondents only.


3. Methods for Maximizing Response Rates


NCES and the Census Bureau will employ a variety of procedures to ensure high response rates both at the level of the responding unit (i.e., the sampled teacher) and at the level of individual items in the survey questionnaire.


The entire survey process, starting with securing research cooperation from key public school groups and individual sample members and continuing throughout the data collection, is designed to increase survey response rates. In addition, NCES believes that the following six elements of the data collection plan, in particular, will contribute to the overall success of the survey and will enhance the response rates:


1. Endorsements from key public school groups. The level of interest and cooperation demonstrated by key groups can greatly influence the degree of participation of survey respondents. Endorsements are viewed as a critical factor in soliciting cooperation from state and local education officials and in obtaining high participation rates in the public sector. The contractor will seek endorsements from the same groups that endorsed the SASS, as well as from comparable groups. Endorsements will be sought from the following organizations:


American Federation of Teachers

Council of Chief State School Officers

National Council of Teachers of English

National Council of Teachers of Mathematics

National Science Teachers Association

National Education Association

National Retired Teachers’ Association Educator Support Network

New Teachers’ Network

Project on the Next Generation of Teachers

Social Science Education Consortium

Teachers’ Network

Teacher Support Network


2. Stressing the importance of the survey and the respondents' participation in it. Official letters (advance and follow-up) from the NCES Commissioner will motivate respondents to complete their surveys. Knowledge of support by various respected organizations is intended to increase the respondent’s perception of the importance of the survey.


3. Extensive follow-up (by mail, e-mail, and telephone) of nonrespondents. The Census Bureau will use a variety of techniques to increase response levels to BTLS, including:


(a) Using two survey modes (a web-based instrument, with telephone follow-up as needed) to achieve high response levels.


(b) Allocating adequate time and resources to respondent tracking efforts to ensure that a high percentage of movers and leavers are successfully located and surveyed. The Census Bureau will employ a variety of techniques to locate survey respondents. Potential tracking sources include: (1) names, addresses, and phone numbers for respondents’ spouses and two other friends or relatives (if the respondents listed this information on the completed SASS teacher survey or the TFS); (2) leads provided by school principals or their designees; (3) directory assistance and telephone directories; and (4) the post office, for possible forwarding addresses.


4. Proper questionnaire design. Proper questionnaire design techniques will be employed to minimize item nonresponse. The Census Bureau carefully analyzed all completed questionnaires from the 2004-05 survey to determine which items had the highest levels of item nonresponse. This analysis guided NCES in enhancing the clarity of item wording, definitions, and instructions for TFS 09. NCES will employ the same process with the completed TFS 09 questionnaires, thus allowing for further modifications to the BTLS. Furthermore, a contractor will conduct cognitive interviews on new questions intended for returning teachers and will analyze the results to ensure the clarity of the questions.


The BTLS will be a completely online instrument because of its complicated skip patterns, which will become more complicated with each future wave. Using a web-based survey instrument will decrease the burden for respondents by not requiring them to wade through questions that do not pertain to them.


5. Use of incentives. An incentive of $20 will be included with the letter that provides the username, password, and survey link.


6. Personalization. The additional personalization of survey materials (e.g., cover letters and e-mails with teachers’ names) is expected to have a positive effect on the response rates.


4. Tests of Procedures and Methods


The 1988-89 Teacher Follow-up Survey was field tested in 1987-88. Since then, TFS has been conducted on a full-scale basis five times: in 1988-89, 1991-92, 1994-95, 2000-01, and 2004-05. The results from those experiences have been used to clarify and revise questions for the BTLS.


After the 1994-95 TFS, the Census Bureau implemented an extensive reinterview and reconciliation program. NCES used the results of this reinterview to revise the questions to minimize response error and improve the flow of the questionnaire. For the 2004-05 TFS, NCES revised or eliminated several items based on results from cognitive interviews. NCES tested, revised, and then retested items with current and former teachers to determine whether the items were clearly and uniformly understood. In addition, a Census Bureau subcontractor conducted cognitive interviews on new and modified items for the TFS 09. NCES will include several new items in the BTLS; these items have undergone a Questionnaire Review Board examination and are being tested in cognitive interviews by a Census Bureau subcontractor.


The Census Bureau conducted an Internet and incentive experiment during the 2004–05 TFS (Cox, Parmer, Tourkin, Warner, and Lyter, 2007). The goal was to use monetary incentives to increase overall response rates and responses via the Internet, when both mail and Internet choices were offered.


5. Reviewing Statisticians


Dennis Schwanz (301-763-1984) and Randy Parmer (301-763-3567) of the Census Bureau reviewed and approved the BTLS sample design and related matters for statistical quality, feasibility, and suitability to the overall objectives of the survey.


