
2007 National Household Education Surveys Program (NHES: 2007)

Field Test Report

OMB: 1850-0768


Field Test of the 2007 National Household Education Surveys Program



Phase 1 & 2: April–July 2006


















August 22, 2006



Table of Contents

Section Page


Introduction 1


NHES:2007 Surveys 1

NHES:2007 Field Test 2



Field Test Design and Procedures 5


Field Test Samples 5

Interviewer Training 6

Field Test Data Collection Procedures 6

Interviewer Debriefing 10

Data Review 10

Completed Interviews 10


Findings and Study Revisions 15


Interview Administration Times 15


Screener Administration Time 15

SR and PFI Interview Administration Times 15

AEWR Interview Administration Time 17


Field Test Instrument Evaluation and Results 19


Screener 20

AEWR Survey 21

SR and PFI Surveys 25

Bias Study Findings 45


Reference 47





List of Tables

Table Page


1. Mean number of contact attempts in the Telephone Research Center prior to
in-person data collection, by telephone result: 2006 8


2. Final in-person follow-up results by final telephone research center result
code: 2006 12


3. Mean number of in-person contact attempts by final field result status:
2006 12


4. Interview administration times for the School Readiness (SR) and Parent
and Family Involvement in Education (PFI) interviews in phase one of
the field test: 2006 15


5. Interview administration times for the School Readiness (SR) and Parent
and Family Involvement in Education (PFI) interviews in phase two of the
field test: 2006 16


6. Interview administration times for the Parent and Family Involvement
in Education (PFI) interview in phase two of the field test, for children
in kindergarten through second grade: 2006 16


7. Interview administration times for the Parent and Family Involvement
in Education (PFI) interview, by first or second interview in the household,
in phase two of the field test: 2006 17


8. Interview administration times for the Adult Education for Work-Related
Reasons interview in phase one of the field test: 2006 18


9. Interview administration times for the Adult Education for Work-Related
Reasons interview in phase two of the field test: 2006 18





Introduction

The National Household Education Surveys Program (NHES) was developed by the National Center for Education Statistics (NCES) to collect information on important educational issues through random digit dial (RDD) telephone surveys of households in the United States. NHES complements NCES's institutional surveys and is the principal mechanism for addressing topics that cannot be addressed in institutional data collections. By collecting data directly from households, NHES enables NCES to gather data on a wide range of issues, such as early childhood care and education, children's readiness for school, homeschooling and school choice, before- and after-school activities of school-age children, participation in adult and continuing education, parental involvement in education, and civic involvement. NHES uses RDD sampling and computer-assisted telephone interviewing (CATI) and has been conducted by Westat approximately every other year from 1991 through 2005. NHES:2007 will be the ninth NHES administration and will also include a special study to evaluate potential nonresponse bias in NHES estimates.



NHES:2007 Surveys


NHES:2007 includes three surveys: The School Readiness Survey (SR), the Parent and Family Involvement in Education Survey (PFI), and the Adult Education for Work-Related Reasons Survey (AEWR). These three surveys are repeated administrations of topics addressed in prior NHES collections, and will provide current cross-sectional, national estimates of education experiences as well as provide for the measurement of change over time.


Experts in the field of school readiness are interested in the educational and behavioral development of preschool children and in how parental and family involvement helps prepare children for learning experiences as they enter school. Likewise, experts in parent and family involvement in education are interested in preschoolers as well as school-aged children, with a particular focus on how involvement by parents and other family members supports and enhances children's learning and education. Because the populations and measures of interest overlap, the SR and PFI surveys share a single instrument, with specific paths and items designated for children of various ages.


School Readiness Survey. SR takes a broad approach to collecting information on the school readiness and early school experiences of young children, focusing on the measures suitable for collection in a survey of parents. Data on parent reports of children’s developmental status, center-based program participation and preschool enrollment, parent beliefs about school readiness, family-child learning activities, measures of health and disability, and child and family characteristics will provide a rich source of data for multi-faceted analysis.


Parent and Family Involvement in Education Survey. PFI examines school choice and parent and family involvement in their children’s schools, homework, and educational activities at home. A special set of questions is included for parents who homeschool their children. The field test versions of the instrument included items concerning the role of nonresidential parents’ involvement in educational activities, but these items were later eliminated due to time constraints.


Adult Education for Work-Related Reasons Survey. AEWR addresses a range of adult educational activities taken for work-related reasons, including college or university degree or certificate programs; vocational or technical diploma, degree, or certificate programs; apprenticeships; formal work-related courses from a variety of sources; and informal learning related to a job or career. As in the past, information about instructional providers, intensity of participation, reasons for participating, outcomes of participation, and forms of employer support will be collected. Also, information will be gathered on distance learning through various technologies.



NHES:2007 Field Test


The NHES:2007 field test was designed to be conducted in two phases. Phase one of the NHES:2007 field test had three purposes. The first goal was to qualitatively assess the NHES:2007 survey questionnaires by monitoring telephone interviews and debriefing the telephone interviewers. This assessment of the instrument focused on interview flow, how the interviews sounded in “live” administration with respondents, respondent comprehension, and the operation of the CATI system. A second goal was to obtain interview administration timings from the CATI system for the SR, PFI, and AEWR interviews. The SR and PFI interviews were of particular concern based on preliminary timings conducted with in-house Westat staff that showed the instruments took too long to administer: expected administration times were around 20 minutes, but preliminary interviews took over 30 minutes. In order to meet these two goals, 50 completed interviews of each type (SR, PFI, and AEWR) were considered sufficient for phase one of the field test.


A third goal of phase one of the field test was the implementation and evaluation of the planned field procedures for the NHES:2007 bias study. This component of the main study will examine potential nonresponse bias in NHES estimates by conducting in-person followup with cases not completed in telephone interviewing. While many of the bias study procedures were tested in NHES:2005 feasibility studies, the procedures developed for NHES:2007 based on those earlier experiences required a final evaluation. This component of the field test included the full telephone and field protocols planned for NHES:2007, including survey mailings and incentives, refusal conversion, and in-person followup (bias study procedures are discussed further on pages 7 through 9). In addition, the field test afforded another opportunity to observe the effect of in-person followup on households that have refused by telephone or have not responded after many call attempts; a concern was that some households would be angered by the in-person followup or perceive that they were being harassed. The bias study portion of the field test was conducted in one county in the mid-Atlantic region, and provided an opportunity to identify areas in which the field procedures should be adjusted prior to the full implementation of the bias study in 2007.


Phase two of the field test shared the same two goals as the RDD portion of the phase one field test: instrument evaluation and the assessment of survey timings. By administering larger numbers of interviews (200 each for SR, PFI, and AEWR), further qualitative assessment of interview flow, respondent comprehension, and the operation of the CATI system would be possible. Quantitative review of the survey data was an additional goal of the phase two field test. By examining item distributions and “other, specify” responses, survey managers would be able to identify items with low variability, high nonresponse rates, and high numbers of “other” responses that might suggest the need for additional response categories.


The field test schedule was as follows:


Phase one telephone interviewing March 17-April 24, 2006

Phase one bias study field followup April 30-June 14, 2006

Phase two telephone interviewing May 26-July 2, 2006




Field Test Design and Procedures

The sections below describe the samples selected for the two phases of the field test, interviewer training, and data collection procedures.



Field Test Samples


Two samples were selected to meet the field test goals. First, a random digit dial sample was selected that was sufficient to meet the target number of interviews for SR, PFI, and AEWR. Because the goals of the field test involved assessment of the instruments and survey administration times, and not estimation to the population, some deviations from normal random digit dial sample selection were implemented. The telephone numbers were selected in the Eastern and Central time zones, only telephone numbers identified by the vendor as residential numbers were selected, and the sample was selected so that approximately two-thirds of the sampled telephone numbers were those flagged as likely containing at least one household member under the age of 15. These changes were implemented to improve the operational efficiency of the field test. Households selected in this manner are not likely to be different from the population in ways that would affect the results of the field test in terms of evaluating the working of the instruments or the survey timings. A total of 7,000 telephone numbers was selected, with 2,000 of these being allocated to the phase one field test; the remaining 5,000 telephone numbers were reserved for phase two.
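The selection just described can be sketched as follows. Only the 7,000-number total, the roughly two-thirds child-flag target, and the 2,000/5,000 phase split come from the text; the frame size, vendor flag rate, and numbering scheme are invented for illustration.

```python
# Hypothetical sketch of the field test RDD sample selection described above.
import random

random.seed(2006)

# Pretend vendor frame: residential numbers in the Eastern and Central time
# zones, each flagged if the household likely includes a member under age 15.
frame = [{"phone": f"555-{i:07d}", "child_flag": random.random() < 0.4}
         for i in range(100_000)]

TOTAL = 7_000
N_CHILD = 2 * TOTAL // 3          # about two-thirds flagged numbers
N_OTHER = TOTAL - N_CHILD

child_numbers = [r for r in frame if r["child_flag"]]
other_numbers = [r for r in frame if not r["child_flag"]]

sample = random.sample(child_numbers, N_CHILD) + random.sample(other_numbers, N_OTHER)
random.shuffle(sample)

# Allocate 2,000 numbers to phase one and reserve 5,000 for phase two.
phase_one, phase_two = sample[:2_000], sample[2_000:]
```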


In addition to the selection criteria above for the field test sample of telephone numbers, the assignment of the cases for within-household sampling was also adjusted to improve operational efficiency. While some households without children will not be screened for possible AEWR sampling in the main study, the field test sample did not include this condition. Rather, adults were enumerated for possible sampling in all households, with lower probabilities of selection assigned to adults in those households with eligible children to limit intra-household response burden.


The second sample, selected for the phase one field test only, was an address sample in the county selected for the test of bias study procedures. Within the county, 10 local segments were selected to improve the operational efficiency of field efforts; this will also be a feature of the main study. A sample of 400 addresses was selected from residential postal delivery data files; the selected addresses were then matched to telephone numbers using a commercial vendor. In a second phase of sampling, the cases with telephone matches and those without telephone matches were subsampled at differential rates to arrive at a final bias study sample composed of 80 percent matched cases and 20 percent nonmatched cases. The final bias study field test sample contained 120 addresses with telephone number matches and 30 addresses without telephone matches.
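The two-phase subsampling above can be sketched as a differential-rate draw. The 400-address first phase and the 120/30 (80/20 percent) final composition come from the text; the first-phase telephone match rate is an assumption.

```python
# Illustrative sketch of the second-phase subsampling for the bias study sample.
import random

random.seed(7)

# Hypothetical first-phase address sample with an assumed 60 percent match rate.
addresses = [{"id": i, "matched": random.random() < 0.6} for i in range(400)]

matched = [a for a in addresses if a["matched"]]
nonmatched = [a for a in addresses if not a["matched"]]

# Subsample matched and nonmatched cases at different rates so the final
# bias study sample is 80 percent matched (120) and 20 percent nonmatched (30).
TARGET_MATCHED, TARGET_NONMATCHED = 120, 30
bias_sample = (random.sample(matched, TARGET_MATCHED)
               + random.sample(nonmatched, TARGET_NONMATCHED))
```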



Interviewer Training


Seven experienced telephone interviewers were trained for the phase one field test. The training focused on the sponsorship and purpose of NHES:2007 and the administration of the CATI interviews. Lectures and interactive interview scripts were used in the training session. In addition, interviewers were provided with lists of common respondent questions and suggested answers to assist them. The training session was 4 hours in length.


Three field staff were initially trained for the phase one bias study field test. The 8-hour training session included a discussion of the purpose and content of NHES:2007, lectures on field procedures and working with the Telephone Research Center (TRC) to complete interviews, lectures and exercises on the administration of the Household Folder, exercises in addressing respondent questions, and practice in using the cellular telephones provided to them for the field test. During the last week of data collection for the bias study, one additional experienced field interviewer was trained to complete no-in-person-contact cases and TRC noncontact or nonworking cases with one mild in-person refusal.


Twenty-four telephone interviewers were trained for the phase two field test. Among them were five interviewers who worked on the phase one field test. The training program included the same elements: an introduction to the NHES program and the specific surveys; interactive scripts demonstrating the Screener, SR, PFI, and AEWR interviews; and a review of common respondent questions and suggested answers. The training session was 4 hours in length.



Field Test Data Collection Procedures


Telephone interviewers attempted to contact all sampled telephone numbers (i.e., those in the RDD sample and those in the address sample with matched telephone numbers), secure their cooperation, and administer the NHES:2007 interviews. Different data collection procedures were used for the RDD samples in phases one and two and the bias study address sample in phase one.


For the RDD samples in phase one and phase two, the goal was to dial the sampled telephone numbers and complete the target numbers of SR, PFI, and AEWR interviews. Callbacks were made to telephone numbers at which no contact was made or the household agreed to an appointment. No advance mailings were sent to these households. No refusal conversion was attempted with the RDD sample in phase one. In phase two, because the cooperation rate was low (see page 13), refusal conversion was attempted with initial screener refusals that were coded as mild to increase the number of completed interviews.


The bias study portion of the phase one field test included the procedures developed for the full-scale implementation of the study. The protocol was based on experience from past NHES collections and other recent survey experience regarding the efficacy of survey mailings and cash incentives and the benefits of relatively high call limits.


The telephone data collection protocol was as follows:

  • An advance mailing of a letter on Department of Education stationery that described the sponsorship, purpose, and importance of the study and requested participation. The letter gave commonly asked questions and their answers on the back. An incentive of $2 in cash was enclosed, and the mailing was sent first class in a Department of Education envelope.

  • Initially, seven call attempts were made to reach a household member, with calls placed at different times of the day and on different days of the week. Once contact was made with a household member, up to 20 call attempts were made for nonrefusal cases. Cases in noncontact status were attempted up to 21 times.

  • If an initial refusal was received, a second letter was sent by first class mail, also containing a $2 cash incentive, and after a waiting period of 13 days, TRC interviewers attempted to convert the refusal. Due to time constraints, late in the TRC data collection period, the refusal hold period was shortened to 6 days.

  • If a second refusal was received, a letter was sent via FedEx to bring the household’s attention to the study and its importance. TRC interviewers again attempted to contact the household and secure their cooperation.


The specific incentive amounts for the telephone collection protocol were determined based on the results of an experiment conducted in NHES:2003 that examined 10 different combinations of mailing strategies and incentive amounts (Brick et al. 2006). Those with the highest response rates were approaches that included (1) no advance incentive and a $5 refusal conversion incentive and (2) a $2 advance incentive and a $2 refusal conversion incentive. The selected approach (i.e., approach 2) took into account the benefits of incentives at both the initial and refusal stages.


Table 1 shows the average numbers of telephone contact attempts made for address sample cases, by the TRC result status. The mean number of telephone contact attempts was much higher for households finalized on the second refusal (12 contact attempts) than for those finalized on the third refusal (6 contact attempts). It may be that two-refusal households were more difficult to contact, which is why they reached the maximum call limit before completing the Screener or giving a third refusal. However, caution must be exercised when interpreting these data because means based on small numbers of observations can be highly skewed: only two bias study cases were finalized in the TRC on the second refusal and subsequently sent for in-person follow-up, whereas six cases were finalized in the TRC after the third refusal.


Table 1. Mean number of contact attempts in the Telephone Research Center prior to in-person data collection, by telephone result: 2006


Telephone result                                             Mean number of telephone
                                                             contact attempts

Complete                                                      6.63
No answer/answering machine                                  21.50
Not working/nonresidential                                    4.11
Two telephone refusals, did not reach maximum call limit     12.00
Three telephone refusals, did not reach maximum call limit    6.17
Maximum number of calls with refusal(s)*                     14.25
Maximum number of calls, no refusal(s)                       19.71

* These cases finalized in the TRC as having received the maximum number of calls, but before doing so, these households had refused once or twice.

SOURCE: U.S. Department of Education, National Center for Education Statistics, School Readiness Survey of the National Household Education Surveys Program (NHES), 2007, Parent and Family Involvement in Education Survey of NHES, 2007, and Adult Education for Work-Related Reasons Survey of the National Household Education Surveys Program, 2007.
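The statistic in table 1 is a per-result-code mean of contact attempts. A minimal sketch, using hypothetical case records rather than the actual field test data:

```python
# Minimal sketch of the table 1 computation: mean telephone contact attempts
# grouped by final telephone result. The case records are hypothetical.
from collections import defaultdict
from statistics import mean

cases = [
    {"result": "Complete", "attempts": 5},
    {"result": "Complete", "attempts": 8},
    {"result": "No answer/answering machine", "attempts": 21},
    {"result": "No answer/answering machine", "attempts": 22},
]

attempts_by_result = defaultdict(list)
for case in cases:
    attempts_by_result[case["result"]].append(case["attempts"])

mean_attempts = {result: mean(counts)
                 for result, counts in attempts_by_result.items()}
```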



Following the telephone collection period for the address sample, three types of address sample cases were assigned to in-person data collection. These were cases for which a telephone match was not found (30 cases) or the number that was matched to the address was incorrect or nonworking (33 cases), cases that had received 20 call attempts without completion and had never refused (22 cases, including maximum call, no answer, and answering machine results),1 and non-hostile refusals that were not converted by telephone interviewers (25 cases). About half of the refusal cases (13) had given 3 non-hostile refusals, having completed the protocol described above. However, 12 refusal cases had not given 3 refusals by the beginning of the field collection period on April 29: 5 cases that had reached the maximum number of calls (20) had 1 refusal; 5 cases that had reached the maximum call limit had 2 refusals; and 2 cases with 2 refusals had not yet reached the maximum call limit of 20.


In addition to the nonmatch and nonresponse cases noted above, an additional eight cases that had been completed in the TRC were assigned to in-person interviewers. These were cases in which the address given by the respondent did not exactly match the sampled address, but it was not clear that they were mismatches. Field interviewers were assigned these cases in order to ascertain whether the interview had been conducted at the sampled address.


All telephone calls to the address sample cases ended on April 24, at which time cases were prepared for the April 29 field staff training. In-person field work began on April 30 and continued through June 14. Thus, the in-person field work for the bias study portion of the phase one field test partially overlapped the telephone interviewing for phase two of the field test, which began on May 26 and continued through July 2.


Field interviewers visited each address assigned to them and attempted to secure the household’s cooperation. An incentive of $20 in cash was offered to the households, to be paid upon completion of the Screener. If a household with a TRC result of refusal or maximum call status refused in-person, the case was finalized. If a household that had not been contacted by telephone refused in-person, a letter was sent to the household by FedEx and a field interviewer attempted to convert the refusal. One additional refusal resulted in the household being assigned a final field result of refusal.


When the cooperation of a household was obtained, the field interviewer connected the household to the TRC using the study-provided cellular telephone or the household landline telephone. The interviews were conducted by the TRC using the same CATI system that was used for outbound TRC interview calls. Telephone appointments or in-person callbacks were used to complete extended interviews with household members who were not available at the time the Screener was completed.


If the Screener was not completed after ten in-person contact attempts (field maximum call cases), the interviewer left a prepaid postcard and $5 bill at the household. The postcard included a few brief questions about the household for use in bias analysis.



Interviewer Debriefing


Meetings were held for both phase one and phase two to obtain the comments of the TRC interviewing staff. Specifically, interviewers were asked to identify problem areas in the interviews, to report issues with wording, and to assess respondent comprehension. For the most part, the interviewers are skilled in conducting interviews such as the NHES surveys and reported that most items in the surveys worked well. However, they raised a number of issues about the survey instruments, including redundancy, areas of sensitivity, and operation of the CATI system. The debriefing meetings were attended by Westat’s project director and survey managers, two NCES staff members, and staff from the American Institutes for Research (AIR) who support NCES in the NHES program. The interviewers’ comments were considered along with the observations of Westat, NCES, and AIR staff who monitored interviews during TRC data collection, and are discussed in the Findings and Study Revisions section.



Data Review


Following the completion of phase two interviewing, the survey data were extracted from the CATI system for the completed SR, PFI, and AEWR interviews. Survey managers and a data manager reviewed frequencies and crosstabulations. The purposes of this procedure were to identify items with relatively high proportions of item refusals or don’t know responses, to identify any items with low variability in responses, to review “other, specify” responses to assess the need for additional response categories, and to check the operation of skip patterns within and across interviews. Implications of this review process for instrument changes are discussed in connection with field test findings in the next chapter.



Completed Interviews


The target number of completed phase one field test interviews was 50 interviews each for the SR, PFI, and AEWR surveys. During the phase one field test, 55 SR interviews, 64 PFI interviews, and 74 AEWR interviews were completed. These figures include interviews completed with RDD sample telephone numbers and those completed with address sample cases.


Ninety-seven phase one address cases that were not completed in the TRC were attempted in the field, including 67 cases attempted in the TRC and 30 address cases that were not attempted in the TRC because they were not matched to a telephone number (table 2). Forty-nine cases were completed as a result of in-person efforts. Of those that were completed, 17 were cases that were not matched to a telephone number; 4 were cases finalized as maximum call cases in the TRC; 17 were cases that were finalized as no answer, answering machine, not working or nonresidential2 in the TRC; 4 were cases that refused 3 times in the TRC; 1 was a case that refused twice in the TRC (and did not reach the maximum call limit); and 6 were cases that refused up to 2 times but finalized with maximum number of calls in the TRC.


Eight additional address sample cases that were completed in the TRC were also sent to the field. These were cases identified in the TRC as potential address-telephone number mismatches. That is, the address in the CATI did not exactly match the address reported by the household member on the telephone. These cases were sent to the field to verify that the family or individual called by the TRC lived at the address matched to the telephone number. Of the eight cases sent for verification, five cases were completed and all of these were verified as correct; one address was found to be vacant, one refused, and the other was not completed despite 10 attempts.


At the close of the bias study field period, self-addressed, stamped postcards and a $5 bill were left at eight households that had reached the maximum number of in-person calls.3 The postcards were intended to get basic information about the nonresponding cases and requested that a household member complete the postcard, seal it, and drop it in the mail. The information requested included the total number of household members, the number of household members under 18, whether the residence was rented or owned, and the highest level of education completed by anyone in the household. None of the postcards were returned to Westat.



Table 2. Final in-person follow-up results by final telephone research center result code: 2006


Final result code in the telephone research center: (A) address not matched to any telephone number; (B) completed with address-telephone number mismatch; (C) maximum number of calls; (D) no answer/answering machine; (E) nonworking/nonresidential3; (F) three telephone refusals; (G) two telephone refusals; (H) maximum number of calls with refusal(s).


Field result        (A)   (B)   (C)   (D)   (E)   (F)   (G)   (H)   Total

Complete             17     5     4     8     9     4     1     6      54
Language problem      1                                                 1
Maximum call1         4     1     1     1     3                        10
Vacant                2     1     2     1     3     1           1      11
Refusal               5     1     1     3     5     8     1     3      27
Other2                1                 1                               2
Total                30     8     8    14    20    13     2    10     105

NOTE: Blank cells indicate no cases.

1Cases finalized with a field result of maximum call were those at which 10 in-person attempts to complete the Screener had been made.

2Two cases finalized in the field as “other.” After problems reaching the TRC, one case agreed to complete the interview if the TRC called back, but the TRC was unable to reach the respondent. In the second case, a house-sitter reported that the family would return after the field period closed.

3 Nonresidential status refers to the final status of the telephone number matched to the sampled address, at the end of the telephone interviewing period. These cases were assigned to in-person data collection to ascertain whether the address was residential, and if so, to complete the survey.

SOURCE: U.S. Department of Education, National Center for Education Statistics, School Readiness Survey of the National Household Education Surveys Program (NHES), 2007, Parent and Family Involvement in Education Survey of NHES, 2007, and Adult Education for Work-Related Reasons Survey of the National Household Education Surveys Program, 2007.



Table 3 shows the average numbers of in-person attempts made to complete address sample cases, by the final field result status. While the maximum call rule was 10 visits, some maximum call cases received more visits than that. This is because an interviewer would make a contact attempt at a nonresponding household when in the same neighborhood for another case. Also, a supervisor would sometimes review a nonresponse case and recommend a contact attempt at a specific time.


Table 3. Mean number of in-person contact attempts by final field result status: 2006


Field result          Mean number of in-person
                      visits to finalize

Complete               4.28
Language problem       1.00
Maximum call          11.67
Vacant                 4.82
Refusal                2.59
Other                  3.50

SOURCE: U.S. Department of Education, National Center for Education Statistics, School Readiness Survey of the National Household Education Surveys Program (NHES), 2007, Parent and Family Involvement in Education Survey of NHES, 2007, and Adult Education for Work-Related Reasons Survey of the National Household Education Surveys Program, 2007.


In phase two, the goal was to complete 200 RDD interviews each for SR, PFI, and AEWR. The Screener initial cooperation rate (completes divided by the sum of completes and refusals) was lower than expected (32 percent), as were the initial cooperation rates for the topical surveys: SR, 80 percent; PFI, 77 percent; AEWR participants, 71 percent; and AEWR nonparticipants, 66 percent. It is not unusual for field tests to experience lower initial cooperation rates than main studies. This is likely because the field test training program is shorter than the main study training program, which will include more discussion of and practice in gaining cooperation. Also, interviewers may accept refusals more easily from cases they know to be part of a field test. The low phase two cooperation rates negatively affected the completion of extended interviews. While the target was met and exceeded for the PFI survey, targets for SR (a rare population) and AEWR (which tends to have a lower initial cooperation rate at the extended interview level) were not met. As noted above, mild initial Screener refusals were released for refusal conversion attempts to increase the yield of extended interviews. Upon consultation with NCES, interviewing was closed on July 2. The final numbers of completed phase two interviews were as follows: SR, 154; PFI, 253; and AEWR, 167.
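The initial cooperation rate cited above can be expressed as a one-line computation; the example counts below are hypothetical, chosen only to reproduce the 32 percent Screener figure.

```python
# Minimal sketch of the initial cooperation rate: completes divided by the
# sum of completes and refusals, expressed as a percentage.

def cooperation_rate(completes: int, refusals: int) -> float:
    """Initial cooperation rate, as a percentage."""
    return 100.0 * completes / (completes + refusals)

# Hypothetical counts that yield the 32 percent Screener rate reported above.
screener_rate = cooperation_rate(32, 68)   # 32.0 percent
```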



Findings and Study Revisions


Interview Administration Times


Screener Administration Time. The average field test administration time for the RDD Screener was 5 minutes. This is longer than expected during the full-scale data collection, because the full-scale data collection will include households in which no person is enumerated, resulting in a lower average administration time, whereas the field test did not include the no-enumeration condition. Based on prior NHES administrations, we expect that the actual screener administration time for the main study will be about 3.5 minutes. The extended screener administered to bias study cases was 9 minutes in length.
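The expected drop from 5 minutes to about 3.5 minutes follows from a weighted average over screener types. A sketch under stated assumptions: the 5-minute enumeration time comes from the field test, while the no-enumeration share and its screener length are invented for illustration.

```python
# Hedged sketch: the main study screener should average less than the field
# test's 5 minutes because it mixes in short no-enumeration screeners.

def expected_screener_minutes(p_no_enum: float,
                              no_enum_minutes: float,
                              enum_minutes: float = 5.0) -> float:
    """Weighted average over no-enumeration and full-enumeration screeners."""
    return p_no_enum * no_enum_minutes + (1.0 - p_no_enum) * enum_minutes

# For example, if 40 percent of households screen out in about 1.5 minutes,
# the expected average is 3.6 minutes, near the 3.5 minutes cited above.
estimate = expected_screener_minutes(0.4, 1.5)
```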


SR and PFI Interview Administration Times. Based on a small number of timings conducted prior to the field test, the length of the SR and PFI surveys was of concern; the phase one field test bore out that concern. As shown in table 4, the average administration time for the SR survey was 30 minutes, and the average for the PFI survey was 36 minutes. Some differences in administration time were noted for the various PFI survey paths. As in previous NHES administrations, the homeschool path had a shorter administration time (27 minutes) than paths for interviews about children enrolled in school. The average administration times were 35 minutes for interviews about elementary students, 36 minutes for interviews about middle school students, and 41 minutes for interviews about high school students.



Table 4.  Interview administration times for the School Readiness (SR) and Parent and Family Involvement in Education (PFI) interviews in phase one of the field test: 2006

Path                     N    Minimum    Mean    Maximum

All SR/PFI              99          7      33         75
SR (preschoolers)       51         16      30         66
PFI                     48          7      36         75
  Elementary            25          7      35         75
  Middle school         13         25      36         44
  High school            7         29      41         67
  Homeschool             3         16      27         35

SOURCE: U.S. Department of Education, National Center for Education Statistics, School Readiness Survey of the National Household Education Surveys Program (NHES), 2007, and Parent and Family Involvement in Education Survey of NHES, 2007.


Following the phase one field test, substantial reductions in the SR/PFI interview were made in order to reduce the administration time. This was done following consultation with the members of the Technical Review Panel. The changes to the instrument were effective in reducing the administration times, as shown in table 5. The average time for the SR interviews was reduced to 20 minutes and the average for PFI interviews was reduced to 28 minutes.



Table 5.  Interview administration times for the School Readiness (SR) and Parent and Family Involvement in Education (PFI) interviews in phase two of the field test: 2006

Path                     N    Minimum    Mean    Maximum

All SR/PFI             376         14      25         61
SR (preschoolers)      142         14      20         49
PFI                    234         12      28         61
  Elementary           128         15      27         57
  Middle school         57         18      29         61
  High school           45         16      29         44
  Homeschool             4         12      21         28

SOURCE: U.S. Department of Education, National Center for Education Statistics, School Readiness Survey of the National Household Education Surveys Program (NHES), 2007, and Parent and Family Involvement in Education Survey of NHES, 2007.



Some SR items are of interest for children in kindergarten, first grade, and second grade in addition to preschoolers. The PFI interviews about these early elementary students include both SR and PFI items. The completion times for children in these grades are shown in table 6. The means are comparable to those for the elementary school path overall.



Table 6.  Interview administration times for the Parent and Family Involvement in Education (PFI) interview in phase two of the field test, for children in kindergarten through second grade: 2006

Grade                    N    Minimum    Mean    Maximum

Kindergarten            29         16      26         36
First grade             28         15      26         39
Second grade            20         17      26         35

SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (NHES), 2007.



The administration time for PFI interviews is related to whether a given interview is for the first or only child sampled in a household or for a second child. This is because, in most households, the children have the same parents, and items concerning the mother, father, and household are not asked again in the interview about the second child. In households with both SR and PFI interviews, the SR interview is nearly always done first.4 Table 7 shows the mean administration times for PFI interviews in phase two of the field test, by whether the interview was the first (or only) or second sampled child in the household. As expected, average administration times are longer for the first sampled child and shorter for the second sampled child; the difference is about 7 minutes.



Table 7.  Interview administration times for the Parent and Family Involvement in Education (PFI) interview, by first or second interview in the household, in phase two of the field test: 2006

                              Mean, first or            Mean, second
Path                  N       only SR/PFI         N     SR/PFI
                              interview                 interview

PFI                 190             29           44           22
  Elementary         91             29           37           22
  Middle school      55             29            2           23
  High school        42             29            2           22
  Homeschool          2             24            2           17

Not applicable.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (NHES), 2007.



Adult Education for Work-Related Reasons Interview Administration Time. The average field test administration time for the AEWR interview in phase one was 21 minutes (table 8). The average administration time was 29 minutes for AE participants and 10 minutes for nonparticipants. For those who were sampled as participants and completed as nonparticipants, the average time was 14 minutes; for those sampled as nonparticipants and completed as participants, the average was 22 minutes. Based on the expected number of participants and nonparticipants noted in the NHES:2007 sample design plan, the estimated administration time was 19 minutes.


Table 8.  Interview administration times for the Adult Education for Work-Related Reasons interview in phase one of the field test: 2006

Path                                                  N    Minimum    Mean    Maximum

All AEWR                                             66          6      21         53
Adult education participant                          34         12      29         53
Adult education nonparticipant                       20          6      10         15
Sampled participant, completed as nonparticipant      9          7      14         19
Sampled as nonparticipant, completed as participant   3         17      22         27

SOURCE: U.S. Department of Education, National Center for Education Statistics, Adult Education for Work-Related Reasons Survey of the National Household Education Surveys Program, 2007.



The AEWR survey administration times in phase two are given in table 9. The phase two administration times are shorter than those for phase one for each path. Few changes affecting interview length were made following phase one, although the distance education item was reduced in length. It is possible that additional training and experience in conducting interviews resulted in greater interviewer facility in administering the survey. Based on the expected number of participants and nonparticipants noted in the NHES:2007 sample design plan, the estimated administration time is 17 minutes, which was also the mean time for AEWR-NHES:2003.
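The estimate drawn from the sample design plan is, in effect, a weighted average of the path-level mean times. The sketch below illustrates that calculation; the participant and nonparticipant shares shown are hypothetical assumptions, not the figures in the NHES:2007 sample design plan.

```python
def expected_admin_time(path_means, path_shares):
    """Weighted mean administration time across interview paths.

    path_means:  mean minutes per path, e.g. {"participant": 23.0, ...}
    path_shares: expected proportion of cases per path; must sum to 1.
    """
    assert abs(sum(path_shares.values()) - 1.0) < 1e-9
    return sum(path_means[p] * path_shares[p] for p in path_means)

# Mean times per path; the 55/45 split is a hypothetical assumption.
means = {"participant": 23.0, "nonparticipant": 10.0}
shares = {"participant": 0.55, "nonparticipant": 0.45}
print(round(expected_admin_time(means, shares), 1))
```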



Table 9.  Interview administration times for the Adult Education for Work-Related Reasons interview in phase two of the field test: 2006

Path                                                  N    Minimum    Mean    Maximum

All AEWR                                            167          5      17         53
Adult education participant                          69         10      23         53
Adult education nonparticipant                       64          5      10         20
Sampled participant, completed as nonparticipant     15          6      11         19
Sampled as nonparticipant, completed as participant  19         12      20         35

SOURCE: U.S. Department of Education, National Center for Education Statistics, Adult Education for Work-Related Reasons Survey of the National Household Education Surveys Program, 2007.



Field Test Instrument Evaluation and Results


The field test indicated that, in general, the items in all interviews were working very well, and only minor changes were required to clarify items or to improve item wording. To a large extent, this result reflects the fact that most of the survey items had been tested and administered in prior NHES collections. In addition, the NHES:2007 questionnaires had undergone expert review and cognitive testing.


After each phase of the field test, interviewer debriefing meetings were held in order to identify problem areas in the interviews, to report issues with wording, and to assess respondent comprehension. As noted earlier, the interviewers are skilled in conducting interviews such as the NHES surveys and reported that most items in the surveys worked well. However, they raised a number of issues about the survey instruments. For example, interviewers commented on redundancy in both the SR survey (concerning questions about preschool and child care that follow the school enrollment questions) and in AEWR (in which data are collected for up to four sampled courses). Interviewers also noted the lengthy wording of some items, such as those concerning tutoring (in PFI) and distance education (in AEWR). Interviewers reported difficulties in using some features of the CATI system, including the school look-up utility and some text fields. Many of their comments are discussed in the sections that follow, and some of those comments led to changes in the instruments.


In addition to issues raised by interviewers or observed by staff monitoring the interviews, the length of the SR and PFI interviews was of concern. Prior to making decisions about items to be deleted from the SR and PFI interviews, the members of the Technical Review Panel (TRP) were consulted. Following notification by electronic mail, each TRP member was sent a FedEx package containing the field test phase one version of the SR/PFI interview. The panel members were asked to assign priority to questionnaire items, classifying them as essential, very important, or less important. Comments were collected by prepaid return FedEx and tabulated prior to a working meeting with NCES and AIR staff to make decisions regarding the deletion of SR/PFI items.


The following sections discuss the findings concerning the survey instruments and changes made to the instruments, including the deletions made to reduce the SR/PFI interview administration time.



Screener


Phase One Screener Findings


Interviewers reported that the collection of names is sensitive and asked whether household members could be enumerated without names. The collection of names greatly facilitates asking questions about household members in both the Screener and in the extended interviews. In addition, having the names of household members facilitates asking for appropriate respondents on later calls to the household (if needed) to complete the extended interviews. No questionnaire change was made.


Interviewers reported that the questions about other telephone numbers in the household seemed out of place to them and to some respondents. As a result, additional wording was added to the introduction to these items, explaining the reason for collecting this information.


The Screener contains a number of items administered only to the address sample, so that additional information about the households will be available if one or more extended interviews in the household are not completed. In the phase one field test, the expanded version of the Screener was administered to all address sample cases. This has been changed so that, in the main study, the expanded version will be administered only to cases referred for in-person data collection.


Among the items in the expanded Screener is a question about work-related courses that defines the activities for the respondent. This definition was observed to be tedious when read for a second or subsequent household member. The item was modified so that the full definition will be read the first time the item is administered, and a shorter version will be administered when questions are asked about other adults in the household.


Interviewers remarked on the delay between the end of the screener and the beginning of extended interviews. At the end of the Screener, the CATI system performs a number of operations. These include creating the extended interviews and populating the HHREVIEW screen that interviewers use to access interviews, copying selected Screener data into the extended interviews, and closing the Screener. Interviewers were experiencing a delay of several seconds at this stage. As a result, a new final screen was created for the Screener. This screen provides text for the interviewer to read about the extended interview(s) to be done in the household (for example, telling the respondent we would like to speak with her about a given child, or telling the respondent we would like to speak with another household member about a child or about his or her adult education activities).



Phase Two Screener Findings


Interviewers reported that some respondents did not understand the purpose of the NHES survey call and were resistant to participating. They asked whether some advance notice could be provided. In the main study, advance letters on Department of Education stationery and a $2 incentive will be mailed to all cases for which an address match is available. In addition, the main study training will include additional sessions on gaining respondent cooperation.


As in phase one, interviewers commented on the sensitivity of collecting names. No changes were made to the collection of names, for the reasons noted above.


One interviewer said that the Screener item asking if any household members were age 20 or younger was problematic because household members were enumerated even when the answer was “no.” It was explained to interviewers that, in the main study, adults will not be enumerated in many households without children or youth who are potentially eligible for SR or PFI. No questionnaire change was made.


The additional phrase explaining the reason for asking about other telephone numbers in the household, added after phase one, was perceived by some interviewers to promote respondent understanding of why the questions are asked. Other interviewers, however, felt that some respondents still found the questions strange, and asked if the questions could be moved to extended interviews. These items are asked in the Screener because they are needed for all households; they were not placed in the extended interviews because they would lengthen those interviews. No further change was made.



AEWR Survey


Phase One AEWR Findings


Most items in the AEWR interview had been administered in NHES:2003 and, in many cases, in other past AE surveys. Interviewers reported few difficulties with administering the items, but commented on the length of some items and the redundancy of questions about work-related courses; these are addressed below. A relatively small number of changes were needed for the AEWR interview, most of which were minor wording adjustments for clarification purposes. This section enumerates the changes that were made.


For the College or University Degree or Certificate Programs section (ACU), the word “your” was replaced with the word “the” for ACU1320 (CRCRDHR) and for ACU1400-ACU1402 (CRCOMPMM & CRCOMPYY) following phase one of the field test. These replacements were made for consistency with the way the majority of questions are worded in this section.


For the Vocational or Technical Diploma, Degree, or Certificate Programs section (AVT), the word “your” was replaced with the word “the” for AVT1280-AVT1285 (VOFREQ & VOUNIT) and for AVT1360-AVT1370 (VOCOMPMM & VOCOMPYY) following phase one of the field test. These replacements were made for consistency with the way the majority of questions are worded in this section.


Work-Related Training or Courses Section (AWR), Introduction. Interviewers remarked that the introduction to the Work-Related Training or Courses section was lengthy and the subsequent probe redundant. The introduction provides important information, however, and reducing its length could lead to the inclusion of inappropriate courses or the exclusion of courses. In addition, changing this introduction could affect the estimates of participation, preventing the analysis of trends, which is a major goal of the survey.


Interviewers noted that they found some text fields difficult to work with because of their small size (this pertains to the size of the field visible on the CATI screen rather than the number of characters allowed). For the Work-Related Training or Courses section (AWR), the field lengths for the course names (AWR1140 - CRNAME(N)) and subjects (AWR1150 - CRSUBJ(N)) were increased for ease of viewing at the request of the interviewers following phase one of the field test. The word “the” was added before (TRAINING NAME1-4) for AWR1480 (WRCEU(N)), AWR1500 (WRPROVE(N)), AWR1520 (WRWRKPL(N)), AWR1540 (WRWRKHR(N)), AWR1560 (WREMPAI(N)), AWR1580 (WREMPTU(N)), AWR1600 (WREMPMA(N)), and AWR1620 (WRSLFPA(N)) following phase one of the field test. This was added for grammatical correctness in the reading of these questions.


For the Work-Related Less Formal Learning Activities section (AIL), “Attending” was changed to “Attended” for AIL1108 (ILBBAG) and AIL1110 (ILCONF) following phase one of the field test. This was done to make these questions past tense, consistent with the way that the other questions in this series are worded. For question AIL1140 (ILCERT) the text “read journals, publications, or magazines” was added, to be displayed only if the respondent gives a “Yes” response for AIL1112 (ILREAD) alone; this change was made following phase one.


Distance Learning section (ADL). Interviewers noted that the definition of distance learning was very lengthy, and this was also observed by those monitoring interviews. Some also found the last sentence confusing. Based on these observations, the text “Using technology in a class with an instructor present is not considered to be distance education” was removed from ADL1100 (DISEDINTRO) following phase one of the field test. Following phase two, interviewers reported that the shorter definition worked well and the absence of this sentence did not lead to respondent confusion about what activities were to be included, nor was such confusion observed in monitoring field test interviews.


For ADL1120-ADL1380 (DEVIDCD1-DEOTH2), the CATI program was adjusted following phase one so that the programs or courses taken within the past 12 months, regardless of whether they were reported as having been for work-related reasons or non work-related reasons, were read to the respondents. This approach is easier for respondents and also allows for comparison with NHES:2005 (which was not limited to work-related activities).


For the Remaining Background section (ARB), a question and an other-specify response option (ARB1257 – IBLANG and ARB1259 – IBLANGOS) were added following phase one of the field test to capture information on the first language (or languages) that the respondent learned to speak. This addition promotes consistency between the SR/PFI and AEWR surveys regarding the language items. The variable ARB1260 (IBSPEAK) was expanded with two additional response options to account for reporting in ARB1257 and ARB1259. The variable ARB1280 (ASPWRK) was expanded with four additional response options to account for reporting in either ARB1257 and ARB1259 or ARB1260 and ARB1265.


Interviewers noted that they received an edit error message at ARB1600 (CUREMPYR), years at current employer, when the adult’s age was missing. The range edit for this item was based on the respondent’s age, and no provision had been made for missing age. Following phase one of the field test, a soft check was added with a maximum value of 20, to be applied in cases in which the respondent’s age had not been collected. This provides a range check, but allows entry of a larger value if the respondent confirms that a larger number of years is correct.
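The edit logic described above can be sketched as follows. This is an illustrative sketch only: the function name and the age-based upper bound are assumptions, not the actual CATI edit specification.

```python
def check_years_at_employer(years, age=None, confirmed=False):
    """Range edit for years at current employer (CUREMPYR).

    If age is known, apply a hard range edit (the age-based upper bound
    used here is an illustrative assumption). If age is missing, apply a
    soft check with a maximum of 20: larger values pass only after the
    respondent confirms them to the interviewer.
    """
    if years < 0:
        return "HARD_FAIL"
    if age is None:
        if years > 20 and not confirmed:
            return "SOFT_FAIL"  # interviewer must confirm the value
        return "OK"
    if years <= max(age - 14, 0):  # assumed age-based upper bound
        return "OK"
    return "HARD_FAIL"
```

A soft check flags the entry for confirmation rather than rejecting it, which is why a respondent-confirmed value above 20 is still accepted when age is missing.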



Phase Two AEWR Findings


As noted above, most AEWR items were administered in past surveys and overall the instrument was found to be working well. Few changes resulted from the phase one field test. Phase two offered the opportunity to test the instrument with a larger number of adults. Again, the instrument was found to be working very well, and few issues were identified.


An interviewer remarked on the sampling of elderly persons for AEWR and stated that some respondents did not believe the survey applied to them. It was explained to the interviewers that the NHES adult education surveys have no upper age limit, and that we are interested in learning throughout life.


Interviewers again remarked on the redundancy of the work-related section for those adults who take large numbers of courses (as noted above for phase one). This redundancy results from asking the same questions for up to four work-related courses. Over the years, considerable thought and work has gone into the work-related courses section, and while it is redundant, this is necessary to capture individual course-level information. The approach used for employer support questions (which asks, for example, about tuition support for course 1, course 2, course 3, etc., in one item) is not workable for some other course characteristics, notably items concerning reasons for and outcomes of participation. No change was made to the manner of collecting information about courses.


A CATI consideration regarding work-related courses was identified during the monitoring of interviews. While an interviewer was asking about a respondent’s second course, the course name display was observed to shift back to the name of the first course. This is not an error in the CATI program itself. Rather, it reflects that up to four courses are collected in an array, and the interviewer returned to the first course using the up arrow key rather than the left arrow key to return to a previous question about the same course. The main study training program will include specific instructions on this in the AEWR scripts and interviewers will practice backing up in the course section to ensure that they understand how to do so correctly.


Outcomes of Participation. An interviewer suggested adding a response option of “Made me a better person.” Since this is not a work-related outcome, it was not added to the item.


Distance Learning (ADL). Following phase one of the field test, the definition of distance education was shortened, as discussed above. A concern was that removing this explanation might lead to respondent confusion. This was not observed in monitoring, and interviewers reported that they did not experience problems resulting from shortening this definition. Therefore, no further change was made to the item.


Informal Learning. Section AIL includes a series of questions about whether informal learning activities were intended to teach some specific skills such as basic reading, writing, and math; oral communication; and so on. Review of the “other, specify” response to this series suggested the addition of two new categories: management of self and others, and working with technologies. These were added to the AEWR interview. Also, one question was changed from decision making or time management to decision making or problem solving. Finally, additional clarifying text was added to the item on interpersonal skills.


Respondent Language (IBLANG/IBSPEAK). Following phase one of the field test, an item was added that asks the first language the sampled adult learned to speak (IBLANG). This was added to improve the correspondence in questions about the selected adult and the questions about the parents of sampled children, and allows the questions about the person’s language to be skipped in subsequent interviews. The item drives displays in the following questions about the adult’s current main language (IBSPEAK) and language spoken most at work (ASPWRK). The CATI displays were found to be working incorrectly in phase two and have been fixed.



SR and PFI Surveys


Phase One SR and PFI Findings


Interviewers stated that the SR/PFI interview is too long, and noted that respondents reacted negatively when the estimated survey administration time was read to them. The majority of changes to the SR/PFI interview involved deletions due to the lengthy survey administration times. A significant set of changes to the SR survey involved changing the survey skip patterns so that many items concerning parent and family involvement in education that were administered to the parents of preschoolers in the phase one instrument would be skipped. Other deletions were made throughout the survey, as described below. In some cases, items were combined; for example, separate items on various types of college savings accounts were consolidated. Some items were revised or added in response to requests from the Department of Education. In addition, wording changes were made to clarify some items.


PHS1120 (HSWHO), Person Homeschooling Child. The words “in your household” were deleted from the question. Response 91 was changed from “Other” to “Other Person.” Response options 13 and 14 for “other adult” and “brother/sister” were deleted. Persons outside the household are included in the response options. Therefore, irrespective of household size and membership, all response categories for this question will be displayed.


PHS1160 (HSALSO), Other Household Members Homeschool Child. This item was deleted.


PHS1180 (HSHHMPART/B), Relationship of Other Household Members Homeschooling Child. This item was deleted.


PHS1190 (HSHHMPARTOS/B), Other Specify Item, Relationship of Household Member Homeschooling Child. This item was deleted.


PHS1220 (HSDAYS), Days/Week Child Homeschooled. A minor wording change was made to indicate the days per week a child is homeschooled.


PHS1280 (HSHOURS), Total Hours/Week Child Homeschooled. A minor change was made to ask the total hours per week that the child is homeschooled.


PHS1380 (HSKACTIV), Participation of Child in Activities with Other Homeschooled Children. The earlier version of this question asked if the homeschooling association had activities for children who were homeschooled. The question now generally asks if the child participated in activities with other homeschooled children.


PHS1390 (HSPACTIV), Homeschooling Association - Joint Activities for Parents and Children. This item was deleted.


PHS1720 (HSINTNET), Homeschooling Courses over the Internet. This item (previously HSCORR) originally asked whether correspondence courses had been used in homeschooling. That item was replaced with a question asking if the homeschooled child takes courses over the internet.


PHS1730 (HSINTPUB), Courses Taught by a Public School over the Internet. This new question is asked of those who respond “yes” to PHS1720; it asks whether the internet instruction is provided by the child’s public school. The questions representing the variables HSCORR, HSWWW, and HSTVVID were replaced with a single question on instruction via the internet (HSINTNET), and PHS1730 (HSINTPUB) was added because some public schools provide internet or distance instruction, which the Department of Education would like to measure.


PHS1740 (HSWWW), Homeschooling Course by Internet, E-mail or World Wide Web. This item was deleted.


PHS1760 (HSTVVID), Homeschooling Instruction by Television, Video or Radio. This item was deleted.


PCC1100, Introduction, Daycare Centers and Early Childhood Programs that Child Attended. During interview monitoring, it was observed that some respondents gave a positive response to this item when their child was in family day care rather than attending a center. As a result, the last sentence in this introduction was revised to indicate that the question includes regular care that occurs in centers but does not include care in a private home. Some respondents who reported earlier in the interview that their preschooler was enrolled in and attending preschool found the PCC series redundant. Interviewers will be trained to handle this through confirmation, acknowledging information the respondent has given previously.


PCC1160 (CPHBNOW), Child’s Attendance in Home-Based Head Start or a Home-Based Preschool Program. This question was deleted.


PCC1180 (CPHOMNOW), Child in a Home-Based Program. This item was deleted.


PCC1260 (CPNEVER), Child Ever Attend Daycare Center, Preschool, Prekindergarten or Head Start Program. The wording of the question changed so that “daycare center” now appears at the end of the question. This change was made to emphasize the word “center” and prevent the misreporting of home-based family day care that was observed in phase one.


PCC1420 (CPVISIT), Times Respondent entered Daycare Center or Preschool to Talk with Staff/Other Parents. The wording was changed to specify the number of times the respondent or any adult in the household had gone to meetings, participated in activities, or volunteered at the child’s daycare or preschool program. The wording was revised to match item PFS1720 which deals with parent and family involvement in the child’s school.


PCC1480 (CPHDST), Daycare Center or Home-Based Preschool a Head Start Program. This item was deleted.


PDC1300 (DPSOUND), PDC1320 (DPBUTTON), PDC1360 (DPWRITE), PDC1400 (DPATTN), and PDC1440 (DPSTORY), Developmental Characteristics. These questions asked whether preschoolers could sound out words, button their clothes, mostly write or scribble, pay attention well, and tell a story to an adult. All of these items were deleted, both because of concerns about survey administration time and because of measurement concerns. Although these are important school readiness outcomes, it was felt that parents would find it hard to respond accurately to these questions, and some TRP members recommended using only a subset of the measures in the phase one questionnaire. PDC1320 (DPBUTTON) was also deleted because of possible duplication with PRP1260 (RPDRESS).


PKG1420 (KPCONCRN), Concerns About Child’s Readiness to Start Kindergarten. This item was deleted.


PKG1600 (KPDECID), Whether Respondent Contributed to Decision to Delay Entry into Kindergarten. This item was deleted.


PSC1300 (STLKPAR), Talk with Other Parents about Schools Their Children Attend in Deciding Between Schools. A minor wording change shortened this question to ask if they consulted other parents about the schools their children attend.


School Look-Up Utility. The PFI survey includes a school look-up utility that identifies the child’s specific school. A question for the field test was whether parents would resist naming their child’s school; this did not turn out to be the case. Interviewers reported some difficulties with the use of the school utility. Specifically, some reported time delays in searching the school list; some reported functional problems entering refused, don’t know, or not found; some found it time consuming to locate schools within a state; and some indicated that when they used the backup key and then entered a school name, schools in various states with matching names would appear. Many of these problems can be alleviated by additional training and practice; others are being investigated with the CATI programming staff. Further training will be provided for the main study, perhaps with an exercise in which interviewers are given a list of schools to locate.
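The cross-state matching problem interviewers described arises when a name search is not scoped to the already-entered state. A minimal sketch of a state-scoped lookup follows; the record layout and function name are hypothetical, since the real utility searches an NCES school file within the CATI system.

```python
def find_schools(schools, state, name_fragment):
    """Return schools in the given state whose name contains the fragment.

    schools: list of dicts with hypothetical keys "name" and "state".
    Filtering on state first keeps same-named schools in other states
    out of the candidate list.
    """
    frag = name_fragment.lower()
    return [s for s in schools
            if s["state"] == state and frag in s["name"].lower()]

schools = [
    {"name": "Lincoln Elementary", "state": "VA"},
    {"name": "Lincoln Elementary", "state": "MD"},
    {"name": "Jefferson Middle", "state": "VA"},
]
# Scoping by state returns only the Virginia match.
print(find_schools(schools, "VA", "lincoln"))
```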


PSC1584 (SCHARTER), PSC1588 (SRELGON), and PSC1590 (SCATHLIC), Whether School is a Charter School, Affiliated with a Religion or a Catholic School. These questions are redundant if the school is found in the CATI look-up file, since the information is available from other NCES data sources. When the school is identified in the look-up file, these will be skipped. If the school is not found in the CATI look-up file, these questions are now asked after PSC1580 (SCHLSTAT). These questions were previously PSC1200 (SCHARTER), PSC1460 (SRELGON) and PSC1480 (SCATHLIC).


PSC1700 (SSCHEDUL) and PSC1720 (SSCHEDOS), School Schedule is Traditional or Year-Round. These items were deleted.


PSC1820 (SBEFAFT), After-School Program Run by School or another Organization. This item was deleted.


PSC1840 (SATTBFAF), Child Attends After-School Program on a Regular/Drop-In Basis. This item was deleted.


PSE1620 (SESCHWRK) and PSE1660 (SEDOWELL), Number of Times Child’s Teacher/School Contacted You about Problems with School Work or Things the Child is Doing Better. These questions were shortened to ask the number of times the teacher or school contacted the respondent about problems with school work or about things the child did better.


PSE1760 (SEADPLC), Enrollment in Advanced Placement Classes. The variable was changed from SEHONOR to SEADPLC. This question initially asked whether the child was enrolled in honors, gifted or talented classes. The item now asks if the student is enrolled in advanced placement courses and it is only asked about students in high school.


PSE1800 (SEMAGNET), Enrollment in a Magnet Program or School. This item was deleted. In an earlier version of the questionnaire, PSE1800 (SEMAGNET) covered magnet school enrollment for all children in public school. This question was judged more appropriate for children in higher grades only, and it was decided that SEMAGNET could be dropped because the information on enrollment in advanced placement classes obtained in PSE1760 would be sufficient. The new question, SEADPLC (PSE1760), is asked only about children in senior high school (grades 9-12).


PSE1840 (SEESL), Enrollment in English as a Second Language/Bilingual Education/English Immersion. A minor change was made to delete “any of the following programs” from the question. Also, “other English immersion program” is now “an English immersion program.”


PSE2200 (KPWHO) and PSE2210 (KPWHOS), Person Who First Suggested that Child Repeat Kindergarten/First Grade. These two questions were deleted.


PSE2220 (KPDECRT), Parent/Guardian Input about Child Repeating Kindergarten Or First Grade. This item was deleted.


PSE2260 (KPRPTRSN) and PSE2280 (KPRPTROS), Most Important Reason Child Repeated Kindergarten or First Grade. This item was deleted.


PSE2400 (Intro) – PSE2600 (SESCHOL). Questions on expectations, child’s future education, and parents’ planning for college expenses were restricted to children in grades 6 and above or those homeschooled with a grade equivalent of grade 6 or higher and age 12 and over.


PSE2520 (SECOLACT), Account to Save for Child’s College Education. The question now asks whether the family opened any account to save for the child’s college education.


PFS (Family Involvement in School). This section is now only asked of parents of children in grades kindergarten through twelfth grade. Parents of preschoolers are no longer asked any items in this section. The word “preschool” has been deleted from all items in this section.


PFS1140 (FSMTNG) – PFS1680 (FSCOUNSLR), Family’s Attendance at School Meetings, Conferences, School Events, Fundraising and Volunteering. Because these items will no longer be asked of parents of preschoolers, references to “preschool” and “since September” were deleted. The stem of this series of questions, “Since the beginning of this school year, (have/has) (you/any adult in your household)…,” will be read only once by the interviewer, before PFS1140.


PFS1220 (FSPTMTNG), Attendance at a parent-teacher organization. The words “or association” were added to this item, to capture the range of organizations in which family members may participate.


PFS1860 (FSCONCRN), Has parent contacted school about something that concerned them? This item was deleted.


PSP (School Practices to Involve and Support Families). This section about school practices to involve and support families will no longer be asked of parents of children in center-based programs or preschool. The word “preschool” has been deleted from all questions in this section. This change was made to reduce survey administration time. In addition, interviewers remarked on the repetitiveness of some items in this section, and some reductions were made to address this (e.g., FSSPPERF).


PSP1120 (FSNOTES) - PSP1220 (FSPHONEP), Questions Involving School Contact, (sending family notes/e-mails, newsletters or calling on the phone). The responses for “1-2 times” and “3 or more times” have been deleted. PSP1140 (FSNOTEP), PSP1180 (FSMEMOP) and PSP1220 (FSPHONEP) were deleted.


PSP1320 (FSSPCDEV), How Well School Helps Parent Understand What Children at Child’s Age Are Like. This item was deleted.


PSP1340 (FSTRANS), How Well School Helps with Information/Services About Transition to Next Level of Schooling. This item was deleted.


PSP1360 (FSSPVOLN), How Well School Makes Family Aware of Chances to Volunteer. This item was deleted.


PSP1460 (FSSPSERV), How Well School Provides Information about Community Services. This item was deleted.


PSP1560 (FSSPWORK), How Well School Provides Information about Planning for Work After Completing Education. This item was deleted.


PIS (Involvement in School Decision Making). This section about involvement in school decision making will no longer be asked of parents of preschoolers attending preschool or early childhood programs. The word “preschool” has been deleted from all items in this section.


PIS1180 (FEPLCMNT), Parental Involvement in Placement of Child in Particular Classes. This item was deleted. The wording of this question, “have a say in decisions,” was found to be unclear to respondents. Furthermore, it represents the parents’ perspective only and may not reflect whether parents are actually able to influence decisions. Therefore, this question on perceptions was dropped in favor of questions on actual actions, such as whether parents participated in decision-making groups (FSCOMMIT: PIS1140) or ever requested that the child get a particular teacher (FEPARTIC: PIS1240).


PFP (Factors Affecting Parent and Family Participation in School and Parent Support of the School). This section involves factors that affect parent and family participation in school and parent support for the school. This section will no longer be asked of parents of preschoolers in early childhood programs. Any references to “preschoolers” have been deleted from this section.


PFP1200 (FSCCONT), Staff Person to Contact if Child has a School-Related Problem. This item was deleted.


PFP1240 (FPPRUHW), Parents’ Responsibility to Teach Children Value of Education. This item was deleted.


PFP1260 (FPPRSUP), Parents’ Responsibility to Support Teacher. This item was deleted.


PFP1440 (FPTALK), Contact with School/Teacher if Parent Disagrees with School. The wording was changed to “…Do you ever contact (his/her) school or teacher?”


PFP1460 (FPPRINC), When Disagreement Occurs with School Do You Contact Principal? This item was deleted.


PFP1480 (FPFAMILY), When Disagreement Occurs with School Do You Contact Family Members? This item was deleted.


PFP1500 (FPCHILD), When Disagreement Occurs with School Do You Tell Child About Disagreement? This item was deleted.


PSW1260 (FHUSEPL), Child’s Use of Place for Homework. This item was deleted.


PSW1540 (FHSIBH), Siblings Help with Child’s Homework. This item was deleted.


PSW1580 (FHHHADLH) is now PSW1580 (FHOTHHH), Anyone Else in Household Who Helped Child with Homework. A minor change in the wording was made. Also, PSW1580 and PSW1600 will be administered only for household members who are the child’s age or older and who are not the parents (although a second mother or father may be included).


PSW1620 (FHCHFRND), Child’s Friend Helped with Child’s Homework. This item was deleted.


Tutoring Series. The PFI survey included a series of items about tutoring services. Interviewers and those monitoring interviews noted that the wording of these items was very lengthy. Following the field test, these items were revised as specified by the Department of Education to meet agency information needs. Changes are described in detail below.


PSW1790 (FHSCHTUT), Information from School/District About Free Tutoring. The wording in this question changed to the following:


“Some schools and districts help students get free tutoring or extra academic help outside of regular school hours. This extra help can be offered after school, on weekends, or during the summer.


Since the beginning of the school year, have you received information from (CHILD)’s school or district about opportunities for free tutoring?”


PSW1800 (FHGETTUT), Free Tutoring Outside of School Hours. The wording of this question was changed to the following: “During this school year has (CHILD) received free tutoring outside of regular school hours by a provider approved by your state and district?”


PSW1810 (FHTUTSAT), Satisfaction with Tutoring Services. The wording in this question changed. This item now asks about satisfaction with tutoring services that the child received.


PSW1820 (FHOTHTUT), Child Receiving Any Other Tutoring. The wording changed to ask if the child received any other tutoring during this school year.


PSW1830 (FHPDTSAT), Satisfaction with Other Tutoring Services. The wording of this question changed and now asks about the overall satisfaction with other tutoring services.


PSW1840 (FHTUCOST), Cost Household Pays for Child’s Tutoring. This question was added.


PSW1850 (FHTUUNIT), Unit of Payment for Child’s Tutoring. This question was added.


PSW1860 (FHTUUNOS), Other Unit of Payment for Child’s Tutoring. This is a new response option, added to capture unanticipated units of payment for tutoring services.


PHA1600 (FONEXT), Reading to Child – Ask What Will Happen Next? This question was deleted.


PHA1700 (HAASKRD), Child Asks to Be Read To. This item was deleted.


PHA2040 (FOCHORE), Parent Involving Child in Household Chores. This item was deleted.


PHA2720 (FOTSCHL) through PHA2840 (FOTWORK), How Often Parent Talks to Child about Various Subjects. These items were deleted.


PHA2870, Introductory Statement about Child’s Television Viewing. This item is new, and was added to inform the respondents about the topic of the next several questions.


PHA2880 (TVHRWK) and PHA2885 (TVHRWKNUM), Time Child Spends Watching Television Per Week. These are new questions, added at the request of the Department of Education. PHA2880 and PHA2885 are asked of a random half sample of respondents. The remaining half sample will receive questions PHA2900 through PHA2925, described below. This approach was taken to assess differences in reporting using the two approaches (weekly versus weekday/weekend day).


PHA2900 (TVHRWKDY) and PHA2905 (TVHRWKDYNUM), Time Child Watches Television on a Weekday, and PHA2920 (TVHRWKND) and PHA2925 (TVHRWKNDNUM), Time Child Spends Watching Television on a Weekend Day. These are new questions, added at the request of the Department of Education. PHA2900 – PHA2925 are asked of a random half sample of respondents.
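
The half-sample design can be sketched as a simple random split (an illustration only; the function name, seeding, and assignment mechanism are assumptions, not the actual CATI sample-assignment logic):

```python
import random

def assign_tv_question_path(case_ids, seed=0):
    """Randomly assign each case to the weekly path (PHA2880/PHA2885) or
    the weekday/weekend-day path (PHA2900-PHA2925), half and half in
    expectation, so the two reporting approaches can be compared."""
    rng = random.Random(seed)  # fixed seed for a reproducible illustration
    return {cid: ("weekly" if rng.random() < 0.5 else "weekday_weekend")
            for cid in case_ids}
```

Each respondent receives exactly one of the two question series, which is what allows the comparison of weekly versus weekday/weekend-day reporting described above.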


PHA2940 (FOCOMPHM), Home Computer That Child Uses. This item was deleted.


PHA2960 (TVCHNL), Television Networks/Channels Child Watches Each Week. This is a new item, added at the request of the Department of Education.


PHA2980 (TVCHNLOS), Other Television Networks/Channels Child Watches Each Week. This is a new question, added at the request of the Department of Education.


PHA3100 (TVCHNL1), Channel Child Watches Most Often. This is a new item, added at the request of the Department of Education.


Computer and Internet Use. Interviewers reported that they and respondents found this series of items repetitive. The deletions noted below, made primarily to reduce survey administration time, addressed this concern.


PHA3120 (CMPTOGET), How Often Parent/Guardian Uses Computer with Child. This item was deleted.


PHA3140 (FOINTHM), Internet Access at Home. This item was moved to section PHH, and appears as PHH1130.


PHA3200 (NETPARN), How Often Parent uses Internet with Child. This item was deleted.


PHA3280 (FOSCHACT), Child’s Participation in School Activities. An introductory statement was added to this question, to focus the respondent on the topic to follow.


PHA3400 (FOCHURCH) is now (FORELCLS), Child’s Participation in Church/ Temple Youth Group or Religious Classes. The word “instruction” was changed to “classes.”


PHA3460 (FOEDUC), Participation in extra classes or tutoring. This item was deleted because it is redundant with the new tutoring items.


PHA3800 (FORCOMP), Computer, Internet and Video Games. A minor change in wording was made. This question is now asked of all respondents, since the previous questions on home access to a computer/internet are no longer asked before it.


PRP1140 (RPALPHA) – PRP1240 (RPDISCP), Importance of Doing Various Things Before Child Enters Kindergarten. Response option 4, “not very important” was deleted. “Not at all important” is now coded “4.”


PRP1260 (RPDRESS), Importance of Showing Child How to Dress Before Kindergarten. This item was deleted.


PRP1280 (RPSTAND), Importance of Showing Child How to Stand Up for Himself/Herself Before Kindergarten. This item was deleted.


PCS1120 (CSPARCMT), Number of Parents Talked to Regularly. Interviewers noted that the “other parents” that respondents speak with in the school and the community are often the same parents. As noted below, the separate item (PSC1200) was deleted.


PCS1200 (CSPARSCH), Number of Parents Talked to Regularly in Person Or on Phone. This item was deleted.


PHD1300 (HDCHINS), Child Covered by Health Insurance. The wording changed from “Does child have” to “Is child covered by health insurance?” The revised wording is more precise.


PHD1460 (HDBLIND), previously HDBLNDIM, Blindness or Another Visual Impairment. The item now reads “Blindness or another visual impairment not corrected with glasses?” The additional wording was added based on research conducted for the ECLS showing that visual impairments are overreported by parents when the follow-up about correction through glasses is not asked.


PFG1340 (MOMLANG) and PFG1360 (MOMLANOS), First Language Mother Learned to Speak. If the subject of section PFG completed an AEWR interview, the response to this question can be copied from IBLANG in the AEWR data to PFG1340 during post-survey processing.


PMG1220 (DADLANG) and PMG1240 (DADLANOS), First Language Father Learned to Speak. Similarly, if the subject of section PMG completed an AEWR interview, the response to this question can be copied from IBLANG into PMG1220 during post-survey processing.
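
A minimal sketch of this post-survey processing step, assuming simple dictionary-based records keyed by hypothetical subject IDs (the actual file layout and linkage keys are not specified in this report):

```python
def fill_parent_language(pfi_record, aewr_records):
    """If the subject of section PFG/PMG completed an AEWR interview,
    copy IBLANG into the corresponding PFI language item. The
    'mother_id'/'father_id' linkage keys are assumed for illustration."""
    out = dict(pfi_record)  # leave the input record unmodified
    for subject_key, target in (("mother_id", "PFG1340"),
                                ("father_id", "PMG1220")):
        aewr = aewr_records.get(out.get(subject_key))
        if aewr and out.get(target) is None:
            out[target] = aewr["IBLANG"]
    return out
```

Only missing PFI values are filled, so a response already collected in the PFI interview would not be overwritten.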


PNR1220 (NRLIVAR1), Who Child Lived with Most During School Year. This item was deleted.


PNR1240 (NRLIVOS1), Other Response to PNR1220. This item was deleted.


PNR1260 (NRLIVEV1), Time Since Child’s Mother Lived in Same Household with Child. This item was deleted.


PNR1280 (NRLIVNU1), Number Given of How Long Since Child’s Mother Lived in Same Household with Child. This item was deleted.


PNR1300 (NRLIVUN1), Unit of How Long Since Child’s Mother Lived in Same Household with Child. This item was deleted.


PNR1440 (NRSAW1), How Long Since Child’s Mother Last Saw Him/Her. The following response was added: “DOES NOT WANT TO ANSWER QUESTIONS”. Another response category “DECEASED” was also added. These additions were made so that the remaining items in the section can be skipped under these conditions.


PNR1860 (NRPHONY1), Number of Times Child Has Talked to Mother on the Phone in Last Year. This item was deleted.


PNR1900 (NRLETTY1), Number of Times Child Has Gotten a Letter or Email in the Last Year from Her. This item was deleted.


PNR1960 (NRPERY1), Number of Times in Last Year Since Saw Mother in Person. This item was deleted.


PNR2040 (NRLSTCO1), How Long Since Child Last Had Any Contact with Mother. This item was deleted.


PNR2060 (NRLSTNU1), Number Indicating How Long Since Child Had Any Contact with Mother. This item was deleted.


PNR2100 (NRLSTUN1), Unit Indicating How Long Since Child Had Any Contact with Mother. This item was deleted.


PNR2320 (NRLIVAR2), Who Child Lived with Most During School Year. This item was deleted.


PNR2340 (NRLIVOS2), Other Response to PNR2320. This item was deleted.


PNR2360 (NRLIVEV2), How Long Since Child’s Father Lived in Same Household with Child. This item was deleted.


PNR2380 (NRLIVNU2), Number Indicating How Long Since Child’s Father Lived in Same Household with Child. This item was deleted.


PNR2400 (NRLIVUN2), Unit Indicating How Long Since Child’s Father Lived in Same Household with Child. This item was deleted.


PNR2520 (NRSAW2), How Long Since Child’s Father Last Saw Him/Her. The response, “DOES NOT WANT TO ANSWER QUESTIONS” was added. Another response category “DECEASED” was also added. These additions were made so that the remaining items in the section can be skipped under these conditions.


PNR2930 (NRPHONY2), Number of Times Child Has Talked to Father on the Phone in the Past Year. This item was deleted.


PNR2940 (NRLETTY2), Number of Times Child Has Gotten a Letter or Email from Father in Past Year. This item was deleted.


PNR3000 (NRPERY2), Number of Times Child Has Seen Father in Person in Last Year. This item was deleted.


PNR3080 (NRLSTCO2), How Long Since Child Last Had Any Contact with Their Father. This item was deleted.


PNR3100 (NRLSTNU2), Number of Times Since Child Last Had Any Contact with Their Father. This item was deleted.


PNR3120 (NRLSTUN2), Unit Indicating How Long Since Child Last Had Any Contact with Their Father. This item was deleted.


PHH1130 (FOINTHM), Access to Internet at Home. This new question was moved from the home activities section (PHA) to the household section (PHH).


Phase Two SR and PFI Findings


Additional changes were made to the SR/PFI questionnaire after phase two of the field test. Further deletions were made to reduce survey administration time. In addition, comments received from interviewers during the debriefing, and issues observed by those monitoring interviews, were taken into account in determining final instrument changes.


As in phase one, interviewers reported that the SR/PFI interviews were long and that some respondents reacted negatively to the time estimate given at the start of the interview, or did not want to continue and complete an interview for a second child after completing the first. Interviewers also reported confusion regarding multiple child care arrangements and, in one case, center-based versus family daycare. Other items that posed problems included estimating the number of books in the child’s household and the number of parents in the school or community to whom the child’s parents spoke on a regular basis. In both instances, responses ranged widely, suggesting that response categories may need to be provided.


The following are specific issues identified in phase two and the instrument changes:


Child’s grade in school (SGRADE and SGRADEQ in the Screener, GRADE and GRADEEQ in the SR/PFI interview). While the codes entered by interviewers for grades 1 through 12 match the child’s grade, the numbers for grades below first grade are not logically connected to the specific grades (93 for preschool, 94 for transitional kindergarten, 95 for kindergarten, and 96 for prefirst grade). Some interviewer errors with these codes were reported in the field test. As a result, a CATI confirmation screen was added after each of these four items when a grade below first grade is entered. This confirmation screen displays the grade entered and instructs the interviewer to correct it if necessary.


[YOU ENTERED (DISPLAY CODE AND DESCRIPTION, FOR EXAMPLE, 94-TRANSITIONAL KINDERGARTEN). PRESS ENTER TO CONFIRM OR BACK UP AND CORRECT.]
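
The confirmation check can be sketched as follows (illustrative Python only; the CATI system itself is not Python, and the function name is hypothetical):

```python
# Codes for grades below first grade are not self-evident, so the CATI
# screen echoes the entry back for confirmation; grades 1-12 need no check.
SPECIAL_GRADE_CODES = {
    93: "PRESCHOOL",
    94: "TRANSITIONAL KINDERGARTEN",
    95: "KINDERGARTEN",
    96: "PREFIRST GRADE",
}

def confirmation_prompt(code):
    """Return the confirmation text for a below-first-grade code,
    or None when no confirmation screen is needed."""
    if code in SPECIAL_GRADE_CODES:
        return (f"YOU ENTERED {code}-{SPECIAL_GRADE_CODES[code]}. "
                "PRESS ENTER TO CONFIRM OR BACK UP AND CORRECT.")
    return None
```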


PHS1120 (HSWHO/B) Person who mainly homeschools child. The interviewer instruction [DISPLAY APPROPRIATE CATEGORIES BASED ON HH MEMBERSHIP] was deleted. Responses will be based on all persons including non-household members. Therefore, all response categories will be displayed irrespective of household size and membership, as follows.


MOTHER 10

FATHER 11

GRANDPARENT 12

BROTHER/SISTER 13

OTHER PERSON 91

SPECIFY_________________________

REFUSED -7

DON’T KNOW -8


PHS1580 (HSCOTH). Other Sources of curriculum or books used to homeschool child. Since the list of sources in PHS1400-1560 is quite comprehensive, this item was deleted.


PCC1140 (CPNNOW). Whether child attends a daycare center, preschool, prekindergarten, or Head Start program. As noted above, interviewers remarked that this item is redundant when the child has already been reported as being enrolled. An instruction to interviewers to confirm enrollment was added.


PCC1260 (CPNEVER). Child ever attended preschool, prekindergarten, Head Start program or daycare center. A skip error was observed in which those not currently enrolled were receiving inappropriate questions about current care. This was corrected.


PKG1360 (KPENROLL). When child can start school. The first response category was changed to “WHEN OLD ENOUGH/BASED ON BIRTH DATE,” to reflect the ways in which parents respond to the question.


PSC1230. Skip box for school choice items. This skip box was removed, and item PSC1240 (whether the school district allows school choice) will be asked of parents whether or not their child is in a chosen school and whether or not the school is their assigned district. This will provide more complete information about school choice. A new skip box after PSC1260 dictates that the question about the main reason for choosing a school is not asked if the child is in an assigned school and the parents did not consider other schools.


PSC1260 (Parent considered other schools)/PSC1360 (Main reason for choosing school). The skip pattern was changed so that parents will be asked about their main reason for choosing the child’s school whether or not they considered other schools. This will provide more complete information about school choice.


PSC1360 (SREASON). Main reason to choose the school child attends. Based on the distribution of the “Other Specify” variable (SREASNOS), the response “SIZE OF SCHOOL” has been changed to “SIZE OF SCHOOL/CLASS.” Two new responses were also added: RELIGIOUS AND OTHER SPIRITUAL REASONS (10) and COST AND FINANCIAL REASONS (11).


Skip Box PSC1400. A new skip box, PSC1400, was added. This excludes private school students from responding to SNEIGHBR, whether the family moved to the neighborhood so that the child would be eligible to attend his/her school. The purpose of the question is to ascertain whether families move because of public school attendance areas.


School Look-Up Utility. As in phase one, interviewers reported some problems in using the utility. Additional training and exercises will be provided for the main study.


Skip Box PSC1582. The path was changed to include appropriate skip patterns for cases in which the school was not found in the look-up file and the public/private status of the school is unknown. Previously, such cases were not receiving the question regarding whether the school was religious.


PSE1120 (Skip box for PSE1180, Child enjoys school). This skip was changed so that PSE1180 will be asked about preschoolers and children in kindergarten through second grade.


PSE1140 – PSE1220 (SECHALNG – SEEASY). Child and family’s experience in school. Due to time constraints, PSE1140 (SECHALNG), PSE1200 (SERESPCT), and PSE1220 (SEEASY) have been deleted. Only PSE1180 (SEENJOY) is retained, and it will be asked only about preschoolers and children in grades K-2. As a result, the skip pattern in box PSE1120 has been changed accordingly.


PSE1320 (SEGRADES). Child’s grades during the school year. The wording of the question was changed; the phrasing “the school (he/she) attends this year” was deleted.


PFS1400 – 1440 (FSVOLCLS, FSVOLOTH). Volunteer in child’s classroom or school. Items PFS1400–1440 were dropped. A new item, PFS1400 (FSVOL), was created, which reads, “Served as a volunteer in (CHILD)’s classroom or elsewhere in the school.”


PFS1180 – 1640 (FSHADMTG – FSHADCOM). Whether school has had various types of meetings and activities for parent and family involvement. Due to time constraints, this series of questions was dropped. As a result, the skip box before this series and skip box PFS1710 were also removed.


PFS1820 (FSVOLHRS). Hours parent has participated in volunteering or fundraising. Due to time constraints, this item was deleted. Therefore, skip box PFS1780 was also removed.


PSP1440 (FSSPHOME). School provides workshops and materials to help child learn at home. This item was deleted.


PIS1120 (FSDECIS). Child’s school includes parents in committees. This item was deleted due to time constraints.


PIS1140 (FSCOMMIT). Any adult participated in decision-making groups. This item was deleted due to time constraints.


PIS1240 (FEPARTIC). Request for particular course or teacher. This item was deleted due to time constraints.


PFP1220–1380. (FPHLPCHD – FPSWELCM). Parent perceptions on their expected role and ability to help child do well in school. The order of questions was changed to FPHLPCHD, FPPTRUST, FPSWELCM, FPPRVAL and FPPRATND. The change in order was made to move from specific to general.


PSW1260 (FORHW). Family rules about homework. This question was moved from section PHA (home activities) to section PSW (school work), to follow a series of questions on homework.


PSW1460–1600. (FHMOMH – FHNHADLH). Person who helped child with homework. Due to time constraints, these items were deleted and associated skips were removed.


PSW1700 (FHHELP). How many days per week child was helped with homework. The wording in the question was changed. Since PSW1460-1600 were deleted, the phrase “…or any of the people we just mentioned” was changed to “or does anyone in your household”. The response option, “Never” (0) was also added.


PSW1790 (FHSCHTUT). Whether information on free tutoring was received. To ensure that information on free tutoring before the current school year is also obtained, the phrase “Since the beginning of the school year” was deleted. However, the word “current” was added to the question to refer to the child’s current school.


PSW1800 (FHGETTUT). Free tutoring received by child. The phrase “state and district” is now “state or district”.


PHA1920 (FOCRAFT1). Arts and crafts with child. The wording was changed to match PHA2000 (FOCRAFT2): “Did arts and crafts, for example, coloring, painting, pasting, or using clay?” This was the wording used in the PFI 2003 interview as well.


PHA2320 (FORELIG) and PHA2340 (FOCOMMUN). Family participation in events. These two questions have now been combined. PHA2320 is now FOGROUP to match the wording used in PFI 1996 and 1999: “Attended an event sponsored by a community, religious, or ethnic group?”


PHA2660 (FODINNER). Number of times had dinner together. Since this question refers to the past week, it has been moved to PHA2170 before a series of questions on activities in the past month. As a result, the skip box PHA2110 has been marginally changed. The last statement in the skip box PHA2110 now reads “Else go to PHA2170” instead of PHA2180.


PHA2860 (Skip box for television viewing items). The skip was changed so that items concerning television viewing will be asked about preschoolers and children in kindergarten through second grade.


PHA2880 – 2885. (TVHRWK-TVHRWKNUM). Time spent watching TV in a typical week. These items were deleted as questions on TV watching will be asked separately for weekdays and weekends. As a result, skip box PHA2940 has been corrected with references to PHA2885 now removed.


PHA2920 (TVHRWKND). Time watching TV on a weekend day. The term “typical weekend day” in this question was changed to “typical day in the weekend” to make it clearer to respondents.


PHA2960-3100. (TVCHNL-TVCHNL1). TV channels that child watches. Based on the distribution of the “Other specify” variable, TVCHNLOS, the response categories in TVCHNL and TVCHNLOS were modified. The word “KIDS” was added after DISCOVERY CHANNEL. Similarly, “PBS KIDS” was added after PBS SPROUT. An interviewer note that “A maximum of 17 responses can be given” was also added to PHA2960.


PHA3160 (FORTVPRG). Rules about watching TV. This question on rules on watching TV was moved from PHA3660 to this location where it follows a series of questions on TV watching. This question is addressed only to preschoolers and children enrolled in kindergarten through second grade.


Introduction to PHA3380-3580 (FOMUSLES-FOARTS). The wording in the introduction statement was changed. The phrase “preschool” was removed since this series of questions is not addressed to preschoolers.


PHA3640 – 3800 (FORTVCOM-FORCOMP). Family rules. Due to time constraints, PHA3640 (FORBED), PHA3740 (FORNIGHT), and PHA3800 (FORCOMP) were dropped. PHA3660 (FORTVPRG) is now PHA3160. PHA3700 (FORHW) is now PSW1260.


PRP1240. (RPDISCP). Disciplining child. The wording of this question has changed. It now reads: Discipline your child when (he/she) is misbehaving?


PCS1100 Introduction to PCS1120. Since PCS does not deal with community organizations, classes and support services, the wording in this introduction has now changed and reads: Now I’d like to talk with you about contact with other parents.


PCS1120 (CSPARCMT). Number of other parents child’s parents communicate with. Based on interviewer reports that some parents had difficulty arriving at a specific number, response categories were added to this question rather than keeping the response as a continuous variable. The phrase “Would you say…” was added at the end of the question.


The following responses were added to PCS1120:


None…………………………………………………… 0

One to three other parents……………………………... 1

Four to five other parents……………………………… 2

Six to ten other parents, or..…………………………… 3

More than 10 other parents……………………………. 4

REFUSED…………………………………………….. -7

DON’T KNOW………….……………………………. -8
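
A hypothetical recode helper illustrating the new categories (useful, for example, when recoding phase-one continuous responses for comparison; this is not part of the actual instrument):

```python
def csparcmt_category(n_parents):
    """Map a reported count of other parents to the PCS1120 response
    categories added after the field test (illustrative helper only)."""
    if n_parents == 0:
        return 0   # None
    if n_parents <= 3:
        return 1   # One to three other parents
    if n_parents <= 5:
        return 2   # Four to five other parents
    if n_parents <= 10:
        return 3   # Six to ten other parents
    return 4       # More than 10 other parents
```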


PHD1320 (HDDELAY). Child diagnosed with developmental delay. This item was dropped; pervasive developmental disorder (PDD) is included in the list of disabilities collected later in this section.


Skip box PHD1820. The reference to PHD1720 in the skip box was removed, since that item was deleted earlier and no longer exists.


Section PNR. Non-residential parents. This section has been dropped due to time constraints.


PHH1240 (HNEIGHB). Conditions in neighborhood. This item was deleted.



Bias Study Findings


There were two main goals of the bias study field test. The first goal was to test the protocol for training interviewers, obtaining cooperation, making contact with the TRC with a willing participant, and assessing the approach used to monitor progress in the field. No problems with the bias study approach were identified, and the full-scale bias study is expected to yield sufficient numbers of cases for analyses of potential nonresponse biases in the survey data. However, as noted earlier, none of the households at which a nonrespondent postcard was left actually returned the postcard to Westat.5 Because the number of cases was small, this result does not suggest that the postcard approach should be abandoned.


The second goal of the bias study field test was to evaluate respondent reactions to the in-person approach. This was in response to concerns that sampled households might perceive in-person efforts as harassment following telephone refusals. Over the course of the bias study field test there were two cases with very strong refusals and one hostile refusal. Of those three cases, two had refused three times by telephone and one had refused twice. The hostile refuser acknowledged the telephone calls and yelled at the interviewer to leave the property. One of the strong refusals was given via condominium intercom. No follow-up calls or letters were received from respondents regarding in-person efforts. Field staff training for the main study will include gaining cooperation and handling refusals, including hostile refusals.


Reference



Brick, J.M., Hagedorn, M.C., Montaquila, J., Brock Roth, S., and Chapman, C. (2006). Impact of Monetary Incentives and Mailing Procedures: An Experiment in a Federally Sponsored Telephone Survey (NCES 2006-066). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

1 In a few cases, the maximum call limit was not reached because the household had one or more appointments scheduled during telephone collection, which resulted in the case being “on hold” for the appointment for a period of time.

2 The nonresidential status here refers to the telephone survey result: that is, the telephone number that was matched to the sampled address was designated as nonresidential. These cases were assigned to in-person follow-up to ascertain whether the sampled address was residential and, if so, to complete the survey with the household.

3 No postcards were left at two cases shown on the maximum call line in table 2. One case had been completed in the TRC with a similar, but not identical address, and was sent to the field only for verification and not for data collection. The other case had given one field refusal and was in field refusal status when it became a field maximum call case, and no postcard was left.

4 A PFI interview may be conducted before an SR interview in the rare case in which two sampled children have different parent/guardian respondents, and the respondent for the older child is available before the respondent for the younger child. This did not occur in the field test.

5 Interviewers used their judgment as to where to leave a postcard based on the configuration of the dwelling unit. Some were left under doormats with a corner of the card showing, some inside screen doors, etc. It is not legal for field staff to place postcards in mailboxes.
