Report: Previous Mailed Screener Experiment


National Survey of Family Growth


NSFG OMB Attachment D2 OMB No. 0920-0314




FROM: James Wagner and Wen Chang

TO: Anjani Chandra, Joyce Abma

LAST UPDATED: 09/21/2020

SUBJECT: Final Results of the Mailed Screening Survey Experiments

ISR recently completed two quarters (Q31-Q32) of an experiment that tested the costs and response rates associated with a mailed screening survey. These two quarters of experimentation build upon experiments conducted in four previous quarters (Q24-Q27). Overall, these experiments tested whether mailed screeners maintain data quality and increase efficiency, since a response to a mailed screener means the interviewer does not have to visit the household to conduct the screener. However, given the uncertainties, including the rate at which mailed screeners would be returned, we proposed an experiment to evaluate whether this approach reduces costs and is operationally feasible, while maintaining screener and main interview response rates, compared with the current protocol of personal visits to the household for each screener.

We focused the experiment on a subgroup of the sample – a stratum of housing units predicted to be likely ineligible, that is, largely composed of households with persons older than the NSFG age-eligible group. This stratum had an eligibility rate of about 17% and averaged about 700 cases per quarter. This group of older individuals was expected to be more likely to respond to a mailed survey, and less likely to be eligible (and, therefore, less likely to require personal visits).

In the initial experiments, conducted from Q24-Q27, the cases in the low eligibility stratum were randomized to three arms: 1) mailed screener with $0 prepaid incentive, 2) mailed screener with $2 prepaid incentive, and 3) control. Arms 1 and 2 were each assigned about 150 cases per quarter, and the balance (about 400 cases per quarter) was allocated to the control arm. The sample was a systematic selection from the low eligibility stratum across cases sorted by PSU, SSU, and walking order within SSU.
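For concreteness, the sort-then-select logic described above can be sketched as follows. This is an illustrative outline only, not ISR's production sampling code; the field names ('psu', 'ssu', 'walk_order'), the selection routine, and the arm sizes are assumptions used to show the structure of the design.

```python
# Illustrative sketch only (not NSFG/ISR production code). Assumed field names
# and per-quarter arm sizes reflect the allocation described in the text.
import random

def systematic_select(frame, n, seed=None):
    """Take n cases from an ordered frame using a random start and a fixed interval."""
    interval = len(frame) / n
    start = random.Random(seed).random() * interval
    return [frame[int(start + i * interval)] for i in range(n)]

def assign_arms(cases, n_mail_2=150, n_mail_0=150, seed=2020):
    # Sort the low eligibility stratum by PSU, SSU, and walking order within SSU.
    frame = sorted(cases, key=lambda c: (c["psu"], c["ssu"], c["walk_order"]))
    mail_2 = systematic_select(frame, n_mail_2, seed)
    rest = [c for c in frame if c not in mail_2]
    mail_0 = systematic_select(rest, n_mail_0, seed)
    control = [c for c in rest if c not in mail_0]   # balance (about 400 cases per quarter)
    return {"Mail+$2": mail_2, "Mail": mail_0, "Control": control}
```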

In the first three quarters of the experiment (Q24-Q26), the mailed screener was sent in a USPS Priority Mail envelope along with a cover letter, a study brochure, and a prepaid return envelope. After two quarters (Q24-Q25), the experimental approaches proved to be less cost-effective than expected. Therefore, we modified the protocol in Q26 and Q27. In Q26, we released the mailed screener two weeks before the field period began and returned the sample to the interviewing staff at the beginning of the field period to continue screening. In Q27, we used a standard mailer (a less expensive regular large envelope) to test the cost efficiency of the approach. The standard mailer cost about 20% of what the Priority mailer had cost. The results from Q27 were positive, but not conclusive due to small sample sizes.

Further, as reported in previous memoranda, there was an operational error in Q27. The no-incentive mailed screener treatment was assigned to sample cases in small MSAs, while the incentive condition was disproportionately assigned to cases in large MSAs or non-MSA counties. Therefore, in Q27, incentive and urbanicity were largely confounded.

In our summary of those experiments, we reported the overall results combined across the four quarters. We also reported costs separately for Q26 and Q27, which demonstrated the potential savings of the Q27 approach. For completeness, we include those results here as well. Table 1 shows the cumulative Q24-Q27 response rates and yield by treatment.
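For reference, the unweighted and weighted rates shown in Tables 1 through 6 can be read as follows. This is a minimal sketch assuming that the weighted rates use the base sampling weights; the exact NSFG response rate definitions are not restated in this memo, and the function below is hypothetical.

```python
# Minimal sketch; assumes "weighted" means base-weighted. Hypothetical helper.
def response_rate(cases, weighted=False):
    """cases: iterable of (responded, eligible, weight) tuples for one treatment arm."""
    num = sum((w if weighted else 1.0) for responded, eligible, w in cases if eligible and responded)
    den = sum((w if weighted else 1.0) for responded, eligible, w in cases if eligible)
    return num / den
```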



Table 1. Q24-Q27 Response Rates, Eligibility and Yield by Treatment Condition

                                       Mail+$2    Mail      Control
Sample Size                            589        588       1,536
Mail Return Rate                       18.0%      10.9%     NA
Unweighted Eligibility Rate            20.0%      16.5%     19.6%
Unweighted Screening Response Rate     96.4%      97.2%     97.6%
Unweighted Main Response Rate          78.7%      77.8%     81.0%
Weighted Screening Response Rate       93.3%      94.1%     94.4%
Weighted Main Response Rate            63.4%      61.3%     72.6%
Completed Interviews                   59         49        158



We did find lower response rates in the mailed conditions. We think this is partly explained by the shortened face-to-face field period (for quarters other than Q26): two weeks of the 12-week face-to-face field period were lost to the mailed survey field period. Related to these response rates are the costs associated with each treatment. Table 2 presents the detailed costs by treatment. Although the "Mail+$2" treatment did reduce screening costs, the main interviewing was more expensive for these cases, which led that treatment to have higher total costs per completed interview than the control group. The highest costs were observed in the mailed screener arm that did not include an incentive.

Table 2. Detailed Costs by Treatment, Q24-Q27

                                                         Mail+$2     Mail        Control
Incentive for Sample Size = 700                          $1,400.00   $0.00       $0.00
Addl Mailing Materials for Sample Size = 700             $350.00     $350.00     $0.00
Implementation of Mailing for Sample Size = 700          $2,121.00   $2,121.00   $0.00
Mailer for Sample Size = 700                             $4,655.00   $4,655.00   $0.00
Percentage of Sample Resulting in Completed Main Iw      10.0%       8.3%        10.3%
Mailing Costs per Completed Main Iw                      $121.59     $122.16     $0.00
Face-to-Face Screening Costs per Completed Screening Iw  $39.47      $41.60      $52.15
Face-to-Face Screening Costs per Completed Main Iw       $323.78     $412.64     $410.61
Main Interviewing Costs per Completed Iw (main only)     $176.95     $179.54     $152.77
Total Costs per Completed Main Iw                        $622.32     $714.34     $563.38
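As an arithmetic check on Table 2, the total cost per completed main interview in each arm appears to be the sum of the mailing, face-to-face screening, and main interviewing costs per completed interview:

```python
# Check using the per-interview cost components shown in Table 2.
components = {
    "Mail+$2": (121.59, 323.78, 176.95),  # mailing, FtF screening, main interviewing
    "Mail":    (122.16, 412.64, 179.54),
    "Control": (0.00,   410.61, 152.77),
}
for arm, parts in components.items():
    print(f"{arm}: ${sum(parts):,.2f} per completed main interview")
# Reproduces the bottom row of Table 2: $622.32, $714.34, $563.38.
```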

Having observed these results in Q24 and Q25, we decided to modify the protocol for Q26 and Q27. The response rates observed during Q26 are presented in Table 3. We found that mailing screeners two weeks earlier did help address the lower main interview rate, although with only one quarter of data, any conclusions need to be tentative.

Table 3. Q26 (early mailing) Response Rates, Eligibility and Yield by Treatment Condition

                                       Mail+$2    Mail      Control
Sample Size                            150        150       353
Mail Return Rate                       20.0%      8.7%      NA
Unweighted Eligibility Rate            22.5%      17.6%     19.7%
Unweighted Screening Response Rate     97.7%      98.3%     98.6%
Unweighted Main Response Rate          77.8%      85.0%     82.2%
Weighted Screening Response Rate       94.5%      96.2%     97.7%
Weighted Main Response Rate            63.0%      80.2%     72.6%
Completed Interviews                   21         17        37


Table 4 shows the results from Q27. Here, the response rates across conditions were similar. It was difficult to predict costs given the operational issue with the incentive randomization described above. However, we did present results that attempted to model the impact of incentives using the observed data. This analysis (presented in a previous memorandum) supported the hypothesis that the standard mailer with a $2 incentive was cost-effective, which led to the proposal to conduct two more quarters of experimentation.


Table 4. Q27 (standard mailer) Response Rates, Eligibility and Yield by Treatment Condition

                                       Mail+$2    Mail      Control
Sample Size                            150        150       300
Mail Return Rate                       13.3%      10.7%     NA
Unweighted Eligibility Rate            16.5%      20.0%     14.9%
Unweighted Screening Response Rate     96.8%      97.7%     98.3%
Unweighted Main Response Rate          75.0%      75.0%     69.2%
Weighted Screening Response Rate       92.8%      94.1%     97.0%
Weighted Main Response Rate            63.5%      54.9%     60.5%
Completed Interviews                   12         15        18



It was decided to conduct two more quarters of experimentation with the protocol deployed in Q27 (i.e., the standard large envelope). We also decided that, since the balance of the evidence indicated that the $2 incentive was cost-effective, we would field a single experimental arm. This allowed us to accumulate 600 experimental cases over the two quarters. The response rates for the two groups ("Standard Mailer+$2" and "Control") are presented in Table 5. The weighted main response rate is somewhat smaller in the experimental condition (standard mailer with a $2 incentive); however, this difference is not statistically significant.


Table 5. Response Rates, Eligibility and Yield by Treatment Condition (Q31 and Q32 Combined)

                                       Standard Mailer+$2    Control
Sample Size                            600                   1,075
Mail Return Rate                       18.0%                 NA
Unweighted Eligibility Rate            17.3%                 13.8%
Unweighted Screening Response Rate     96.4%                 95.8%
Unweighted Main Response Rate          81.7%                 82.2%
Weighted Screening Response Rate       93.7%                 93.2%
Weighted Main Response Rate            71.3%                 77.9%
Completed Interviews                   58                    83
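As a rough, simplified illustration of the comparison of weighted main response rates noted in the paragraph preceding Table 5 (71.3% vs. 77.9%), a naive two-proportion z-test is sketched below. This is not the design-based test that would be appropriate for NSFG data: the eligible-case counts are back-of-the-envelope assumptions (sample size times the unweighted eligibility rate), and the survey weights and clustering are ignored.

```python
# Rough illustration only; ignores weights, clustering, and the complex sample design.
from math import sqrt

n_trt = round(600 * 0.173)    # ~104 eligible cases, assumed from Table 5
n_ctl = round(1075 * 0.138)   # ~148 eligible cases, assumed from Table 5
p_trt, p_ctl = 0.713, 0.779   # weighted main response rates from Table 5

p_pool = (p_trt * n_trt + p_ctl * n_ctl) / (n_trt + n_ctl)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_trt + 1 / n_ctl))
z = (p_trt - p_ctl) / se
print(f"z = {z:.2f}")  # roughly -1.2, well inside +/-1.96, consistent with "not significant"
```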



Table 6 examines the detailed costs of the treatment and control conditions. We find that with the observed mail return rate (Table 5) and the screening and main interview response rates in each condition, the mailed screener did produce cost savings.



Table 6. Estimated Detailed Costs (Q31 & Q32 Cumulative)

                                                         Standard Mailer+$2    Control
Incentive for Sample Size = 700                          $1,400.00             NA
Addl Mailing Materials for Sample Size = 700             $350.00               NA
Implementation of Mailing for Sample Size = 700          $2,121.28             NA
Mailer for Sample Size = 700                             $1,085.00             NA
Percentage of Sample Resulting in Completed Main Iw      9.7%                  7.7%
Mailing Costs per Completed Main Iw                      $73.22                $0.00
Face-to-Face Screening Costs per Completed Screening Iw  $36.49                $41.92
Face-to-Face Screening Costs per Completed Main Iw       $322.73               $464.14
Main Interviewing Costs per Completed Iw (main only)     $177.48               $164.94
Total Costs per Completed Main Iw                        $573.43               $629.08
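The mailing cost row in Table 6 can be approximately reconstructed from the fixed mailing costs (scaled to a sample of 700) and the share of sample resulting in a completed main interview; small differences reflect rounding of the displayed percentage. A sketch of that arithmetic, using only values shown in Table 6:

```python
# Reconstruction of the mailing cost per completed main interview (Standard Mailer+$2 arm).
mailing_total = 1400.00 + 350.00 + 2121.28 + 1085.00   # $4,956.28 per 700 sampled cases
completes = 700 * 0.097                                # ~67.9 completed main interviews
print(f"${mailing_total / completes:,.2f} per completed main interview")  # ~$73, vs. $73.22 shown

# Difference in total cost per completed main interview (control minus treatment):
print(f"Savings: ${629.08 - 573.43:,.2f} per completed main interview")   # $55.65
```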



Given the experimentation conducted in Q31 and Q32, we would recommend adopting a mailed screener, with the other features noted above, for the likely ineligible stratum. It does appear that this design feature will lead to cost savings.

We note that this approach might lead to lower main interview response rates; we do not yet have sufficient evidence on this question. However, our focus on the low-eligibility stratum limits any potential damage. As can be seen from the counts in the bottom row of Table 5, relatively few interviews are obtained from this group.

It might be worth expanding the use of mailed screeners in several ways. First, we recommend mailing the screener prior to the face-to-face field period. Although we did not do this during Q31 and Q32, we did so during Q26 and found that it improved phase 1 response rates. This might also lead to an improved main interview response rate. Second, we recommend considering the use of a mailed screener for cases in other strata (e.g., the middle and high eligibility strata). We would expect sample members in these strata to be somewhat less likely to respond to a mail survey, but this is worth testing to see whether additional cost savings can be found.
