ATUS Returned Mail Summary and Analysis *Working draft*
Each month, the American Time Use Survey (ATUS) sample is drawn from the set of Current Population Survey (CPS) cases that completed their eighth interview two months prior. Using a stratified random design that considers demographic variables such as sex, age, and race, the ATUS sampling scheme randomly selects households and then randomly selects one member of each household to be interviewed. Only after the sample has been created are the cases divided into incentive and non-incentive cases. Cases without any phone number, or with an obviously incorrect number (all 0’s, 1’s, etc.), are assigned as incentive cases.
The input file review process includes a review by ATUS subject matter staff of the address fields in preparation for mailing. This step is undertaken to improve the likelihood that the advance materials reach the household before calls are made. The ‘best’ address is pieced together from the physical address, the mailing address, and the CPS case notes. An initial data file with address information is sent to the National Processing Center (NPC), where it is reviewed again using the United States Postal Service (USPS) Postalsoft software. The Postalsoft program typically identifies about 80 cases per month as having inadequate or undeliverable addresses. These cases are returned to the ATUS staff, who scrutinize them more closely, correct them where possible, and return them to the NPC to prepare the monthly mailout.
When a piece of mail sent by the NPC cannot be delivered, the post office returns the item with an explanation of why it was undeliverable. Personnel at the NPC translate the explanation from the post office into a numeric equivalent. For example, the post office handwrites or applies a sticker to an ATUS mail piece that says "Insufficient Address," or "IA." The NPC staff assigns check-in code 05 and checks it into the Automated Tracking and Control System (ATAC). Summary reports from the ATAC are produced and reviewed on a quarterly basis. This analysis uses case specific information from the ATAC system in an effort to identify areas of improvement in the address review process and to explore improvements to case outcome assignment.
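Conceptually, this check-in step is a lookup from the USPS endorsement text to a numeric ATAC code. The sketch below is illustrative only: it covers just the codes documented in this report (05, 06, and 08), and the dictionary, function name, and fallback value are invented for the example rather than taken from the NPC procedures.

```python
# Illustrative sketch of the NPC check-in step. Only the three
# endorsement-to-code pairs documented in this report are included; the
# names and the fallback value are assumptions, not the actual ATAC list.
USPS_ENDORSEMENT_TO_CHECKIN = {
    "INSUFFICIENT ADDRESS": "05",
    "MOVED LEFT NO ADDRESS": "06",
    "ADDRESS CORRECTION PROVIDED BY POST OFFICE": "08",
}

def check_in_code(endorsement: str) -> str:
    """Translate a USPS endorsement (handwritten note or sticker) into a
    numeric check-in code; unrecognized text is left for manual review."""
    return USPS_ENDORSEMENT_TO_CHECKIN.get(endorsement.strip().upper(), "MANUAL REVIEW")

print(check_in_code("Insufficient Address"))  # -> "05"
```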
Although the address review requires significant effort, it has been assumed that the review increases the probability that a respondent receives the advance mailing materials and is therefore more likely to complete the survey. This analysis will try to quantify the improvement in response rates due to the address review process.
Another purpose of this investigation is to determine the impact of using information from the USPS to assign the final outcome code. It may be possible to assign the final outcome of non-interviews more accurately using the additional information provided by the USPS. For instance, a respondent who moved could be assigned a “Not eligible” outcome instead of a “Non-contact” if the returned advance letter was marked “Moved – No forwarding address.” This would improve the accuracy of the final outcome assignment, which currently relies only on information collected through attempts to contact the respondent by telephone, and could also result in a higher calculated response rate for the ATUS.
Another area of interest is to track returned mail rates for incentive cases. Incentive cases are of particular concern because the advance materials are the only way an incentive recipient is aware he or she has been selected for the survey. Anecdotal observations suggest that these cases tend to have irregular and incomplete mailing addresses. The address quality may impede or delay receipt of the advance materials. Focusing efforts during the address review on incentive cases may improve the address quality and could result in a higher response rate for this group.
1. Data Used in Analysis
2. Analysis
2.1 Quantitative Assessment of Address Quality & Address Review Process
The multiple data sources were assembled and used to calculate four metrics of address ‘quality’, summarized in Table 1.
Table 1 Summary of USPS Returned Mailings
(April 2005 – September 2007 Panels)
| | Total Cases |
| Number of Cases from April 2005 – September 2007 | 63,119 |
| Number of Completed Diaries (001, 002) | 31,609 |
| Percent Completed Diaries | 50.1 |
| Number of Advance Letters Returned | 2,703 |
| Percentage Returned | 4.28 |
| Number of Completed Diaries (001, 002) | 397 |
| Percent Completed Diaries | 14.69 |
| Number of Postcards Returned | 3,074 |
| Percentage Returned | 4.87 |
| Number of Completed Diaries (001, 002) | 604 |
| Percent Completed Diaries | 19.65 |
| Number of USPS Address Corrections | 1,861 |
| Percentage Corrected | 2.95 |
| Number of Completed Diaries (001, 002) | 179 |
| Percent Completed Diaries | 9.62 |
| Number of NPC Postalsoft Rejected Addresses* | 1,056 |
| Percentage Rejected | 1.67 |
| Number of Completed Diaries (001, 002) | 473 |
| Percent Completed Diaries | 44.79 |
*Data from the NPC review program are available starting with the February 2006 panel.
Address quality is measured by how often the advance letters are returned, how often the postcards are returned, the number of address corrections by the USPS, and the number of addresses rejected by Postalsoft.
The USPS returned the two main ATUS mailings, the advance letter and the postcard, about 4-5% of the time. The other two types of ATUS mailings, the advance letter remail and the refusal conversion letter, occur so rarely that they are not studied further in this analysis. We were interested to discover that the postcards are returned at a higher rate than the advance materials, even though the same address is used. This may be because the advance materials are mailed in a larger Priority Mail package, which may improve deliverability.
In addition to returning the physical materials, the USPS provides a corrected address on the majority of the returned mail, accounting for about 3.0% of the total sample. These frequencies are of a similar magnitude to those observed for the returned postcards. This particular mail return code is examined more fully in Section 2.2.
The last item in Table 1 is the frequency of ‘rejects’ produced by the internal address validation performed by the NPC using the Postalsoft address review program. Information about the Postalsoft rejects was only available for February 2006 through September 2007. For this period, the Postalsoft program rejected 1,056, or 2.5%, of 41,997 cases. Looking at the advance letter returns alone, 842 of the 1,056 rejected cases, or 80.0%, were not returned by the USPS. We also looked at this from the other direction: for the same period there were 1,744 returned advance letters, of which the Postalsoft program had flagged only 214, or 12.3%. In other words, most of the cases flagged by the Postalsoft program are never actually returned by the USPS, and most returned letters were never flagged. A rejected address does not necessarily mean that the mailing will not reach the designated person; in fact, around 45.0% of the cases rejected by the Postalsoft review still resulted in a completed interview.
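The overlap figures above amount to simple set comparisons between the cases flagged by Postalsoft and the cases whose advance letters were returned. A minimal pandas sketch is shown below; the file name and the flag columns (`postalsoft_reject`, `adv_letter_returned`) are hypothetical placeholders for whatever the case-level extract actually contains.

```python
import pandas as pd

# Hypothetical case-level file covering the February 2006 - September 2007
# panels, with one row per case and 0/1 flag columns.
cases = pd.read_csv("atus_cases_feb2006_sep2007.csv")

rejected = cases["postalsoft_reject"] == 1      # 1,056 cases in this report
returned = cases["adv_letter_returned"] == 1    # 1,744 cases in this report

# Share of Postalsoft rejects whose letters were never returned (~80%).
print((rejected & ~returned).sum() / rejected.sum())

# Share of returned letters that Postalsoft had flagged (~12%).
print((rejected & returned).sum() / returned.sum())
```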
The data were also used to compute frequencies of USPS returned mailings by ATUS sampling panel, as shown in Table 2. While there is some variation between sampling panels, the rates for each panel appear fairly consistent over the analysis period.
Table 2 Frequencies of USPS Returned Mailings by Sampling Panel
(April 2005 – September 2007 Panels)
| Panel (yymm) | Total Cases | Number Adv LTR Returned | Percent Adv LTR Returned | Number Postcards Returned | Percent Postcards Returned | Number USPS Corrected | Percent USPS Corrected |
| 0504 | 2,106 | 102 | 4.8 | 103 | 4.9 | 66 | 3.1 |
| 0505 | 2,115 | 105 | 5.0 | 107 | 5.1 | 77 | 3.6 |
| 0506 | 2,120 | 109 | 5.1 | 133 | 6.3 | 86 | 4.1 |
| 0507 | 2,113 | 113 | 5.3 | 130 | 6.2 | 96 | 4.5 |
| 0508 | 2,098 | 120 | 5.7 | 136 | 6.5 | 70 | 3.3 |
| 0509 | 2,122 | 95 | 4.5 | 109 | 5.1 | 58 | 2.7 |
| 0510 | 2,118 | 96 | 4.5 | 124 | 5.9 | 61 | 2.9 |
| 0511 | 2,100 | 93 | 4.4 | 108 | 5.1 | 63 | 3.0 |
| 0512 | 2,113 | 72 | 3.4 | 130 | 6.2 | 72 | 3.4 |
| 0601 | 2,116 | 55 | 2.6 | 99 | 4.7 | 57 | 2.7 |
| 0602 | 2,117 | 86 | 4.1 | 89 | 4.2 | 57 | 2.7 |
| 0603 | 2,101 | 88 | 4.2 | 115 | 5.5 | 50 | 2.4 |
| 0604 | 2,119 | 70 | 3.3 | 95 | 4.5 | 46 | 2.2 |
| 0605 | 2,099 | 85 | 4.0 | 118 | 5.6 | 60 | 2.9 |
| 0606 | 2,106 | 89 | 4.2 | 97 | 4.6 | 57 | 2.7 |
| 0607 | 2,099 | 102 | 4.9 | 92 | 4.4 | 69 | 3.3 |
| 0608 | 2,118 | 93 | 4.4 | 99 | 4.7 | 67 | 3.2 |
| 0609 | 2,098 | 90 | 4.3 | 107 | 5.1 | 67 | 3.2 |
| 0610 | 2,101 | 87 | 4.1 | 104 | 5.0 | 59 | 2.8 |
| 0611 | 2,088 | 85 | 4.1 | 75 | 3.6 | 50 | 2.4 |
| 0612 | 2,103 | 72 | 3.4 | 65 | 3.1 | 48 | 2.3 |
| 0701 | 2,099 | 74 | 3.5 | 96 | 4.6 | 64 | 3.0 |
| 0702 | 2,099 | 118 | 5.6 | 70 | 3.3 | 63 | 3.0 |
| 0703 | 2,111 | 77 | 3.6 | 60 | 2.8 | 50 | 2.4 |
| 0704 | 2,087 | 80 | 3.8 | 86 | 4.1 | 55 | 2.6 |
| 0705 | 2,091 | 78 | 3.7 | 103 | 4.9 | 74 | 3.2 |
| 0706 | 2,085 | 90 | 4.3 | 105 | 5.0 | 56 | 2.8 |
| 0707 | 2,099 | 112 | 5.3 | 104 | 5.0 | 68 | 2.6 |
| 0708 | 2,080 | 89 | 4.3 | 101 | 4.9 | 63 | 3.0 |
| 0709 | 2,097 | 78 | 3.7 | 114 | 5.4 | 54 | 2.6 |
| Avg. | 2,104 | 90 | 4.3 | 102 | 4.9 | 63 | 3.0 |
It was not possible to directly quantify the benefit of the address review process due to multiple changes made during the analysis period. There were changes in the source CPS data due to the WebCATI conversion, as well as changes to the Postalsoft program used by the NPC. In addition, changes were made each month to the subject matter staff review processes and the automated review program. Based on the preceding, however, we conclude that the time-consuming aspects of the address review should be devoted to reviewing the Postalsoft rejects and the incentive cases. More discussion of the incentive cases appears in Section 2.3.
2.2 Conversion of Cases to Ineligible Due to Advance Letter Returns
The goal of this section is to assess whether the returned mail information can be used to assign the final case outcome code.
For the purposes of this analysis, we assume that the returned mail information is reliable and correct. However, given the small percentage of instances where the returned mail code is incompatible with the final case outcome, we acknowledge that its reliability is, at best, unknown. When telephone attempts yield “unknown eligibility” outcomes, it is possible that the information provided by the returned mail code could be used to assign a more accurate final outcome code. Of the 63,119 total advance mailings from the April 2005 through September 2007 panel months, 2,703 (4.3%) were returned by the USPS. Of these 2,703 cases, 315 are incentive cases; they are reviewed in more detail in Section 2.3. There are 32 different reason codes that describe why a piece of mail was returned. As mentioned before, the return mail codes are assigned by the NPC using procedures not specifically designed for the ATUS.
To begin our analysis, we first estimated the maximum impact of reassigning case outcomes using the mail return code. We assumed that all 2,703 returned mail cases, including incentive cases, were in fact not eligible for the ATUS (for instance, because the designated person had moved). Reassigning all of these cases from their original outcomes of non-contact, other, or unknown eligibility to not eligible resulted in an average response rate improvement of 1.22%. This is the maximum increase possible if outcomes were coded according to the mail return information.
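The mechanics of this recalculation are sketched below using a simplified, unweighted response rate (completed diaries divided by eligible cases); the official ATUS rate may be computed differently, and the example inputs are illustrative only, so the sketch merely shows how removing reassigned cases from the denominator raises the rate.

```python
def simple_response_rate(completes: int, eligible: int) -> float:
    """Unweighted response rate: completed diaries / eligible cases."""
    return completes / eligible

def rate_gain(completes: int, eligible: int, reassigned: int) -> float:
    """Increase in the simple response rate when `reassigned` eligible
    non-interviews are recoded as not eligible (dropped from the denominator)."""
    return (simple_response_rate(completes, eligible - reassigned)
            - simple_response_rate(completes, eligible))

# Illustrative numbers: with a 53% base rate and roughly 1,700 eligible cases
# per panel, recoding 15 cases raises the rate by about half a percentage
# point, in line with the mail return code 08 result reported later.
print(rate_gain(completes=901, eligible=1700, reassigned=15))
```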
In the next part of the analysis, we attempted to identify specific cases whose final outcome could be reassigned based on the mail return code. We limited the 2,703 returned mail cases to the non-incentive cases and identified the most common of the 32 possible mail return codes. Table 3 lists the ten most common mail return codes, their definitions, and the distribution of final outcome codes for cases where the advance letter was returned. We selected mail return codes 06 and 08 for further analysis.
Table 3 Summary of Top 10 Returned Mail Codes
(Non-Incentive Cases, April 2005 – September 2007 Panels)
| Code | Description | Complete + Partial | Refusal | Non-contact | Other | Not Eligible | Unknown | Total |
| 8 | Address correction provided by Post Office | 71 | 92 | 78 | 21 | 399 | 352 | 1,013 |
| 2 | Attempted Not Known | 64 | 34 | 24 | 16 | 70 | 94 | 302 |
| 19 | Not Deliverable As Addressed Unable to Forward | 69 | 65 | 23 | 14 | 44 | 83 | 298 |
| 12 | No Mail Receptacle | 52 | 34 | 13 | 13 | 14 | 16 | 142 |
| 6 | Moved Left No Address | 9 | 11 | 17 | 1 | 33 | 62 | 133 |
| 24 | Temporarily Away | 15 | 10 | 13 | 7 | 8 | 31 | 84 |
| 15 | No Such Number | 34 | 17 | 7 | 4 | 4 | 16 | 82 |
| 5 | Insufficient Address | 19 | 10 | 5 | 5 | 9 | 6 | 54 |
| 17 | No Such Street | 20 | 11 | 7 | 0 | 3 | 3 | 44 |
| 29 | Vacant | 4 | 4 | 3 | 2 | 2 | 12 | 27 |
| | TOTAL (All Codes) | 397 | 317 | 218 | 95 | 622 | 739 | 2,388 |
| | % TOTAL | 16.6% | 13.3% | 9.1% | 4.0% | 26.1% | 31.0% | |
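A distribution like Table 3 is a crosstab of mail return code by final outcome group. A minimal pandas sketch is shown below; the file name and column names are hypothetical stand-ins for the case-level ATAC extract.

```python
import pandas as pd

# Hypothetical extract with one row per returned advance letter and assumed
# columns: mail_return_code (e.g., 6, 8, 19) and outcome_group (e.g.,
# "Comp+Partial", "Refusal", "Non-contact", "Other", "Not Eligible", "Unknown").
returns = pd.read_csv("returned_advance_letters_nonincentive.csv")

table3 = pd.crosstab(
    returns["mail_return_code"],
    returns["outcome_group"],
    margins=True,           # adds the TOTAL row and column
    margins_name="TOTAL",
)

# Ten most common return codes, as listed in Table 3.
top10 = table3.drop(index="TOTAL").sort_values("TOTAL", ascending=False).head(10)
print(top10)
print(table3.loc["TOTAL"])  # totals across all 32 codes
```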
To discern whether the final outcome matches the mail return code, we examined mail return code 06 as an example. Mail return code 06 is defined as “Moved Left No Address.” This code is of interest because it indicates that the addressee no longer lives at that address and is therefore not eligible for the survey. The final outcomes of the cases that received mail return code 06 are summarized in Table 4.
Table 4 Summary of Mail Return Code 06
(April 2005 – September 2007 Panels)
| | Number of Cases | % of Total | Action |
| Number of Cases | 63,119 | | |
| Number of Returned Advance Letters | 2,703 | 4.3% | |
| Number Coded as Moved Left No Address (Code 06) | 133 | | |
| Completed | 9 | 6.8% | Leave as Complete |
| Refused | 11 | 8.3% | Leave as Refused |
| Non-Contact | 17 | 12.8% | Reassign |
| Other | 1 | 0.8% | Reassign |
| Not Eligible | 33 | 24.8% | Correctly Assigned |
| Unknown | 62 | 46.6% | Reassign |
| Total Number to be Reassigned | 80 | | |
| Average Number per Panel | 2.7 | | |
| Average Increase in Response Rate | 0.10% | | |
As indicated in Table 4, the mail return code does not always correspond to the final outcome assigned by the telephone center. Our expectation, however, was that mail return code 06 would correspond to the Not Eligible category, and to the Mover outcome (021.003) in particular. We found that almost half (46.6%) of the cases that received mail return code 06 received a final outcome of Unknown Eligibility instead of Not Eligible. Only 24.8% of the mail return code 06 cases received a final outcome code of Not Eligible. Of the cases coded Not Eligible, 84.9% were assigned outcome code 021.003 (Mover) by the telephone center, which matches the mail return code. Of the cases coded as Unknown Eligibility, 67.7% were assigned outcome code 022.002 (Unknown eligibility: sample unit not found/unreached). In these cases, the telephone center was unable to determine whether or not the designated person lived at that address. It is possible that some of these cases are true movers, but the current procedure relies on the telephone center to discern this information through phone contact rather than relying on returned mail.
We next explored reassigning outcomes only for cases that received mail return code 06 and determined the impact on the calculated response rate. For this calculation, we included the incentive cases, since we had included them in the earlier recalculated response rate. As indicated in Table 4, 24.8% of the time the final outcome code corresponds to the return mail code, i.e., Not Eligible. For the remaining cases, the return mail code does not support the final outcome code. Completed and refused interviews were excluded from reassignment; in other words, if a mail return indicated that the respondent had moved but the case was a completed or refused interview, we deferred to the outcome assigned by the telephone center and did not reassign the final outcome code. The remaining 80 cases with final outcome codes of Non-Contact, Other, or Unknown Eligibility were reassigned as Not Eligible. This recalculation increases the response rate by an average of 0.1%. The improvement is very modest because few cases are assigned mail return code 06.
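In general, the size of such a gain depends on how many cases are removed from the eligible base. Under the simplified, unweighted rate sketched earlier, with $C$ completed diaries, $E$ eligible cases, and $r$ cases recoded as not eligible, the increase is

$$\Delta \mathrm{RR} = \frac{C}{E-r} - \frac{C}{E} = \mathrm{RR}\cdot\frac{r}{E-r},$$

so with only about three code 06 cases reassigned per panel, the gain stays on the order of a tenth of a percentage point, consistent with the 0.10% reported above.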
We then examined the most prevalent return mail code, 08, to further gauge the impact of reassigning outcomes by mail return code. Mail return code 08 is defined as “Address correction provided by Post Office” and accounts for 42.4% of the returned mail. The final outcomes of the cases that received mail return code 08 are summarized in Table 5.
Table 5 Summary of Mail Return Code 08
(April 2005 – September 2007 Panels)
| | Number of Cases | % of Total | Action |
| Number of Cases | 63,119 | | |
| Number of Returned Advance Letters | 2,703 | 4.3% | |
| Number Coded as Address Correction Provided by Post Office (Code 08) | 1,013 | | |
| Completed | 71 | 7.0% | Leave as Complete |
| Refused | 92 | 9.1% | Leave as Refused |
| Non-Contact | 78 | 7.7% | Reassign |
| Other | 21 | 2.1% | Reassign |
| Not Eligible | 399 | 39.4% | Correctly Assigned |
| Unknown | 352 | 34.8% | Reassign |
| Total Number to be Reassigned | 451 | | |
| Average Number per Panel | 15.0 | | |
| Average Increase in Response Rate | 0.46% | | |
The mail return code 08 could indicate a mover, but it could also mean that the address we have for the case is incorrect or insufficient. As with mail return code 06, cases that received this code are most likely to receive a final outcome of Not Eligible (39.4%) or Unknown (34.8%). Looking at the final outcomes, 86.0% of the Not Eligible cases that received mail return code 08 received a final outcome of 021.003, Mover. Of the cases coded as Unknown, 83.2% were assigned outcome 022.002 (Unknown eligibility: sample unit not found/unreached). Based on this information, mail return code 08 appears to be a good indicator that the designated person moved. To further research mail return code 08, we looked at the addresses that the USPS returned. If the USPS writes a new address on the mail piece, the NPC captures that address in the ATAC system. We compared these addresses to the original mailing addresses and found that in most cases they were not similar: a different street, city, ZIP code, and sometimes even state. We were also interested in the completed cases that had received mail return code 08, because this seemed to suggest that movers were interviewed. Upon review, we found that in most of these cases the USPS-provided address was similar to the original address or was a P.O. Box, so it appears that the designated person was interviewed while residing at the same address.
As with mail return code 06, we recalculated the response rate after reassigning outcomes for cases that received mail return code 08. As indicated in Table 5, 39.4% of the time the return mail code corresponds to the final outcome of Not Eligible. The remaining 451 cases with final outcome codes of Non-Contact, Other, or Unknown Eligibility were reassigned as Not Eligible. This reassignment reduced the number of eligible cases each month by an average of 15.0 cases and therefore increased the response rate by an average of 0.46%, from 53.0% to 53.5%. This demonstrates that even converting all mail return code 08 cases (less completes and refusals) to Not Eligible would not result in a significant gain. Further, this approach would rely on information supplied by the Post Office, and it is not clear whether that information is necessarily accurate.
2.3 Incentive Case Return Rates
Table 7 Summary of USPS Returned Mailings by Incentive Status
(April 2005 – September 2007 Panels)
| | Non-Incentive Cases | Incentive Cases | Total Cases | Incentive Percent |
| Number of Cases | 59,651 | 3,468 | 63,119 | 5.49 |
| Number of Completed Diaries (001, 002) | 30,525 | 1,084 | 31,609 | |
| Percent Completed Diaries | 51.2 | 31.3 | | |
| Advance Letter Returned | 2,388 | 315 | 2,703 | 11.65 |
| Percentage Returned | 4.00 | 9.08 | 4.28 | |
| Completed Diary Interviews | 397 | 0 | 397 | |
| Completed Diary Interviews % | 16.62 | 0.00 | 14.69 | |
| Postcard Returned | 2,691 | 383 | 3,074 | 12.46 |
| Percentage Returned | 4.51 | 11.04 | 4.87 | |
| Completed Diary Interviews | 565 | 39 | 604 | |
| Completed Diary Interviews % | 21.00 | 10.18 | 19.65 | |
| USPS Sent NPC Address Correction | 1,712 | 149 | 1,861 | 8.01 |
| Percentage Corrected | 2.87 | 4.30 | 2.95 | |
| Completed Diary Interviews | 162 | 17 | 179 | |
| Completed Diary Interviews % | 9.46 | 11.41 | 9.62 | |
| NPC Postalsoft Rejected Addresses | 963 | 93 | 1,056 | 8.81 |
| Percentage Rejected | 1.61 | 2.68 | 1.67 | |
| Completed Diary Interviews | 450 | 23 | 473 | |
| Completed Diary Interviews % | 46.73 | 24.73 | 44.79 | |
The percentage of incentive cases that completed the survey is only 31.3%, compared to 51.2% for non-incentive cases. Incentive cases also have a higher “return to sender” rate for the two main ATUS correspondence types, the advance letter and the postcard reminder. The USPS returns the advance letters of non-incentive cases 4.0% of the time, while incentive advance letters are returned 9.1% of the time. Similarly, the USPS returned the postcard reminder at a slightly higher rate than the advance letter, but with a similar ratio between incentive and non-incentive cases (11.0% vs. 4.5%). Since the advance letter and reminder postcard are the only communication links with an incentive respondent, the interview cannot be completed unless the respondent receives the material.
3. Conclusions
Summary
The following is a summary of our findings:
- Incentive cases are about twice as likely as non-incentive cases to have the advance letter returned (9.1% vs. 4.0%);
- Non-incentive cases whose advance letter was returned completed the survey at about one-third the rate of cases without a returned letter (16.6% vs. 51.2%);
- Converting all returned-mail cases currently coded as eligible non-interviews to not-eligible outcomes would improve the overall response rate by only 1.22%;
- Mail return code 06 (Moved Left No Address) most often corresponds to a final outcome of Unknown Eligibility; when the final outcome is Not Eligible, it is most often specifically coded as a mover;
- Converting only mail return code 06 cases from eligible non-interview to not-eligible outcomes would improve the overall response rate by only 0.10%;
- Mail return code 08 (Address Correction Provided by Post Office) most often corresponds to a final outcome of Not Eligible, and these cases are usually specifically coded as movers;
- Converting the most common mail return code, 08, from eligible non-interview to not-eligible outcomes would improve the overall response rate by only 0.46%.
While we were unable to directly measure the benefit of the input review process, we did see that the Postalsoft program identifies cases that are more likely than average to have their mail returned. We have revised our input review process to focus more on reviewing the Postalsoft rejects. In addition, because return rates are higher for incentive cases, we are devoting more time to the address review of these cases in an effort to increase the likelihood of participation.
As expected, cases that do not receive the mailings are much less likely to complete the survey. A non-incentive case whose advance letter was returned by the USPS completed the survey about one-third as often as non-incentive cases overall (16.6% vs. 51.2%). For non-incentive cases, this supports the premise that the advance materials influence participation. Receiving the advance materials is vital for incentive cases, since they can contact the telephone center only by using the information in the advance letter. Incentive cases may already be less likely to participate in the survey, and the higher mail return rate exacerbates this already low response rate.