Test | Date | Objective ID | Qs from FY13/FY14 Bus Plans | Objectives/Goals from Test Plans | Results of Objective from Results Reports | Operational Decisions Made (from Operational Plan)

Address Validation Testing | ADC1 | D.f, D.g
Objective was to evaluate our methods for a reengineered address canvassing.
Statistical models we applied were not effective at (a) identifying specific blocks with many Adds or Deletes, or (b) predicting national totals of MAF coverage errors.
Showed that Partial Block Canvassing methodology offers the potential to implement a more efficient approach to canvassing.
o In-Office Address Canvassing
o In-Field Address Canvassing
o Quality Control
o MAF Coverage Study
Address Canvassing.
Address Validation Testing | ADC2 | D.a
Objective was to test how well in-office procedures can replace in-field procedures.
Demonstrated the utility of imagery review to guide decision-making and operational planning for address canvassing.
Demonstrated the value of fieldwork to gather information for use in assessing the effectiveness of in-office methods.
o In-Office Address Canvassing
o In-Field Address Canvassing
o Quality Control
o MAF Coverage Study
to validate addresses within each block
Address Canvassing.
Address Validation Testing | ADC3 | D.f, D.g, D.h
Objective was to assess our ability to ensure an accurate Master Address File (MAF).

Address Validation Test: Part 1 - MAF Model Validation Test | Sept 2014 - Dec 2014 | ADC4 | D.c
Address Validation Test: Part 1 - MAF Model Validation Test | Sept 2014 - Dec 2014 | ADC5
Objective was to collect data to inform components of the Targeted Address Canvassing decision-points.
Summary of results: The statistical models we applied were not effective at
> identifying specific blocks with many Adds or Deletes
> predicting national totals of MAF coverage errors
MAF Error Model
Targeted Address Canvassing, Research, Model, and Classification team
Models for Zero Living Quarters blocks
Results for Statistical Models:
• Determining specific blocks that need additional action:
∙ rate of error capture was too low
∙ rate of erroneous canvass was too high
• Using statistical models to predict national totals of coverage errors on the MAF:
∙ model parameters reflected condition of MAF in 2009
∙ now: only halfway through decade, and MAF has improved under Geographic Support System Initiative
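The two screening metrics cited above reduce to simple rates over block-level predictions and ground truth. As a minimal Python sketch of how they can be computed (the field names and example data are hypothetical, not from the test):

```python
# Hypothetical sketch: scoring a block-targeting model.
# "Error capture" = share of true MAF coverage errors falling in flagged blocks.
# "Erroneous canvass" = share of flagged blocks that contain no errors at all.

def evaluate_targeting(blocks):
    """blocks: list of dicts with 'flagged' (bool) and 'errors' (int)."""
    total_errors = sum(b["errors"] for b in blocks)
    captured = sum(b["errors"] for b in blocks if b["flagged"])
    flagged = [b for b in blocks if b["flagged"]]
    empty_flags = sum(1 for b in flagged if b["errors"] == 0)
    error_capture = captured / total_errors if total_errors else 0.0
    erroneous_canvass = empty_flags / len(flagged) if flagged else 0.0
    return error_capture, erroneous_canvass

# Invented example: the model misses a block dense with adds/deletes
# and flags a clean one -- low capture, high erroneous canvass.
blocks = [
    {"flagged": True, "errors": 0},
    {"flagged": True, "errors": 2},
    {"flagged": False, "errors": 9},
]
capture, erroneous = evaluate_targeting(blocks)
print(f"error capture: {capture:.0%}, erroneous canvass: {erroneous:.0%}")
```

A model is useful for targeting only when the first rate is high and the second low; the results above report the opposite pattern.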
D.a
Concept test Micro-Targeting and uses of Aerial Imagery.
Statistical models were ineffective at measuring MAF coverage error.
Ongoing research will focus on collecting metrics via the MAF Coverage Study.
decade.
using a dependent canvass (from ground to list).
structure with a living quarter.
Address Canvassing
Based on weighted results of imagery review for the 10,100 MMVT blocks:
• 84% of blocks with at least one address are stable.
• These blocks encompass an estimated 85% of all housing units.
• These blocks would be placed in a “passive” category, with ongoing monitoring for change, but not requiring active processing to acquire updates.
• 15% of housing units are located in “active” blocks, with updates acquired through the USPS’ Delivery Sequence File, local government partner files, other administrative or commercial address lists, or fieldwork.
canvassed in the field
Address Validation Test: Part 2 - Partial Block Canvassing | Dec 2014 - Feb 2015 | ADC5 | D.a
Address Validation Test: Part 2 - Partial Block Canvassing | Dec 2014 - Feb 2015 | ADC6, 7
Address Validation Test: Part 2 - Partial Block Canvassing | Dec 2014 - Feb 2015 | ADC7
Address Validation Test: Part 2 - Partial Block Canvassing | Dec 2014 - Feb 2015 | ADC8

Small-Scale Testing | January 2014 and ongoing | CFD1
D.b, D.d, D.e
Test ability to navigate to targeted area/coordinate using locational information produced based on in-office review of imagery.
Test Analysis of Imagery Review Results to Inform In-Office Canvassing:
• Based on weighted results of imagery review for the 10,100 MMVT blocks:
• 84% of blocks with at least one address are stable.
• These blocks encompass an estimated 85% of all housing units.
• These blocks would be placed in a “passive” category, with ongoing monitoring for change, but not requiring active processing to acquire updates.
• 15% of housing units are located in “active” blocks, with updates acquired through the USPS’ Delivery Sequence File, local government partner files, other administrative or commercial address lists, or fieldwork.
Address Canvassing
Collect specified information for use in comparison to information collected for the same block through full block canvassing in the MAF Model Validation Test.
Collect metrics to measure efficiency, cost, etc.
Identify potential issues affecting ability to conduct fieldwork and collect accurate information:
∙ Is imagery required in the field? What other tools/data are needed in the field?
∙ Should updates other than those specified be collected?
∙ How do we limit the scope of work once in the field?
Explore different formats and content for email, text, and automated voice response invitations.
From the opt-in, non-probability panel:
1. A text-based email outperformed graphical emails.
2. Short email subject lines that include the “10-minute” burden and the “U.S. Census Bureau” name seem to perform better than other subject lines, especially those including the word “Help” as the first word in the subject line.
3. Longer email content with “Dear Resident” and the signature of the director, similar to the 2014 Census Test email, outperformed a shorter email invitation without the greeting and signature.
4. Response rates did not differ by link type (whether the full URL or “Click here”) with this population.
5. The time of day the email is sent did not appear to have a big impact on the response rate.
6. Respondents report preferring to respond online to a decennial census with a mailed invitation containing the link over all other options.
2012 National Census Test | CFD2
2014 Census Test | CD: 6/1/14 | CFD3
Evaluate the performance of combined race and origin questions on the Internet.
• Response Distributions
∙ Similar across the two question versions
∙ Item nonresponse lower in the two-part version than the one-part version
• Detailed Reporting
∙ Some differences across the two question versions
∙ Noticeably less detailed reporting in 2012 National Census Test Internet than in 2012 National Census Test paper
∙ Noticeably less detailed reporting in 2012 National Census Test Internet than in 2010 Alternative Questionnaire Experiment (AQE) (paper)
• Results did not indicate expected benefit of enhanced reporting of detailed race and origin groups
∙ Additional research needed
• Predictive Text
∙ Hypothesis: Decrease typos and extraneous characters; lower rate of residual coding
∙ Results not as expected: NCT resulted in a relatively higher rate of residual coding compared to 2012 National Census Test paper responses
This site test was not focused on evaluating content as a main objective.
2015 National Content Test | CFD5
Use nationally representative sample to evaluate and compare different census content: race/origin, relationship, and coverage.

2015 National Content Test | CFD6
Conduct a reinterview to measure accuracy of race/origin and coverage.
2012 National Census Test | CQA1
Assess the Telephone Questionnaire Assistance (TQA) workload.
Major Findings
∙ Use of combined race/Hispanic origin question, compared to separate questions, showed no difference in distribution for most groups
∙ Soliciting write-in race and origin details on a separate screen from the major group checkboxes, compared to on the same screen, results in more detailed reporting
∙ Detailed reporting for major race and Hispanic origin groups varied by question version – combined question saw higher percentages for White, Black, and Hispanic, and lower for Asian and NHOPI
∙ Use of the new relationship question, which includes categories for same-sex and opposite-sex spouses and partners, showed no difference in distributions for each category, though the paper form had slightly higher item nonresponse for the new version
• TQA available throughout data collection
∙ Agents answered questions and took interviews on the questionnaire
• 6,226 calls to TQA (roughly 8% of sample)
∙ 65% resulted in interviews
2012 National Census Test | CQA2
Assess the Telephone Questionnaire Assistance (TQA) reasons for calls.
• Reasons for calls
∙ Reasons recorded for 81% of calls
∙ 69% of those were because respondent did not have computer and/or Internet access
Census processes and frequently asked questions.
problems, lack of access to Internet) by offering to complete the 2020 Census questionnaire instead of offering technical assistance to respondents.
2012 National Census Test | ISR1 | A.a, A.b
Assess relative self-response rates and Internet self-response rates across various contact strategies.

2015 National Content Test | ISR1 | A.a, A.b
Refine estimates of national self-response and Internet response rates and continue testing contact strategies for optimizing self-response.
2012 National Census Test | ISR2 | A.e
Assess relative self-response rates and Internet self-response rates utilizing Internet Push methodology.
• Second Reminder:
∙ Performed well, across multiple treatments
∙ Sending 2nd reminder prior to mailing a paper questionnaire resulted in significant gains in both overall self-response and Internet response; increase in telephone interviews
• Advance Letter:
∙ No significant difference in overall self-response compared to No Advance Letter
• Telephone Number in Initial Mailing:
∙ No significant difference in overall response
∙ Increase in telephone interviews
∙ Operationally inefficient not to include
• Content Tailored to Nonrespondents:
∙ No significant difference in overall response
∙ Recommend continued research
Census.
participation in the Census. Contacts may include some or all of the following: postcard mailings, letter mailings, emails, text messages, pre-recorded telephone messages, questionnaire mailings, and in-person visits by an enumerator.
respond using a unique Census identifier; however, the 2020 Census will allow people to respond without a unique Census ID.
number of languages other than English and Spanish, including those requiring non-Roman alphabets. The languages selected will be based on national prevalence rates of low-English proficiency households and the available technology.
Census.
participation in the Census. Contacts may include some or all of the following: postcard mailings, letter mailings, emails, text messages, pre-recorded telephone messages, questionnaire mailings, and in-person visits by an enumerator.
respond using a unique Census identifier; however, the 2020 Census will allow people to respond without a unique Census ID.
number of languages other than English and Spanish, including those requiring non-Roman alphabets. The languages selected will be based on national prevalence rates of low-English proficiency households and the available technology.
2013 National Census Contact Test | ISR3 | A.c, A.e.iii, A.g
Contact Frame Quality: Evaluate the quality of phone and email contact information acquired from commercial sources.
• Ability to determine the quality of the supplemental contact frame was limited due to respondents not being willing to share all of their available phone numbers and email addresses
• The Contact Frame team learned a limited amount about phone numbers and characteristics of those who were likely to respond to the phone survey, but less about email addresses. More research is needed for those areas.
The Contact Frame team is developing the following future research goals as a result of the 2013 National Census Contact Test analysis:
• Develop an optimal prioritization algorithm to order the phone numbers most likely to have a correct phone-residential address link (see the sketch after this list).
• Conduct phone number and email address verifications with other available Census surveys such as:
o 2010 Census Coverage Followup Operation
o 2010-2012 American Community Survey
o 2012 National Census Test.
• Conduct analysis of phone-residential address links at lower levels of geography (state, county, block, and tract levels), which will require looking at larger datasets.
• Investigate other phone number and email address sources (commercial or government/administrative records sources) to improve the demographic and geographic coverage of phone numbers and email addresses.
• Conduct analysis of residential address-email address links. Perhaps develop a prioritization algorithm to order the email addresses most likely to have a correct residential address-email address link. Also, analyze these links at lower levels of geography.
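To make the first goal concrete: a prioritization algorithm of this kind scores each candidate phone-address link and sorts descending. A minimal Python sketch, where the features, weights, and class are invented for illustration (not the team's actual design):

```python
# Hypothetical sketch of prioritizing phone numbers by the likelihood
# of a correct phone-residential address link.
from dataclasses import dataclass

@dataclass
class PhoneLink:
    phone: str
    source_count: int       # how many commercial sources report this link
    months_since_seen: int  # staleness of the most recent sighting

def link_score(link):
    # Invented weighting: corroboration raises the score, staleness lowers it.
    return link.source_count - 0.1 * link.months_since_seen

def prioritize(links):
    """Return links ordered from most to least likely to be correct."""
    return sorted(links, key=link_score, reverse=True)

candidates = [
    PhoneLink("301-555-0101", source_count=3, months_since_seen=2),
    PhoneLink("301-555-0199", source_count=1, months_since_seen=30),
]
for link in prioritize(candidates):
    print(link.phone, round(link_score(link), 2))
```

In practice the score would come from a model fit on verified links; the sketch only shows the ordering step.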
2014 Census Test | CD: 6/1/14 | ISR4 | A.c, A.d
• “Notify Me”
∙ Postcard solicitation
∙ Respondents select their preferred mode for future invitations and reminders – email or text message
Low participation in “Notify Me”: 3% of invitees participated.
2014 Census Test | CD: 6/1/14 | ISR5 | A.c, A.d, A.e.iii
• Email invitation
∙ Test use of email as initial invitation to respond
∙ Evaluate use of pre-notices (letter and automated voice) to introduce and legitimize email contacts
• Mail Internet invitation
∙ “Internet Push” strategy: letter → postcard → postcard → questionnaire
∙ Test use of email and automated voice reminders
Email not an effective replacement for postal mail:
Over half of the emails were not delivered (“bounced back”).
Response rates 10% lower than control.
2014 Census Test | CD: 6/1/14 | ISR6 | A.e
Internet-push is a successful strategy for generating Internet response:
50.6% of total response was via Internet.
76.8% of self-response was via Internet.
Automated Voice Invitations (AVI) show no impact on response, when used as pre-notice or as a reminder.
respond using a unique Census identifier; however, the 2020 Census will allow people to respond without a unique Census ID.
Internet push letter inviting response to the Census for those areas with Internet access or a paper questionnaire for targeted populations without Internet access (under review).
received will be mailed a paper questionnaire.
2015 OSR | ISR7 | A.c
Continue efforts to increase Self-Response through research and testing of communications strategies prior to awarding a communications contract.
• Test the use of digital targeted advertising methods to engage and motivate respondents
• Assess effectiveness of early announcement offer (“Notify Me”) when paired with advertising

2015 OSR | ISR8 | A.c, A.d
Improve the usability and respondent experience with improved Internet response functionality.
• Provide a mobile-optimized application for Internet self-response
• Study the extent to which encouraging responses without a Census ID will contribute to the national Self-Response and Internet response rates (“Non-ID”)

2015 Census Test | CD: 4/1/15 | ITIN4 | C.e, C.g, C.h, C.j
Test operational implementation of the Bring Your Own Device (BYOD) option for enumeration:
∙ Design, develop, deploy, and support secure software solutions that can be installed on an employee’s personally owned mobile device.
∙ Conduct interviews with respondents using the enumerator-owned mobile device.
∙ Capture lessons learned for future operations, including focus groups with a subset of the respondents, questionnaires for the enumerators, and feedback from the local census office.
• Preliminary Self-Response Results (Mail Panel Design):
> In control panel, a weighted 47.9 percent of sample has responded
> Significantly lower Internet and total response rate for Internet Push without an ID and Notify Me postcard panels
• Preliminary Self-Response Results (Other responses in Savannah):
> Additional postcard mailing resulted in about 8.3 percent response
> Outside of the mail panels, more than 35,000 non-ID responses received – due to advertising and promotional efforts
• Preliminary Self-Response Results (Notify Me):
> Low participation
• 1,925 participants “pre-registered”; of these, 1,341 signed up before the cutoff date and were matched, and of those, 1,203 were in the Savannah area
• Majority selected email as their preferred contact mode
• 93.0 percent of Notify Me participants ultimately responded
> Additional burden may depress response
Internet push letter inviting response to the Census for those areas with Internet access or a paper questionnaire for targeted populations without Internet access (under review).
respondents without Internet access.
and Communications Campaign.
participation in the Census. Contacts may include some or all of the following: postcard mailings, letter mailings, emails, text messages, pre-recorded telephone messages, questionnaire mailings, and in-person visits by an enumerator.
• Preliminary Self-Response Results (Non-ID Processing):
> Address matching and geocoding occurred in real time (during response); then, for addresses that did not match, another attempt was made after Administrative Records data were used to update the addresses by correcting or adding address elements.
telephone agent-assisted response.
geocoding, post real-time processing that will utilize administrative records and third-party data, and manual (interactive) matching and geocoding.
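The matching flow in this result reduces to two passes: attempt a real-time match, and if it fails, enhance the address with administrative-record elements and try again, leaving the remainder for later (including manual) matching. A minimal Python sketch of that control flow, with placeholder helpers standing in for the real matcher and records lookup:

```python
# Hypothetical sketch of the two-pass Non-ID matching flow.
from typing import Optional

def match_and_geocode(address: dict) -> Optional[dict]:
    """Placeholder matcher: succeeds only when a ZIP code is present."""
    return {"geocode": "block-1234"} if address.get("zip") else None

def enhance_with_admin_records(address: dict) -> dict:
    """Placeholder enhancement: fill missing elements from a records lookup."""
    enriched = dict(address)
    enriched.setdefault("zip", "20746")  # invented correction
    return enriched

def process_non_id_response(address: dict) -> Optional[dict]:
    result = match_and_geocode(address)  # pass 1: real time, during response
    if result is None:                   # pass 2: after records enhancement
        result = match_and_geocode(enhance_with_admin_records(address))
    return result  # None here -> queue for manual (interactive) matching

print(process_non_id_response({"street": "123 Main St", "city": "Suitland"}))
```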
LUCA Focus Groups | March 2014 - June 2014 | LUCA1 | D.i
Identify changes that help increase participation and coverage, while decreasing program costs for the 2020 Census LUCA Program.
Recommendation 2: Reduce the complexity of the LUCA program
Recommendation 5: Provide the address list in more standard formats
Recommendation 6: Include an in-office verification of LUCA submitted addresses (Research Activity 3)
Recommendation 7: Utilize GSS-I tools and data to validate LUCA submissions (Research Activity 2)
Recommendation 8: Encourage governments at the lowest level to work with larger governments to consolidate their submissions
Recommendation 12: Continue the 2010 Census LUCA Program improvements that were successful (Research Activity 1):
> Expanded the review time for participants from 90 days to 120 days
> Provided more advance notice of the pending LUCA program
> Initiated comprehensive program communications with partners
> Provided participants with the opportunity to use the Census Bureau supplied MAF/TIGER Partnership Software (MTPS) application
> Invited states to participate in the program
Address Canvassing will validate LUCA submissions.
Validation of LUCA submissions will occur primarily during In-Office Address Canvassing, with minimal validation occurring early in the In-Field Address Canvassing operation.
participants
and allow partners to return their structure coordinates as part of their submission
LUCA materials
lists are easier to load into common software packages
governments to consolidate their submissions
with the Geographic Support System Initiative (GSS-I) and Address Canvassing
validation process
support automated exchange of information for LUCA participants
LUCA Focus Groups | March 2014 - June 2014 | LUCA2 | D.i
Identify ways to improve the quality of address updates for the 2020 Census LUCA Program.
Recommendation 1: Eliminate Option 1 (Title 13 Full Address List Review) and Option 2 (Title 13 Local Address List Submission) full address list submission
Recommendation 3: Include census structure coordinates in the census address list and allow participants to return their structure coordinates as part of their submission (Research Activity 2)
Recommendation 4: Provide ungeocoded United States Postal Service Delivery Sequence File addresses to State and County partners
Recommendation 7: Utilize GSS-I tools and data to validate LUCA submissions (Research Activity 2)
Recommendation 8: Encourage governments at the lowest level to work with larger governments to consolidate their submissions
Recommendation 9: Eliminate the Block Count Challenge (Research Activity 3)
Recommendation 10: Require unit designators for multi-unit structures (Research Activity 3)
Recommendation 11: Encourage LUCA participants to identify E-911 Addresses used for mailing, location, or both
Address Canvassing will validate LUCA submissions.
Validation of LUCA submissions will occur primarily during In-Office Address Canvassing, with minimal validation occurring early in the In-Field Address Canvassing operation.
participants
and allow partners to return their structure coordinates as part of their submission
LUCA materials
lists are easier to load into common software packages
governments to consolidate their submissions
2013 National Census Contact Test | NID1 | D.k
Improving Non-ID Processing: Test proposed enhancements to automated processing of census responses lacking a preassigned census identification number.
• Address enhancement influenced a significant number of address records, and thus warrants further research
• The more steps the Non-ID Processing team took to enhance the addresses, the higher the matching and geocoding rates
• Internal Revenue Service address data has great potential for use in automated processing, particularly for confirming the respondent has given us a good address, but also for use in supplementation
• The 2020 Non-ID Processing team needs to continue to refine the address enhancement process throughout the research and testing phase, utilizing data from field tests while also reprocessing 2010 Census data through the address enhancement process
telephone agent-assisted response.
2020 Non-ID workflow.
Recommendations:
• Continue to explore the possibility of acquiring authorization for using Internal Revenue Service address data for administrative records matching.
• Explore utilizing the respondent-provided version of the address over the administrative record version for subsequent processing when an address is confirmed by an administrative record address.
• Explore using DataFlux to update ZIP codes and street names only, if possible, since DataFlux introduces some undesirable effects on some addresses (e.g., moving rural route information to the city-style address fields). The administrative record database must be standardized the same way.
• Explore using the GEO Standardizer to standardize individual address fields; that is, do not use the whole address standardizer. However, the whole address standardizer does a good job of parsing city-style addresses that were entered in one field during data collection. Therefore, if we can identify those cases with city-style address information in one field at the onset, we could potentially use the whole standardizer exclusively on them, and the individual address component standardizer on the rest (see the sketch after this list).
• Identify noncity-style addresses prior to standardization and do not update rural route
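The standardizer recommendation above is essentially routing logic: detect whether city-style address information arrived in a single free-text field, apply the whole-address standardizer only there, and standardize the remaining records field by field so rural-route information stays where the respondent put it. A minimal Python sketch of that routing (the detection heuristic and the standardizer stand-ins are hypothetical, not the GEO or DataFlux tools themselves):

```python
# Hypothetical sketch of the recommended standardizer routing.
import re

# Crude city-style detector: a house number followed by a street name.
CITY_STYLE = re.compile(r"^\d+\s+\S+")

def standardize_whole(address_line):
    """Stand-in for a whole-address standardizer (parses one field)."""
    return address_line.upper()

def standardize_components(fields):
    """Stand-in for per-field component standardizers."""
    return {name: value.upper() for name, value in fields.items()}

def route(record):
    one_field = record.get("single_line")
    if one_field and CITY_STYLE.match(one_field):
        # City-style information captured in one field: whole standardizer.
        return standardize_whole(one_field)
    # Noncity-style (e.g., rural route) or already-fielded input:
    # standardize per field and do not rewrite rural-route information.
    return standardize_components(record.get("fields", {}))

print(route({"single_line": "123 Main St, Suitland, MD 20746"}))
print(route({"fields": {"rural_route": "RR 2 Box 15", "state": "md"}}))
```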
2014 Census Test | CD: 6/1/14 | NID2
• Non-ID Internet response
∙ No user ID provided in mail materials
∙ Test ability to process and match respondent-provided address information (not real-time)

2015 OSR | NID3 | D.k
Respondent Validation to authenticate people/addresses who self-responded with or without a Census ID.

2015 Census Test | CD: 4/1/15 | NRFU-??
Reduce Nonresponse Followup (NRFU) workload and increase NRFU productivity with Administrative Records, Field Reengineering, and Adaptive Design.
2015 Census Test | CD: 4/1/15 | NRFU-??
2013 Census Test | Dec-13
Evaluation Follow-Up:
∙ Obtain the most accurate status of the housing unit on Census Day
∙ Identify people associated with an occupied housing unit during the calendar year
An operational study of NRFU procedures.
• Lower Internet response for Non-ID Panel
∙ 40.6% for Non-ID panel vs. 46.3% for ID panel
• Lower overall response for Non-ID
∙ 58.9% for Non-ID panel vs. 61.4% for ID panel
• Response rates are impacted by the ability to match
∙ About 5% of Non-ID cases weren’t matched
∙ All unmatched cases treated as nonrespondents
• Address collection in the Internet instrument appears to be successful
∙ Higher match rates than 2010 and fewer incomplete records
• Address supplementation from administrative records isn’t necessary very often, but increases matching rates by 50% when used
telephone agent-assisted response.
2020 Non-ID workflow.
geocoding, post real-time processing that will utilize administrative records and third-party data, and manual (interactive) matching and geocoding.
2013 Census Test | Dec-13 | NRFU1
Use administrative records to “enumerate” some housing units.

2013 Census Test | Dec-13 | NRFU2
Try an adaptive design approach for cases not enumerated with records; compare with a fixed enumeration approach.

2013 Census Test | Dec-13 | NRFU3
Examine two telephone methods.
• Interviewers were approximately 20% less efficient when workload was reduced with records
• Cases remaining after workload is reduced are more difficult
• But interviewers spent approximately 22% fewer hours
• Overall interviewer cost is reduced
• Interviewers were 22% more efficient in the adaptive design treatments
• This pattern holds whether workload was reduced with records or not
• Interviewers in the adaptive groups averaged approximately four more contacts per interviewer/day
• CATI implementation before CAPI led to a 12-14% decrease in productivity
• Productivity combines CATI and CAPI hours:
Productivity = (CATI hours + CAPI hours) / Number of cases
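Written out, the combined productivity measure defined above is (the numeric instance is invented, purely to show the units):

```latex
\text{Productivity} \;=\; \frac{\text{CATI hours} + \text{CAPI hours}}{\text{Number of cases}},
\qquad \text{e.g.}\quad \frac{300 + 700}{2000} = 0.5 \ \text{hours per case}.
```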
enumerate nonresponding housing units, as appropriate
for:
o Recruiting, onboarding, and training
o Time and attendance and payroll
o Case load management
o Data collection
o Cost and progress monitoring
Questionnaire Assistance) will not be part of the initial NRFU
contact strategy
2014 SIMEX Test | Nov-14 | NRFU4 | B.j
Objective #1: Identify optimal staff-to-supervisor ratios/mixes for ENUM/LSO and LSO/FMO.
• The Census SIMEX demonstrated that LSOs and FMOs were able to perform tasks within the new staff-to-management structure at both high and low staff-to-management ratios; however, significant negative impacts to workload, situational awareness, response times, and response accuracy were identified when higher staff-to-supervisor ratios were utilized. Within the context of this SIMEX, and because we only tested two staff-to-supervisor ratios (in conjunction with several other variables), we were not able to determine an optimal or absolute value for the staff-to-supervisor ratio.
management and staffing structure
2014 SIMEX Test | Nov-14 | NRFU5 | B.j
Objective #2: Assess the necessity and sufficiency of the automated operational control system (MOJO) as it pertains to the FMO/LSO management of staff, response data, and payroll data in an operational setting.
• The Census SIMEX demonstrated that the MOJO Operational Control System (OCS) is sufficient in assisting both LSOs and FMOs in an operational setting. Further, data from the SIMEX suggest that MOJO is essential in supporting the new field staff structure and procedures described in the revised NRFU CONOPS. LSO and FMO participants found MOJO to be acceptable for NRFU operations, and all participants were able to take advantage of different capabilities and functions provided by MOJO and utilize them to successfully support their roles in an operational setting.
2014 SIMEX Test | Nov-14 | NRFU6 | B.j
Objective #3: Evaluate and evolve the FMO and LSO responsibilities and duties as defined in the revised NRFU CONOPS.
• Overall, data from the Census SIMEX suggested that roles and responsibilities of the FMOs and LSOs were well defined in the revised NRFU CONOPS. Participants mostly understood the duties associated with their role and the tasks they were engaged in, although there was, on occasion, some confusion about what steps to take or which procedures to follow for certain complex events.
management and staffing structure
2014 SIMEX Test | Nov-14 | NRFU7 | B.d, B.e, B.j, B.l
Objective #4: Evaluate the effectiveness of the training materials for the FMO and LSO roles.
• The Census SIMEX revealed that the initial set of training materials provided to SIMEX participants in preparation for their roles as FMOs and LSOs appeared to be generally effective. Performance data (response accuracy) suggested that there may have been some knowledge gaps among LSOs with regard to certain procedures. Further analysis of the source of incorrect completions would be needed to confirm that additional training on these procedures in particular is required. Feedback from participants overall suggested that they would like training to be longer, more in-depth, and inclusive of all roles across NRFU operations. Although the training was generally effective for the purposes of the SIMEX, both objective and subjective data suggest that continued development of the training program, as well as the addition of modules covering specific focal areas such as the Census Operations Mobile Platform for Adaptive Service Solutions (COMPASS) application, detailed MOJO components, and CONOPS procedures for off-nominal events, may be needed.
2015 Census Test | CD: 4/1/15 | NRFU-8
Test the feasibility of fully utilizing a field operations management system that leverages planned automation and available real-time data, as well as data households have already provided to the government, to transform the efficiency and effectiveness of data collection operations.
Public Opinion Polling | February 2012 and ongoing
Track public opinion toward the Federal Statistical System.
> Data users are more likely to report trusting statistics than non-data users.
> Reported belief in the credibility of statistics predicts reported trust in federal statistics.
> Reported trust in statistics remained relatively stable over the two-year data collection.
> The government shutdown caused by a deadlocked Congress coincided with the largest dip in reported trust in statistics (however, this may have been confounded by distrust of the rollout of the Affordable Care Act).
SPC1
for:
o Recruiting, onboarding, and training
o Time and attendance and payroll
o Case load management
o Data collection
o Cost and progress monitoring
Public Opinion Polling | February 2012 and ongoing | SPC2 | F.b
Track opinions toward use of Administrative Records.
> Questions regarding administrative record use have shown that when framed to indicate that the use of records can save the government money or provide a social good, respondents are more likely to favor using administrative records. Findings also seem to indicate that respondents prefer the use of government records to “public” or other third-party records.
identify vacant units
enumerate nonresponding housing units, as appropriate
From Focus Groups:
> Participants became more comfortable with the idea of administrative records use when more information about how they will be used and from where the data will be obtained is provided. However, participants were concerned with the accuracy and timeliness of administrative records, leading some to conclude that it might be best if the Census Bureau avoided using them.
Overall Assessments (from Privacy/Security):
> Framing of communication surrounding the use of administrative records is of utmost importance. Respondents want to know what is being done and why.
> People who use Federal data and know something about it are more likely to trust it, trust us, and favor the use of administrative records for statistical purposes.
Public Opinion Polling | February 2012 and ongoing | SPC3 | F.b
LUCA Focus Groups | March 2014 - June 2014
LUCA Focus Groups | March 2014 - June 2014
Other topics like BYOD, contact methods, and response methods.
> Respondents reported preferring to be contacted by email or text messaging over being called on a cell phone. Respondents reported preferring email to traditional contacts of calling the home phone or in-person interviewing. However, respondents report preferring reporting via paper for the decennial Census compared to receiving an email with the link.
> If respondents objected to being emailed or texted, the most common reason was that they don’t use that communication.
From Focus Groups:
> Focus group participants disliked the idea of receiving text messages from the Census Bureau. Some participants were open to receiving an email, but were unsure how they could verify its legitimacy. They were worried scammers might take advantage of an opportunity to impersonate the Census Bureau and that their information could fall into the wrong hands.
> Most people whose households responded during nonresponse follow-up (NRFU) strongly disliked the concept of Bring Your Own Device (BYOD), citing security, privacy, and confidentiality concerns. However, these participants were unable to identify how they would know if a device were personal or government-issued.
> Answering the census online was not an issue for most participants, but some worried about the security of the systems and mentioned events such as Edward Snowden’s leak of National Security Agency (NSA) information and the health insurance website issues following the Affordable Care Act as cautionary tales.
> Most participants in the groups had experienced Internet-targeted advertising and indicated they found it annoying. While there was some initial confusion about how this concept related to the Census Bureau, most people were able to reason through how it might be useful. The use of a celebrity or athlete created a mostly negative response, with participants saying it is risky and that they would
Research Activity 1 - Researched reports and documents associated with the 2010 Census LUCA and 2010 New Construction (NC) programs
Research Activity 2 - Researched the impact of the Geographic Support System Initiative (GSS-I) on LUCA
LUCA Focus Groups | March 2014 - June 2014
Research Activity 3 - Researched the impact of the Reengineering Address Canvassing on LUCA address validation

LUCA Focus Groups | March 2014 - June 2014
Research Activity 4 - Conducted Focus Groups to obtain feedback from partners on potential 2020 Census LUCA models
Address Validation Test: Part 2 - Partial Block Canvassing
PBC Key Take-Aways:
• When PBC did not find an address that was located by MMVT, reasons for the omission tended to be:
∙ The area was provided to the PBC lister, but the instruction was poorly worded or the polygon was poorly defined, leading to lister confusion.
∙ The add represented a situation not detectable by the imagery review step (i.e., changes within existing structures), and therefore was not provided as a work area to the PBC lister.
∙ The add represented a situation that was not detected due to imagery quality and/or vintage issues, and therefore was not provided as a work area to the PBC lister.
• Polygons and instructions prepared for PBC listers generally resulted in successful navigation to and within work areas, and facilitated accurate data collection, but improvements need to be made:
∙ Ensure instructions match the polygon;
∙ Include imagery on the LiMA to aid in understanding the polygon and instruction;
∙ Use basic street address information within an instruction; and
∙ Missing and misaligned street features and misaligned block boundaries should be fixed in the office before any block goes to the field.
• Based on the results of the PBC Test, we recommend:
∙ Testing PBC in the 2016 Address Canvassing Test with traditional listers.
∙ Overlapping with full-block canvassing for a sample of blocks to compare results.
∙ Improving clarity of written instructions as well as training to minimize lister confusion in the field.
∙ Conducting additional analysis at the individual address level to fully understand differences between MMVT and PBC listing results and imagery review results.