2017 Puerto Rico Census Test (PRCT)
Goals, Objectives, Success Criteria (GOSC) and Research Questions
May 4, 2016 – Baseline Version 1.0
Contents

List of the Operations Participating in the Test and Program Managers
2017 Puerto Rico Census Test – Diagram
2017 Puerto Rico Test Timeline
Key Information
    Test Focus
    Census Day
    Scope and Limitations
    Operations
    Workloads
Overall Test Assumptions
Site Selection Decision
Test Focus Operations – Goals, Objectives, Success Criteria, Research Questions
Support Operations – with Goals, Objectives, Success Criteria and Research Questions
Support Operations Only – Summary of How the Operation will Support the Test
List of the Operations Participating in the Test and Program Managers

Operation/Support – Program Manager

Address Canvassing* – Evan Moffett
Geographic Programs* – Evan Moffett
Non-ID Processing – Evan Moffett
Update Enumerate* – Evan Moffett
Content and Forms Design – Will Caldwell (Acting)
Language Services – Will Caldwell (Acting)
Puerto Rico Enumeration* – Will Caldwell (Acting)
Internet Self-Response* – Will Caldwell (Acting)
Integrated Partnerships and Communications – Tasha Boone
Decennial Translation Office – Jason Kopp
Nonresponse Followup* – Maryann Chapin
Decennial Service Center – Raphael Corrado
Response Processing – Raphael Corrado
Forms Printing and Distribution – Alexa Jones-Puthoff
Paper Data Capture – Alexa Jones-Puthoff
Field Infrastructure – Alexa Jones-Puthoff
Decennial Logistics Management – Alexa Jones-Puthoff
Census Questionnaire Assistance – Alexa Jones-Puthoff
Systems Engineering and Integration – Pete Boudriault
Security, Privacy and Confidentiality – Pete Boudriault
IT Infrastructure – Pete Boudriault
2017 Puerto Rico Census Test – Deirdre Bishop
2017 Puerto Rico Census Test – Diagram

[The original document presents a one-page diagram of the test operations here; it is not reproducible in this text version.]
2017 Puerto Rico Test Timeline

2017 Puerto Rico Test – Timeline (05/04/16)

3/31/2016 – Finalize GOSC & Test Plan
4/15/2016 – Site Selection announcement
5/31/2016 – Pre-submission to OMB
9/1/2016 – Open NY RCC
9/1/2016–10/15/2016 – In-Office Address Canvassing (ADCAN)
9/2/2016 – Full package to OMB
11/7/2016 – OMB approval
1/1/2017 – Open PR ACO and begin recruiting FLD staff
1/17/2017 – Begin In-Field ADCAN training
2/21/2017 – Begin In-Field ADCAN
2/23/2017 – Begin DSC support
3/6/2017 – Begin UE training
3/15/2017 – Begin CQA
3/22/2017 – Begin Self-Response
4/1/2017 – Census Day
4/3/2017 – Begin In-Field ADCAN re-listing
4/7/2017 – Begin NRFU training
4/17/2017 – Begin UE
5/12/2017 – Begin NRFU
Key Information
Test Focus
Test the Address Canvassing operation in Puerto Rico (to begin on February 21, 2017, and end on March 31, 2017).
Integrate the Self-Response, Update Enumerate (UE), and Nonresponse Followup operations in Puerto Rico, including components of the reengineered quality control objectives for Nonresponse Followup that were tested in the 2016 Census Test.
Test Spanish versions and the application of Puerto Rico address standards in all modes.
Census Day
April 1, 2017
Scope and Limitations
The Puerto Rico Census Test (2017 PRCT) results will be based on housing units selected from a particular local area, and cannot be generalized to the entire nation. The results do not
predict national trends or rate estimates expected in the 2020 Census. This test, however, will provide valuable indicators for contact and enumeration strategies in Puerto Rico as we
plan the 2020 Census for Puerto Rico.
Operations

Test Focus Operations – with Goals, Objectives, Success Criteria and Research Questions:
Address Canvassing
Internet Self-Response
Update Enumerate
Nonresponse Followup

Support Operations – with Goals, Objectives, Success Criteria and Research Questions:
Systems Engineering and Integration
Field Infrastructure
Non-ID Processing

Support Operations Only – Summary of how the operation will support the test:
Geographic Programs
Forms Printing & Distribution
Paper Data Capture
Integrated Partnerships & Communications
Census Questionnaire Assistance
Response Processing
Content and Forms Design
Language Services
Decennial Service Center
IT Infrastructure
Decennial Logistics Management
Program Management

Workloads

Housing unit count: approximately 123,000
Estimated Self-Response workload: 95,000 housing units
Estimated UE workload: 28,000 housing units
Estimated NRFU workload: ?
(The estimated Self-Response and UE workloads together account for the approximately 123,000 housing units; the NRFU workload depends on the self-response rate and therefore has not yet been estimated.)
Overall Test Assumptions

Universe:
NOTE: The following is a modified list of the assumptions provided in the Address Canvassing Test GOSC.
1. The 2017 PRCT will occur in one site within the San Juan metro area of Puerto Rico (Carolina Municipio, Loíza Municipio, and Trujillo Alto Municipio).
2. The 2017 PRCT site will contain a variety of address styles, such as city-style addresses, non-city-style addresses, and location descriptions.
3. The 2017 PRCT site will contain a geographic area that has a high concentration of city-style addresses comparable to the San Juan Municipio used in the 2015 National Content Test.
4. The 2017 PRCT site will include a municipio that borders the coast.
5. The 2017 PRCT site will not contain military areas.
6. The 2017 PRCT will use the Basic Collection Unit (BCU) as the unit of geography to organize and manage work assignments.
7. The 2017 PRCT address file will be refreshed with the latest version of the Delivery Sequence File.
In-Office Address Canvassing Assumptions:
1. In-Office Address Canvassing will review all BCUs in the Puerto Rico Test, regardless of the Type of Enumeration Area.
In-Field Address Canvassing Assumptions:
NOTE: The following list is a copy of the assumptions provided in the Address Canvassing Test GOSC.
1. In-Field Address Canvassing will be conducted using Corporate Listing and Mapping Solutions (LiMA), Mobile Case Management, MOJO and UTS (i.e., CEDCaP systems).
2. In-Field Address Canvassing data collection will be conducted using a smartphone device.
3. In-Field Address Canvassing will conduct a second canvass of selected BCUs to perform a rudimentary quality estimation on those BCUs. If a large discrepancy rate is found
between the two listings on the same BCU, it may imply one of the two listings was of poor quality, or the block was difficult to list. This information may be of use when
resolving discrepancies between In-Office Address Canvassing and In-Field Address Canvassing.
4. In-Field Address Canvassing will not collect feature updates.
5. In-Field Address Canvassing results will update the MAF/TIGER database.
Other Address Canvassing Assumptions:
NOTE: The following list is a copy of the assumptions provided in the Address Canvassing Test GOSC.
1. In-Office Address Canvassing and In-Field Address Canvassing will inform the same management reporting system.
2. The Address Canvassing Integrated Product Team will review lessons learned from previous census tests and use them to guide the planning for the test if appropriate.
Internet Self-Response:
1. Concepts from the Self-Response Contact Strategy will be combined to create the 2017 Puerto Rico Census Test Update Enumerate contact strategy.
2. We will mail to addresses in the three municipios that are deemed sufficient to mail out.
3. We will deploy both an Internet Push and an Internet Choice strategy in the areas designated as "Self-Response."
4. The mailing package will provide both the Test Census URL and the phone number for Census Questionnaire Assistance (CQA).
5. The UE mailing package will include a paper questionnaire.
6. The Notice of Visit form will provide both the Test Census URL and the phone number for CQA.
Update Enumerate:
1. Concepts from the 2010 Census Update/Leave and 2010 Census Update Enumerate operations will be combined to create a new operation for the 2020 Census.
2. There will be no In-Field Address Canvassing for Update Enumerate areas.
3. We will mail to all addresses deemed sufficient to mail out.
4. The input (i.e., frame) will have gone through Coding Accuracy Support System (CASS) certification to determine which addresses are mailable/deliverable.
5. The mailing package will provide both the Test Census URL and the phone number for Census Questionnaire Assistance (CQA).
6. The Notice of Visit form will provide both the Test Census URL and the phone number for CQA.
7. UE will use Nonresponse Followup (NRFU) business rules for contact strategies.
8. UE will NOT use administrative records and third-party data to remove vacant or occupied units.
9. UE will implement a reinterview process.
10. In-Office Address Canvassing will occur prior to the UE operation.
11. The listing and enumeration data collection applications will both collect Global Positioning System (GPS) coordinates and metadata to enable Geography Division (GEO) post-processing.
12. UE will use training, procedures, and an application similar to those used by NRFU to complete the "E" in UE.
13. UE will use training, procedures, and an application similar to those used by Address Canvassing for the "U" in UE.
14. In UL areas, we will link a questionnaire "code" to the address in the LiMA. The goal is to establish an ID'd response.
15. In UE areas, we will link the Notice of Visit "code" to the address in the LiMA. The goal is to establish an ID'd response (see the sketch following this list).
16. If we encounter Group Quarters (GQs) or Transitory Locations (TLs), they will be classified as such; no attempt will be made to enumerate these unique situations in this test.
17. Use of the known dangerous address database is out of scope for this test.
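Assumptions 14 and 15 describe pre-linking the code printed on a questionnaire or Notice of Visit to an address record so that a later self-response arrives already tied to a known address (an ID'd response). The following minimal Python sketch illustrates that linkage only; the names (AddressRecord, link_code, resolve_code) and sample data are hypothetical, and LiMA's actual data model is not described in this document.

    # Hypothetical sketch of assumptions 14-15: linking a questionnaire or
    # Notice of Visit "code" to an address record so that a later
    # self-response is an ID'd response. Names are illustrative only.
    from dataclasses import dataclass
    from typing import Dict, Optional

    @dataclass(frozen=True)
    class AddressRecord:
        unit_id: str    # processing ID for the housing unit
        address: str

    # Code-to-address index, populated as listers work their assignment areas.
    code_index: Dict[str, AddressRecord] = {}

    def link_code(code: str, record: AddressRecord) -> None:
        """Tie the code printed on the form left at a unit to that unit's address."""
        code_index[code] = record

    def resolve_code(code: str) -> Optional[AddressRecord]:
        """At self-response intake, a known code yields an ID'd response."""
        return code_index.get(code)

    # Usage: a lister links a Notice of Visit; the household later responds online.
    link_code("NOV-48213", AddressRecord("HU-0001", "123 Calle Sol, Carolina PR"))
    print(resolve_code("NOV-48213"))   # resolved -> treated as an ID'd response
    print(resolve_code("NOV-99999"))   # None -> handled via Non-ID processing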
Nonresponse Followup:
1. We will be using the contact strategies from the 2016 Census Test, with the following changes:
2. We will not be using administrative records in the Puerto Rico test.
3. There will be the ability to add new Non-ID cases in the field.

Site Selection Decision

Carolina Municipio, Loíza Municipio, and Trujillo Alto Municipio
Test Focus Operations – Goals, Objectives, Success Criteria, Research Questions

(Success criteria should be quantifiable. For each research question, identify whether it is an implementation or a research question.)

Address Canvassing (Evan Moffett and Karen Owens)

Goal: Measure the effectiveness of In-Office Address Canvassing.

Objectives:
• Implement In-Office Address Canvassing processes, including:
o Interactive Review
o Active Block Resolution (ABR)
o MAF Update
o Identification of the In-Field Address Canvassing workload
• Update the MAF.
• Collect production metrics.
• Make use of resources provided by local governments and commercial third-party data.
• Collect data to directly compare In-Office to In-Field Address Canvassing results.
• Collect data to determine if the identification of the In-Field Address Canvassing workload was accurate.

Research Questions:
• How accurate are the results from In-Office Address Canvassing relative to In-Field Address Canvassing?
o Did In-Office Address Canvassing miss housing units that In-Field Address Canvassing identified? What types of units were missed (e.g., multi-units, trailers)?
o Did In-Office Address Canvassing identify housing units that In-Field Address Canvassing missed? What types of units?
o Did In-Field Address Canvassing take actions (adds, deletes, changes, moves) that appear to be consistent with other sources of data (e.g., DSF status, GSS-I local file update status)?
• Can we identify the kinds of housing situations in which In-Office Address Canvassing performs as or more effectively than In-Field Address Canvassing, and vice versa?
• Did In-Office Address Canvassing accurately identify the BCUs that required In-Field Address Canvassing work?

Goal: Assess management of Address Canvassing using the NRFU approach to alerts.

Objective: Refine and expand alerting capabilities related to In-Field Address Canvassing activities.

Success Criteria: Newly developed alerts are deemed effective. Refinements to existing alerts are deemed potentially effective.

Research Question: What is the most effective set of alerts for field supervisors to stay aware of potential issues in the Address Canvassing operation?
Update Enumerate (Evan Moffett and Shawn Hanks)

Goal: Integration of listing and enumeration operations and systems.

Objectives:
• Design a data collection operation that leverages capabilities designed to support Address Canvassing and NRFU.
• Integrate multiple information technology applications (LiMA, MCM, MOJO, MOCS, field data collection instrument) to create one seamless operational data collection, control, and management system.
• Deploy a system that can remove cases from the enumeration universe if they have responded using an ID during Self-Response.
• Deploy a system that can remove cases from the enumeration universe if they have responded through Non-ID and were matched to a UE case.
• Collect operational paradata to inform operational improvements for the 2018 End-to-End Test.

Success Criteria:
• Field work was not affected by system hand-offs.
• Quality of the collected data met or exceeded that of the 2010 data.
• Cases were routed appropriately and the correct collection instrument was used.
• The appropriate data were passed from system to system in order to implement both the listing and enumeration functions of UE.
• Households that self-respond were removed from the enumeration universe.
• Data are summarized to inform field cost factors for future budget estimation.

Research Questions:
• Are data able to flow efficiently through the systems?
• Were Living Quarters (LQs) enumerated using the correct instrument?
• Are data able to be passed between all systems in support of a successful UE operation?
• Were cases successfully removed from the enumeration universe if they responded via Self-Response and Non-ID?
• How long did an assignment take to complete?
• How long did interviews take to complete?
• How much did the operation cost?
Goal: Building on previous test experiences specific to NRFU, evaluate the impacts on cost and quality of the UE contact strategy on enumerator productivity and efficiency.

Objectives:
• Collect data associated with enumerator productivity, efficiency, and case outcomes to enable an assessment of the UE contact attempt strategy.
• Collect data on case outcomes to evaluate the unresolved rate (cases that reach maximum contact attempts without a successful respondent- or proxy-provided enumeration) using the UE contact attempt strategy.

Success Criteria: The 2017 Census Test data analysis of enumerator efficiency and productivity parameters – when applied to the 2020 Census cost model – enables an understanding of the projected UE costs against cost avoidance targets.

Research Question: What effects do enhancements to the enumeration application and the UE contact attempt strategy have on enumerator efficiency and productivity?

Goal: Test continued refinements to the field data collection instrument for enumeration.

Objective: Test modifications and improvements to the field data collection instrument based on results of the 2016 Census Test, etc.

Success Criteria: Modifications to the data collection instrument allow effective and efficient field data collection for UE enumerators.

Research Question: Are the proposed changes to the data collection instrument viable in a field enumeration environment?

Goal: Allow the collection of data from the "other" address of in-movers and whole household usual home elsewhere cases.

Objective: Allow the collection of data from addresses that are not present in an enumerator's case list.

Success Criteria: UE field staff can effectively collect data from in-mover and whole household usual home elsewhere cases, in addition to the collection of status and enumeration of the original case address.

Research Question: Can the collection of this supplemental data be effectively built into the pathing of our data collection instrument?
Goal: Field test continued refinements to field operational procedures.

Objectives:
• Test modifications and improvements to our processes for streamlining contact procedures for multi-units and gated communities.
• Develop and implement standard field procedures for handling dangerous addresses.

Success Criteria:
• Field staff can successfully and effectively contact multi-unit and gated communities while minimizing respondent burden.
• A mechanism will exist for field staff to handle dangerous addresses.

Research Questions:
• How can we streamline our field contact procedures for multi-units and gated communities?
• Can we operationalize the enumeration process for potentially dangerous addresses?

Goal: Field test continued enhancements to our field staffing ratios.

Objective: Further investigate the staffing ratios for LSO and FMO to enumerators, etc.

Success Criteria: Staffing ratios of enumerators to supervisors are validated as feasible during field operations.

Research Question: What is the optimal staffing ratio of field enumerators to supervisors for the UE operation?

Goal: Field test refinements to alerts from the operational control system.

Objective: Further refine and expand alerting capabilities.

Success Criteria: Newly developed alerts are deemed effective. Refinements to existing alerts are deemed potentially effective.

Research Question: What is the most effective set of alerts for field supervisors to stay aware of potential issues in the UE field operation?
Internet Self-Response (Jason Reese)

Goal: Deploy two panels in the Self-Response areas: Internet Push and Internet Choice.

Objective: Continue the testing for Puerto Rico that began in the 2015 National Content Test (NCT) to determine if Internet Push is viable for 2020.

Success Criteria: We want to reduce the difference found in the 2015 NCT.

Research Question: What is the difference in response rates between the two contact strategies in a test with awareness and through a more saturated mailout?

Goal: Deploy the U/E contact strategy in Puerto Rico.

Objective: Conduct the testing of the U/E contact strategy in Puerto Rico and address the different use of Post Office or community mail drops (prevalent in PR).

Research Question: What is the self-response rate for U/E areas?
Goal: Develop the URL, landing page, and path navigation for Primus that is inclusive for Puerto Rico self-respondents.

Objective: Introducing the Internet in Puerto Rico requires consideration of search and use of the URL as well as navigation into a path for address data collection (for Non-ID and coverage questions). We will continue to adapt Primus (different from the NCT Centurion application).

Success Criteria: Reduce confusion and calls to TQA.

Research Question: Qualitative feedback on our implementation.

Nonresponse Followup (Maryann Chapin and Josh Latimore)

Reengineered Field Operations – Maryann Chapin

Goals: Continue refining reengineered field operations. Improve efficiency and effectiveness of staff and workload management.

Objective: Incorporate streamlined contact procedures for multi-units.

Success Criteria: We successfully contact multi-unit addresses and minimize the respondent burden. Procedures of contacting building managers first minimize respondent burden.

Research Question: How can we streamline the contact procedures for multi-units in order to incorporate the optimization strategy? (Technical Implementation)

Objective: Incorporate procedures and questionnaire enhancements for situations other than a face-to-face contact with a household member or proxy.

Success Criteria: COMPASS question paths guide the user through special situations with minimal training required. No more than one day of in-class training.

Research Question: What procedures and questionnaire enhancements need to be in place for situations other than a face-to-face contact with a household member or proxy? These include situations such as apartment labeling problems, not-a-housing-unit situations, refusal situations, and in-movers and out-movers. (Technical Implementation)
Objective: Incorporate lessons learned from the 2015 and 2016 Census Test results, including such things as:
a. Addition of validation rules on certain fields, such as e-mail address, ZIP code, etc.
b. Updating help screens and descriptions.
c. Changing the way certain behind-the-scenes variables are set in the course of the interview, like UNIT STAT, the status assigned to a housing unit.

Success Criteria: Lessons learned from the 2015 and 2016 Census Tests are incorporated.

Research Question: Can lessons learned from the 2015 and 2016 Census Tests be incorporated? (Technical Implementation)

Objective: Incorporate procedures and questionnaire enhancements for proxy visits.

Success Criteria: Proxy procedures increase completion rates and reduce multiple visits to the same proxy.

Research Question: What procedures and questionnaire enhancements need to be in place for more efficient proxy interviews and attempts? (Technical Implementation)

Objective: Incorporate further reporting, roll-ups to higher levels (e.g., AOSC, RCC, HQ), and dashboard capabilities into MOJO.

Success Criteria: Reporting and dashboard capabilities are incorporated into MOJO.

Research Question: Can reporting and dashboard capabilities be incorporated into MOJO?

Objective: Further refine and expand alerting capabilities (production and QC).

Success Criteria: Alerting capabilities are expanded and refined.

Research Question: Can alerting capabilities be expanded and refined?
Goal: Validate assumptions associated with key NRFU cost parameters.

Objective: Capture additional data points that enable an understanding of NRFU cost parameters such as: late self-response rates (prior to and after the start of NRFU), NRFU workload completion rates by contact attempt, enumerator efficiency, staffing ratios, etc.

Success Criteria: Data resulting from the Puerto Rico Census Test produce data points for key NRFU cost parameters.

Goal: Building on 2015 Census Test experiences, evaluate the impacts on cost and quality of a NRFU contact strategy that allows for a maximum of six contact attempts (proxy eligible on the third attempt) in combination with added enumeration application (COMPASS) capabilities and enhancements (such as handling noninterviews and proxies) that could impact enumerator productivity and efficiency.

Objectives:
• Collect data associated with enumerator productivity, efficiency, and case outcomes to enable an assessment of a six contact attempt strategy in combination with enhanced enumeration application capabilities.
• Collect data on case outcomes to evaluate the unresolved rate (cases that reach maximum contact attempts without a successful respondent- or proxy-provided enumeration) using a six contact attempt strategy.

Success Criteria:
• The data analysis of enumerator efficiency and productivity parameters – when applied to the 2020 Census cost model – enables an understanding of the projected NRFU costs against cost avoidance targets.
• A reduction in the unresolved rate (compared, in general, to the 2015 Census Test unresolved rate, where the number of contact attempts varied by block group).

Research Questions:
• What impact does an across-the-board maximum of six contact attempts have on reducing the unresolved rate?
• What effects do enhancements to the enumeration application and an across-the-board maximum of six contact attempts (proxy eligible on the third attempt) have on enumerator efficiency and productivity?

UE and NRFU

Goal: Allow the collection of data from the "other" address of in-movers and whole household usual home elsewhere cases.

Objective: Allow the collection of data from addresses that are not present in an enumerator's case list.

Success Criteria: UE field staff can effectively collect data from in-mover and whole household usual home elsewhere cases, in addition to the collection of status and enumeration of the original case address.

Research Question: Can the collection of this supplemental data be effectively built into the pathing of our data collection instrument?
Reengineered Quality Control – RJ Marquette

Goal: Test running separate operations in SMaRCS.

Objective: SMaRCS can handle multiple field operations simultaneously without interference.

Success Criteria: The test is completed with no corruption/confusion of data from NRFU and UE within SMaRCS.

Research Question: Implementation question – Can SMaRCS safely handle multiple operations simultaneously?
Support Operations – with Goals, Objectives, Success Criteria and Research Questions

Systems Engineering and Integration (Pete Boudriault)

Goal: Gain experience moving to new IT infrastructure – cloud computing.

Objectives:
• The internet self-response, the real-time Non-ID processing, and the online recruiting, application, and self-assessment systems will be hosted in a commercial cloud.
• Census will obtain metrics regarding cost versus performance as a result of the testing to inform decisions regarding future cloud implementation, our ability to scale nationally for 2020, etc.

Success Criteria:
• The cloud implementation performs as well as the 2015 DMZ-based solution, and scales as needed to meet demand.
• Sufficient data are collected to inform subsequent cloud solution implementation planning (e.g., design, cost estimation, etc.).

Research Questions:
• Can the internet self-response and real-time Non-ID processing systems be supported in a cloud environment?
• Does the cloud computing platform meet Census Bureau and Federal IT security regulations?
• Does a cloud-based implementation facilitate scalability in a cost-effective way?

Goal: Gain experience moving to new IT infrastructure – services.

Objective: Implement fingerprinting-as-a-service.

Success Criteria: Fingerprinting-as-a-service meets all functional and non-functional requirements.

Research Questions:
• Is fingerprinting-as-a-service a cost-effective solution?
• Can fingerprinting-as-a-service integrate effectively with Census operations and systems?

Goal: Ensure the integration of new systems into the Field Test that will support the 2020 Census.

Objective: Integrate the following new systems into the Test:
• CAES
• eCorrespondence
• Enumeration (depending on results of the CEDCaP Analysis of Alternatives)
• CQA
• Pearsis
• CEDSCI?

Success Criteria: Each new system meets all functional and non-functional requirements in support of the Test.
Field Infrastructure (Alexa Jones-Puthoff and Shawn Ray)

The Field Infrastructure operation performs the following functions:
• Coordinate space acquisition for, and lease management of, the RCC and Area Census Offices.
• Provide the administrative infrastructure for data collection covering the 50 states, the District of Columbia, and Puerto Rico, including:
o Recruiting.
o Hiring and onboarding.
o Personnel and payroll administration.
o Training.
o Partnership support.
o Management and supervision.
o Clerical support.
o Materials supply.
o Printing and plotting.

Goal: Identify applicants from Puerto Rico and provide the job application and assessment in Spanish.
Objective: Test the capability to identify applications as being from Puerto Rico and provide these applicants with a Spanish version of the job application and skills assessment.
Success Criteria: Applicants in Puerto Rico are accurately identified and provided the Spanish version of the job application and assessment.

Goal: Provide an alternative (to stateside) flow for applicants who choose to apply using the Spanish job application and assessment.
Objective: Ensure applicants for positions that require proficiency in both Spanish and English are provided with the option to complete the English Proficiency Test.
Success Criteria: Applicants in Puerto Rico for positions that require applicants to be bilingual (English and Spanish) are accurately identified and provided the English Proficiency Test.

Goal: Test the use of an online application to replace the paper job application for decennial Area Operations Support Center positions, including the Recruiting Assistant, Clerk, Office Operations Supervisor, Partnership Assistant, Census Field Supervisor (formerly Local Supervisor of Operations), and Enumerator.
Objective: Determine the extent to which applicants are successful or need support in completing the online job application process, including creating accounts using the identity management solution.
Success Criteria: Applicants successfully create user accounts for CARAT. Applicants successfully complete the online job application in English.

Goal: Test the use of an online skills assessment to replace the use of a proctored test for decennial Area Operations Support Center positions, including the Recruiting Assistant, Clerk, Office Operations Supervisor, Partnership Assistant, Census Field Supervisor (formerly Local Supervisor of Operations), and Enumerator.
Objective: Determine the impact of the online job application and online skills assessment process on the jobs of the recruiting staff, especially the Recruiting Assistants.
Success Criteria: Applicants successfully complete the online skills assessment. Information is gathered on the impact of the online job application and assessment on the jobs of recruiting staff, especially the Recruiting Assistant.

Goal: Continue to validate the content of the skills assessments that the Office of Personnel Management has created to replace the paper employment test that has been used for several censuses and census tests this decade.
Objective: Utilize the census test to field test and continue to validate the new skills assessments.
Success Criteria: Information and data are gathered to assist OPM in further validating the skills assessments.

Goal: Provide support for job applicants needing assistance completing the job application and skills assessment.
Objective: Utilize the test environment to learn more about supporting job applicants in completing the online job application and skills assessment.
Success Criteria: Applicants are successfully supported in completing the online job application process. Data, information, and lessons learned are gathered about the need to support job applicants in completing the application process online.

Goal: Test the use of the identity management process (the process by which applicants will create an account to begin the application process).
Objective: Utilize the census test to field test the use of identity management for job applicants.

Goal: Test interfaces between the online job application and skills assessment components of CARAT and other systems, including DAPPS and the identity management system.
Objective: Utilize the census test to field test the interfaces between CARAT components, DAPPS, and the identity management process.
Success Criteria: Interfaces between CARAT and other systems are tested and lessons learned are captured.

Goal: Determine what additional functionality should be added to the job application and assessment process to make it more successful for use in the 2020 Census.
Objective: Utilize the census test environment to determine priorities for adding additional functionality to the online job application and assessment process.
Success Criteria: Information and lessons learned are gathered to inform the need to add additional functionality to support the online job application and assessment processes.

Goal: Eliminate paper as a mode of the job application process to the fullest extent possible.
Objective: Utilize the census test to help determine the level of need for a paper job application process.
Success Criteria: Information is gathered on the extent to which the paper job application process was utilized.

Goal: Provide a Spanish version of the online job application and assessment for stateside applicants.
Objective: Test the workflow for applicants who choose to apply using the Spanish job application and assessment (including testing the use of the English Proficiency Test).
Success Criteria: Applicants who choose to apply using the Spanish skills assessment are successfully routed to take the English Proficiency Test.

Goal: Validate the effectiveness of using Third Party Vendors for onboarding applicants.
Objectives:
• Third Party Vendor will interface with the Census badging system to provide digital photos of applicants.
• Third Party Vendor will interface with CHEC for background investigations.
• Fingerprint all eligible applicants.
• Administer the oath of office to all employees.
• Process applicants' background forms and send them to appropriate staff.
Success Criteria:
• Electronic fingerprints, application forms, and photographs are effectively received from the Third Party Vendor.
• Fingerprints are sent to the FBI.
• Rap sheets are received from the FBI.
Research Questions:
• Who is expected to complete the I-9 Form, or where in the onboarding process should the I-9 be completed?
• What forms will Third Party Vendors process?
• Can Third Party Vendors legally administer the oath of office?
• Who determines the applicant's suitability?
• Can Third Party Vendors conduct onboarding operations in Tribal areas?

Goal: Test training methods to train enumerators and LSOs to effectively conduct NRFU.
Objectives:
• Evaluate the effectiveness of Enumerator and LSO online training.
• Evaluate the effectiveness of Enumerator classroom training utilizing LSOs who will be trained on facilitation skills.
• Evaluate the effectiveness of LSO classroom training utilizing FMOs who will be trained on facilitation skills and subject matter.
• Implement and evaluate the Online Simulated Respondent self-assessment tool.
Success Criteria: Enumerators and LSOs demonstrate acquisition and application of knowledge and skills to perform at the desired level.
Research Questions:
• To what degree do enumerators and LSOs react favorably to the online and classroom training?
• To what degree do enumerators and LSOs acquire the intended knowledge and skills based on their participation in the online and classroom training?
• To what degree do enumerators and LSOs apply what they learned during training when they are on the job?
• To what degree can a simulated respondent interview self-assessment tool predict enumerator on-the-job performance?
• To what degree does live case practice build job confidence for enumerators and LSOs?
Non-ID Processing (Evan Moffett and Frank McPhillips)
Goal: Conduct Real Time Non-ID Processing (RTNP).
Objective: Implement real-time address processing (standardization, MAF matching, geocoding) for Puerto Rico addresses.
Success Criteria: Maximize matching of Non-ID respondent-provided addresses to a valid address record in the Census universe during real-time processing.
Research Question: What were the results from real-time matching and geocoding of Puerto Rico addresses during self-response?

Goal: Conduct post-RTNP automated matching and geocoding (also known as Asynchronous Non-ID Processing).
Objective: Utilize administrative records data to enhance respondent-provided address data, and then make a further attempt to match to an MTdb record and/or derive a census block geocode. This will occur on a transactional basis for each case not matched during RTNP (i.e., individually, not in batches).
Success Criteria: Additional Non-ID responses are matched to MTdb records and/or assigned to census blocks.
Research Question: What were the results from Asynchronous Non-ID Processing (e.g., how many additional matches and geocodes were derived)?

Goal: Conduct manual Non-ID processing concurrent with self-response processing.
Objective: Complete all outstanding stateside manual processing before the first NRFU and U/E workload cut, and keep up with daily turnaround once NRFU and U/E start (this will simulate a 2020 environment where Non-ID is reducing the field enumeration workload as quickly as possible).
Success Criteria: Manual processing catches up with the backlog of cases not resolved during automated processing by the time the initial NRFU cut is taken. This will give us a measure of how much Non-ID can reduce the NRFU workload before the operation even starts.
Research Questions:
• What additional matches were derived during manual Non-ID processing that reduced the NRFU and UE workloads (e.g., how many, geographic distribution, address characteristics, etc.)?
• What additional/updated geocodes were derived during manual processing?

Goal: Conduct office-based address verification (OBAV) for eligible Non-ID cases.
Objective: Attempt to verify the existence and census block location of all eligible addresses from Non-ID processing using geographic reference sources in an office-based environment in order to reduce the field verification workload.
Success Criteria: Maximize the number of addresses verified for eligible Non-ID cases in the office-based operation.
Research Questions:
• How many of the Non-ID cases eligible for address verification could be verified in an office-based operation?
• How does the verification rate for stateside addresses during previous Census Tests (2015-2016) compare to the rate for Puerto Rico addresses from this test?

Goal: Provide workload for the Field Verification operation.
Objective: Identify the field address verification workload to be sent out during NRFU. In addition, after the initial workload is identified, the office-based address verification operation (OBAV) keeps up with the daily workload to identify any new field address verification cases on a case-by-case basis (e.g., as OBAV staff determine that a case cannot be resolved, it is reported to MOCS, and the case can be directed to the field for verification efforts).
Success Criteria: Office-based verification keeps up with the backlog of cases in time to identify the bulk of the Non-ID field address verification work by the time the initial NRFU workload is established.
Research Questions:
• How many of the addresses sent to the field were verified? How many were duplicates of existing MAF units? How many were deleted (not found)? How many were incorrectly geocoded during Non-ID Processing (e.g., were assigned to the wrong block)?

Goal: Test Multiple Response Validation methodologies.
Objective: Utilize multiple respondent validation methods to permit us to evaluate them before the 2018 End-to-End Test. This may include the use of AdRecs as well as other methods determined by the Non-ID Response Validation subteam.
Success Criteria: Respondent validation methods are implemented during the test at a sufficient scale to compare them (e.g., we get enough responses, and enough of them validated through the respective methods).
Research Questions:
• Do other respondent validation methods provide similar results to CARRA's AdRec matching process?
• Do other methods derive additional matches/validation that would fill any "gaps" in the CARRA process?

Note: We are presenting a DBP and Charter to PMGB for Non-ID Response Validation on May 4. Once we get the green light on the scope, etc., we can expand on the other methods we will be utilizing (e.g., fraud detection through IT solutions, trend analysis during post-processing, etc.). However, it would be premature to state them until we have an approved charter.
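The tiers above (RTNP, asynchronous matching, manual processing, and office-based verification) form an escalation pipeline: each response is attempted in real time first, and only unresolved cases fall through to the later, more expensive tiers. Below is a minimal, illustrative Python sketch of that control flow only. It is not the Census Bureau's implementation; the function and data names (standardize, try_match, MAF_INDEX, enhance_with_admin_records) and the sample addresses are hypothetical.

    # Illustrative sketch of the Non-ID processing escalation described above:
    # RTNP -> asynchronous (admin-records-enhanced) matching -> manual/OBAV queue.
    # All names and data are hypothetical; they are not Census Bureau systems.
    from dataclasses import dataclass
    from typing import Dict, Optional, Tuple

    # Stand-in for the MAF/MTdb address index, keyed by standardized address.
    MAF_INDEX: Dict[str, Tuple[str, str]] = {
        "123 CALLE SOL URB LAS FLORES CAROLINA PR 00987": ("MAF-0001", "block-0501011"),
    }

    @dataclass
    class NonIDCase:
        raw_address: str
        maf_id: Optional[str] = None
        block: Optional[str] = None
        resolved_by: Optional[str] = None  # which tier resolved the case

    def standardize(addr: str) -> str:
        """Very rough stand-in for address standardization."""
        return " ".join(addr.upper().replace(",", " ").split())

    def enhance_with_admin_records(addr: str) -> str:
        """Hypothetical enrichment step, e.g., filling in a missing ZIP code."""
        return addr if "00987" in addr else addr + " PR 00987"

    def try_match(case: NonIDCase, addr: Optional[str] = None) -> bool:
        """Attempt a MAF match and census block geocode for an address."""
        hit = MAF_INDEX.get(standardize(addr or case.raw_address))
        if hit:
            case.maf_id, case.block = hit
            return True
        return False

    def process(case: NonIDCase) -> NonIDCase:
        if try_match(case):                     # Tier 1: real time, respondent online
            case.resolved_by = "RTNP"
        elif try_match(case, enhance_with_admin_records(case.raw_address)):
            case.resolved_by = "asynchronous"   # Tier 2: transactional retry
        else:
            case.resolved_by = "manual/OBAV"    # Tiers 3-4: clerical review queue
        return case

    if __name__ == "__main__":
        for raw in ("123 Calle Sol, Urb Las Flores, Carolina PR 00987",
                    "123 Calle Sol, Urb Las Flores, Carolina",
                    "999 Calle Luna, Loiza"):
            c = process(NonIDCase(raw))
            print(f"{raw!r} -> {c.resolved_by} ({c.maf_id}, {c.block})")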
Support Operations Only – Summary of How the Operation will Support the Test
Security, Privacy and Confidentiality (Pete Boudriault and Rainier Suazo-Munoz)

The Security, Privacy, and Confidentiality operation ensures that all operations and systems used in the 2020 Census adhere to the following policies and regulations:
• Appropriate systems and data security.
• Respondent and employee privacy and confidentiality.
• IRS requirements concerning the use of Title 26 data.
Content and Forms Design (Jenny Kim)

Content and Forms Design is responsible for creating, refining, and finalizing:
• Content specifications for all data collection modes (Internet, paper, Census Questionnaire Assistance, NRFU).
• Paper questionnaire design.
• Pretesting of content.
• Mailing and field materials.
Language Services (Jenny Kim)

The Language Services operation will perform the following activities:
• Assess and support language needs of non-English-speaking populations.
• Determine the number of non-English languages and the level of support.
• Optimize non-English content of questionnaires and non-questionnaire materials across data collection modes and operations.
• Ensure cultural relevancy and meaningful translation of questionnaires.
Decennial Translation Office (Jason Kopp)

• Provide translation support to Field Infrastructure for the training materials, job application (both paper and online), and assessment.
• Provide translation support to Content and Forms Design and Language Services for questionnaires and mailing materials.
• Provide translation support to Integrated Partnership and Communications for materials and website.
• Provide translation support to Response Processing.
• Provide translation support to Census Questionnaire Assistance (CQA).
Decennial Service Center (Raphael Corrado, Brian DeVos and Mark Markovic)

The overall goal of the 2020 Census DSC operation is the design and deployment of an integrated service center, which will support field operations and handle all help or service requests initiated by field staff, including Spanish-speaking applicants and staff in Puerto Rico, during the 2020 Census. These services include the following:
• Password resets for all 2020 Census applications, including LUCA.
• Resolution of software and hardware issues from field offices and field staff, such as those experienced by users of the Decennial Applicant Payroll and Personnel System and mobile devices.
• Security incident management, such as petty theft, injuries, and stolen equipment.
• Communications to and from field offices to address such things as outages or software releases.

Major functions of the DSC include the following:
• Provide three major functions supporting 2020 Census Field Operations:
1) Receive requests for service.
2) Respond to requests for service.
3) Report on requests for service.
• Provide Tier-1 support during the 2020 Census Tests. Tier-1 support will consist of resolving simple issues from the field in a specified period of time, such as password resets.
• Provide Tier-1 and Tier-2 support during the 2020 Census field operations. In addition to the Tier-1 support described above, Tier-2 support will consist of more complex issues requiring troubleshooting by specially trained staff with expertise in 2020 Census applications, such as MOJO, COMPASS, and the Listing and Mapping Instrument.
• Implement service-level agreements with Tier-3 support based on current operational standards of practice.
• Serve in a coordination and communication role in the event that a field office executes a Continuity of Operations Plan.
• Archive electronic records generated by the DSC in accordance with Census Bureau archiving policies.
Decennial Logistics Management (Alexa Jones-Puthoff and Shawn Ray)

Decennial Logistics Management will provide logistics management services including:
• procuring warehouse space,
• warehousing,
• inventory management,
• kit assembly,
• deployment of materials, and
• receiving and excessing materials.
IT Infrastructure (Pete Boudriault, Doug Curtner, and Justin McLaughlin)

• MDM
• SunFlower
• Stand up a program-level data repository for administrative records – Gain experience moving to new IT infrastructure (administrative records).
• Alternative NRFU laptop support – Assess laptop alternatives currently existing in the marketplace that could be explored by the agency for use by LSOs.
• Provide a FedRAMP-certified cloud – The internet self-response (Primus), real-time Non-ID processing (RTNP), and online recruiting, application, and self-assessment (CARAT) systems will be hosted in a commercial, FedRAMP-certified cloud.
• Implement device-as-a-service – Gain experience moving to new IT infrastructure (services).
• Utilize an enterprise development, integration, and test environment (EDITE) – Gain experience using an Enterprise shared service.
• The self-response data capture systems will support the language options for the Census Test self-response – Gain experience moving to new IT infrastructure.
Geographic Programs (Evan Moffett and Carrie Butifoker)

The Geographic Programs operation provides the geographic foundation in support of the 2020 Census data collection and tabulation activities within the Master Address File/Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) System. The MAF/TIGER System (software applications and databases) serves as the national repository for all of the spatial, geographic, and residential address data needed for census and survey data collection, data tabulation, data dissemination, geocoding services, and map production.

Components of this operation include:
• Geographic Delineations.
• Geographic Partnership Programs.
• Geographic Data Processing.
• Site Selection.
• Maps.
• Address files.
• GRF-C.
• GRF-N.

Forms Printing and Distribution (Alexa Jones-Puthoff and Mark Matsko)

Note: There are 11 different mailing labels for Puerto Rico.

The Forms Printing and Distribution operation prints and distributes the following paper forms to support the 2020 Census mailing strategy and enumeration of the population:
• Internet invitation letters.
• Reminder postcards.
• Questionnaire mailing packages.
• Materials for other special operations, as required.
Paper Data Capture (Alexa Jones-Puthoff, Mark Matsko, Karen Wyatt-Meyer and Ray Muenzer)

The Paper Data Capture operation captures and converts data from 2020 Census paper questionnaires. This operation includes:
• Document preparation.
• Scanning.
• Optical Character Recognition (OCR).
• Optical Mark Recognition (OMR).
• Key From Image (KFI).
• Editing and checkout.
Integrated Partnership and Communications (Tasha Boone and Mary Bucci)

• Partnership Support – Conduct partnership surges in hard-to-count tracts.
• CEM.
• Statistics in Schools – Develop take-home materials for students and parents.
• Develop and use culturally appropriate in-language materials (Spanish).
• Fully develop the website in Spanish.
• Provide communications support – stakeholder/oversight notification, press releases, events, social media, etc.
• Provide recruiting support.

Census Questionnaire Assistance (Alexa Jones-Puthoff and Kevin Zajac)

For the 2017 Puerto Rico Test, CQA will include:
• Contractor-provided contact center infrastructure and staff to handle inbound assistance calls from respondents (no web chat or email capabilities in 2017).
• Assistance for respondents completing the Census questionnaire, including capturing responses.
• Answering questions about Census processes and operations.
• Interactive Voice Response (IVR) self-service solutions to automate certain tasks.
Response Processing (Raphael Corrado and Charles Fowler)

• Establish the test enumeration universes.
• Manage the enumeration strategies (i.e., contact strategy and followup approach).
• Distribute workload files required for enumeration operations.
• Track enumeration status by case and support determining the course of enumeration based on established business rules.
• Perform required response data collection process editing and race/Hispanic origin coding.
• Perform required post-data collection processing actions in order to prepare the data for the final decennial response file.
• Perform required steps to create a Census Unedited File for data analysis purposes, including unduplicating data via the primary selection algorithm and performing count imputation.
• Serve as the final test data repository for input to required analysis.