






Study Plan for the 2018 End-to-End Census Test Operational Assessment


Response Processing Operation (RPO) Integrated Project Team (IPT)



Draft Pending Final Census Bureau Executive Review and Clearance.







Version 1.3

November 3, 2017

  I. Introduction


The purpose of the 2020 Census is to conduct a census of population and housing and disseminate the results to the President, to the states, and to the American people. The goal of the 2020 Census is to count everyone once, only once, and in the right place. The challenge is to conduct the 2020 Census at a lower cost per household (adjusted for inflation) than the 2010 Census, while maintaining high-quality results.

The 2018 End-to-End Census Test is an important opportunity for the Census Bureau to ensure an accurate count of the nation’s increasingly diverse and rapidly growing population. It is the first opportunity to apply lessons learned from census tests conducted throughout the decade in preparation for the nation’s once-a-decade population and housing census. The 2018 End-to-End Census Test will have a Census Day of April 1, 2018, and will be conducted in three areas, covering more than 700,000 housing units: Pierce County, Washington; Providence County, Rhode Island; and Bluefield-Beckley-Oak Hill, West Virginia.


The 2018 End-to-End Census Test will be a dress rehearsal for most of the 2020 Census operations, procedures, systems, and field infrastructure to ensure proper integration and conformance with functional and non-functional requirements. The test also will produce a prototype of geographic and data products, and will validate the 2020 Census design and cost estimate. Note that because the 2018 End-to-End Census Test is not being conducted in a “full decennial census environment,” the results may not replicate the results to be obtained in the 2020 Census.


This study plan documents how the Response Processing Operation (RPO) will be assessed, guided by the questions to be answered.


  II. Background


This section provides background information on the Response Processing Operation by discussing previously used organizational structures and systems. It also highlights areas of the operation where innovation has occurred for the 2020 Census.

Organizational Structure


During the 2010 Census, some functions now performed by the Response Processing Operation (RPO) were performed by other operations. Creation and management of the enumeration universe was performed by an operation called Universe Control and Management (UCM). The Decennial Management Division (DMD), which managed the 2010 Census, was reorganized into the Decennial Census Management Division (DCMD) in preparation for the 2020 Census.



Systems Used


During the 2010 Census, the Census Bureau used the UCM to perform its case management function and the Response Processing System (RPS) to process response data. During the 2020 Census, functionality previously performed by the UCM and RPS systems will be performed by the Control and Response Data System (CaRDS) for universe creation, the Enterprise Census and Survey Enabling-Operational Control System (ECaSE-OCS) for data collection control and management, and the Decennial Response Processing System (DRPS) for post-data collection processing.


2020 Census Innovation Areas


During the 2010 Census, response processing was managed by three separate integrated system teams focused on the following stages: universe creation and management, data capture and integration, and post-processing. Major innovations to the census design in other operations created corresponding opportunities for innovation in response processing. For example, the universe of living quarters to be worked in Nonresponse Followup (NRFU) is dynamic in an automated operation: during NRFU production, if responses are received through other modes, such as the internet, cases can be removed from the NRFU case list. Many of these innovations are expected to reduce the cost of the 2020 Census compared to repeating the 2010 Census design. As a result of some of these innovations, it has become necessary to treat response processing as one operation that integrates all three of the steps described above, rather than as a linear progression of three separate components.


Opportunities to innovate include the following:

  • Use enterprise-developed tools to facilitate intelligent business decisions prior to and during data collection.

  • Interface with all printing systems for production of paper products.

  • Serve as the overall integration “manager” of response data collection, including Internet Self-Response (ISR), Census Questionnaire Assistance (CQA), Paper Data Capture (PDC), and NRFU.

  • Create models based on established business rules to determine the appropriate course of enumeration action for cases (e.g., person visit, use of administrative records and third-party data) and assign each case to the specific mode for data collection.

  • Expand the use of administrative records and third-party data in post-data collection processing activities to support improved data coverage and to reduce the NRFU workload.

  • Expand the use of automated technology, communications monitoring, and improved computational modeling and data analytic techniques to provide early warnings of potentially fraudulent returns.

  III. Assumptions/Design Issues


The sections below list assumptions and design issues pertinent to this assessment of the Response Processing Operation.

Assumptions


  • RPO will use the enterprise-developed system solutions: CaRDS for universe creation and the ECaSE-OCS for data collection control and management.

  • RPO will use the DRPS for post-data collection processing.

  • The Fraud Detection working group will ensure that all self-response data collected is analyzed for fraud.


Design Issues


  • RPO is working with the Population Division (POP) and the Social, Economic, and Housing Statistics Division (SEHSD) to determine processes for data review.

  • RPO is working with the Center for Disclosure Avoidance Research (CDAR) to determine the process for disclosure avoidance.

  • Details of the interactions between the Evaluation and Experiments Branch (EAE) and RPO during the data collection and post-data collection phases are still being worked out and are not depicted in the Business Process Models (BPMs).


  IV. Scope of Assessment Content and Questions To Be Answered


Throughout a three-phase lifecycle, RPO manages 1) the set of all addresses where collection activities will take place, called the enumeration universe, 2) data variables and data standardization, 3) the response status of each address, and 4) all answers from respondents, called response data. The three phases are:

  • Pre-data collection

  • Data collection

  • Post-data collection


RPO closely interacts with response data collection operations including:

  • Census Questionnaire Assistance (CQA)

  • Group Quarters (GQ)

  • Internet Self-Response (ISR)

  • Nonresponse Followup (NRFU)

  • Paper Data Capture (PDC)

  • Update Leave (UL)

Pre-Data Collection Phase


During the pre-data collection phase, RPO provides the initial universes to each data collection operation through the use of ECaSE-OCS.


Data Collection Phase


During the data collection phase, RPO uses ECaSE-OCS to provide daily workloads to each of the collection operations. These workloads include additions of newly found housing units (HUs) to the Master Address File (MAF), changes of HUs from one collection operation to another, and collection status of each HU.
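To make the nature of these daily updates concrete, the minimal sketch below applies one day's worth of update records to per-operation case lists. The record layout, field names, and the "resolved" status value are hypothetical illustrations only; they are not the actual ECaSE-OCS interfaces or file specifications.

    # Illustrative sketch only: hypothetical record layout and status values,
    # not the actual ECaSE-OCS interfaces or file specifications.
    from dataclasses import dataclass

    @dataclass
    class WorkloadUpdate:
        maf_id: str        # hypothetical housing-unit identifier
        action: str        # "add", "move", or "status"
        operation: str     # target collection operation, e.g., "NRFU" or "UL"
        status: str = ""   # collection status, used when action == "status"

    def apply_daily_updates(workloads, updates):
        """Apply one day's updates to per-operation case lists (sets of MAF IDs)."""
        for u in updates:
            if u.action == "add":
                workloads.setdefault(u.operation, set()).add(u.maf_id)
            elif u.action == "move":
                for cases in workloads.values():
                    cases.discard(u.maf_id)
                workloads.setdefault(u.operation, set()).add(u.maf_id)
            elif u.action == "status" and u.status == "resolved":
                # For example, a completed self-response removes the case from NRFU.
                workloads.get("NRFU", set()).discard(u.maf_id)
        return workloads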


Post-Data Collection Phase


During the post-data collection phase, RPO manages all response data collected through all of the collection operations. Where multiple responses exist for a single HU, DRPS chooses the best response through a set of criteria implemented in the Primary Selection Algorithm (PSA); a simplified sketch of this type of selection logic follows the list of outputs below. DRPS performs edits and imputations on the collected response data. CDAR protects respondent privacy by applying disclosure avoidance algorithms to the response data. The final outputs of RPO's post-data collection phase are:

  • Final Collection Master Address File Extract (MAFX)

  • Final Tabulation MAFX

  • Decennial Response File (DRF)

  • Census Unedited File (CUF)

  • Census Edited File (CEF)

  • Microdata Detail File (MDF)
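The actual PSA criteria are specified in DRPS documentation; the following is only a simplified, hypothetical sketch of how a single primary response might be chosen when multiple responses exist for the same housing unit. The mode ordering and tie-breaking rules shown are assumptions for illustration, not the real business rules.

    # Simplified, hypothetical sketch of primary-selection-style logic.
    # The mode priorities and tie-breakers below are NOT the actual PSA criteria.
    MODE_PRIORITY = {"ISR": 0, "CQA": 1, "PDC": 2, "NRFU": 3}  # assumed ordering

    def rank(response):
        # Lower is better: prefer modes by assumed priority, then more reported
        # persons, then the earliest receipt timestamp.
        return (MODE_PRIORITY.get(response["mode"], 99),
                -response["person_count"],
                response["received"])

    def select_primary_responses(responses):
        """Keep one response per housing unit from dicts with keys
        'maf_id', 'mode', 'person_count', and 'received' (a sortable timestamp)."""
        best = {}
        for r in responses:
            current = best.get(r["maf_id"])
            if current is None or rank(r) < rank(current):
                best[r["maf_id"]] = r
        return list(best.values())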


The 2010 Census Universe Control and Management/Response Processing System Assessment Study Plan (U.S. Census Bureau, 2011) focused on calculating, comparing, and analyzing various types of counts. This study plan instead investigates a set of questions about the role of RPO as a central coordinator of response data among the various operations leading to the 2018 End-to-End (E2E) Census Test and the 2020 Census. RPO helps coordinate the needs, sensitivities, and constraints of different divisions and operations. As part of this work, RPO helps to solicit and manage requirements, ensure that specifications needed by a development group are completed and reviewed in time for development to occur as scheduled, and ensure that capabilities are properly built into the system architecture and thoroughly tested.


This Study Plan seeks to answer the following questions:


  1. Were all of the initial workloads (Sample Delivery File (SDF), Mailing/Print workloads, NRFU workloads, GQ, UL, Field Verification (FV), Fraud Detection, Coverage Improvement (CI)) identified and sent successfully? Were workload updates sent successfully?

  2. Were there issues that arose during race and ethnicity coding?

  3. Was the Administrative Record enumeration data received as expected?

  4. Were CI and GQ person-level alternate addresses successfully sent to the matching and geocoding process, and the results successfully linked back to the response data?

  5. Were there issues with processing the post-data collection files (Final Collection MAFX, DRF, CUF, Final Tabulation MAFX, CEF, MDF)?

    • Were there issues with the input files?

    • Were there issues with the output files?

  6. Were all data elements incorporated in the Metadata Registry (MDR)? Were any elements found that should have been included in the MDR?


  V. Methodology


We will answer the study plan questions using quantitative and qualitative measures. This will be achieved through production status meetings, daily production reports, debriefings with systems teams, data providers, and stakeholders, and a lessons-learned meeting with RPO IPT stakeholders.


  1. Were all of the initial workloads (SDF, Mailing/Print workloads, NRFU workloads, GQ, UL, FV, Fraud Detection, CI) identified and sent successfully? Were workload updates sent successfully?


ECaSE-OCS is responsible for managing all data collection workloads. This involves assessing response statuses as they arrive to determine the next steps in the data collection process.


In general, we will look at the number of records that were identified in a workload and the number of records that were received. We will determine the reasons for any differences using information from production status meetings, daily production reports, and debriefings with systems teams, data providers, and stakeholders.
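As a simple illustration of this check, the sketch below compares identified and received record counts by workload. The workload names and counts are placeholder examples, not production figures.

    # Illustrative comparison of identified versus received workload counts.
    # Workload names and counts are placeholder examples, not production figures.
    identified = {"SDF": 700000, "NRFU": 250000, "UL": 15000}
    received   = {"SDF": 700000, "NRFU": 249850, "UL": 15000}

    for workload, expected in identified.items():
        actual = received.get(workload, 0)
        difference = expected - actual
        if difference:
            print(f"{workload}: {difference} records unaccounted for "
                  f"({actual} of {expected} received); follow up in debriefings")
        else:
            print(f"{workload}: all {expected} records received")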


Specifically, we will look for the following:

  1. If the SDF was not verified, what went wrong?

  2. Upon receiving the mailing/print workload, was the print vendor able to successfully print the questionnaires?

  3. Was the initial NRFU workload successfully identified and made available to the Current Analysis and Estimation System (CAES) in time for Administrative Records modeling?

  4. Were the results of Administrative Records modeling successfully passed back to ECaSE-OCS to then determine how to manage the NRFU workload?

  5. Were sufficient self-response cases correctly removed from the NRFU workload?

  6. Was the FV workload made available to ECaSE-OCS?

  7. Was the CI workload successfully sent to ECaSE-OCS? Was it successfully ingested by the ISR-CI application?



  2. Were there issues that arose during race and ethnicity coding?


Unique codes need to be assigned to write-in entries in order to classify the entries into distinct race and ethnic groups. All race and ethnicity write-ins will be sent to auto-coding, and if no code can be determined, they will be sent to residual coding, where a clerk attempts to assign a code.


To answer this question, we will look at the number of write-in entries, the percent that were auto-coded, the percent that were assigned a code during residual coding, and the percent that were not coded. We expect that production status meetings and debriefings will provide information about coding issues and reasons why entries were not coded.
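A minimal sketch of this tabulation follows, assuming each write-in entry carries a simple coding outcome label ("auto", "residual", or "uncoded"); the labels are illustrative, not the actual coding system output values.

    # Minimal sketch of the coding-rate tabulation. The outcome labels are assumed
    # for illustration; actual categories come from the coding systems' outputs.
    from collections import Counter

    def coding_rates(outcomes):
        """outcomes: iterable of labels such as 'auto', 'residual', or 'uncoded'."""
        counts = Counter(outcomes)
        total = sum(counts.values())
        return {label: round(100.0 * n / total, 1) for label, n in counts.items()}

    # Example with 1,000 write-in entries:
    print(coding_rates(["auto"] * 870 + ["residual"] * 110 + ["uncoded"] * 20))
    # {'auto': 87.0, 'residual': 11.0, 'uncoded': 2.0}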


We will be looking for the following items:

  1. Were there issues generating the coding workload?

  2. Were there issues with certain write-in fields?

  3. Were there issues receiving the coded data?


  3. Was the Administrative Record enumeration data received as expected?


For the first time in this testing cycle, Administrative Record enumeration data will be produced as part of the production flow. The expectation is that these records will be used in place of response records.


We expect to answer this question with information from production status meetings and debriefings with systems teams, data providers, and stakeholders.


We will be looking for the following items:

  1. How many cases were sent to PEARSIS for Administrative Record enumeration? Were they all received?

  2. Was the Administrative Record enumeration data produced in time for DRF processing?

  3. Was the Administrative Record enumeration data made available in the expected format?


  4. Were CI and GQ person-level alternate addresses successfully sent to the matching and geocoding process, and the results successfully linked back to the response data?


For the first time in this testing cycle, person-level alternate addresses will be sent to the matching and geocoding process for use in post-processing. Once response records are received, they need to be assessed to identify which records have addresses that must be sent to the matching and geocoding process.


We will show the number of addresses received, the percent sent to matching, the percent able to be matched, the percent sent to geocoding, the percent able to be geocoded, and the percent resolved.
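The sketch below illustrates computing these percentages as a simple funnel relative to the number of addresses received; the stage names and counts are placeholders, not production figures.

    # Illustrative funnel of matching/geocoding outcomes, expressed as percentages
    # of addresses received. Stage names and counts are placeholders.
    def funnel_rates(stage_counts):
        """stage_counts: list of (stage, count) pairs; the first entry is the base."""
        base = stage_counts[0][1]
        return {stage: round(100.0 * count / base, 1) for stage, count in stage_counts}

    example = [
        ("received", 5000),
        ("sent to matching", 4900),
        ("matched", 4400),
        ("sent to geocoding", 4400),
        ("geocoded", 4200),
        ("resolved", 4200),
    ]
    print(funnel_rates(example))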


We expect to gain insight about issues through production status meetings and debriefings.


We will be looking for the following items:

  1. Was the universe of person-level alternate address data that needed to be sent to matching and geocoding identified correctly?

  2. Did all person-level addresses that were sent to matching and geocoding have a resolution at the end of operations?

  3. When a person-level address had a resolution, was that result successfully appended to the response data for use in post-processing?


  5. Were there issues with processing the post-data collection files (Final Collection MAFX, DRF, CUF, Final Tabulation MAFX, CEF, MDF)?


At the end of data collection operations, all response data is assessed and tabulated in conjunction with the addresses on the Final Collection MAFX or the Final Tabulation MAFX.


We expect to answer this question through production status meetings and debriefings with systems teams, data providers, and stakeholders.


We will be looking for the following items:

  1. Was the Final Collection MAFX able to successfully process new addresses (such as FV records that were verified in the field)?

    1. How many addresses were on the initial SDF compared to the final MAFX?

  2. Were there data issues that resulted in responses being rejected prior to DRF processing? This could include such things as:

    1. Responses having non-unique identifiers.

    2. Responses having invalid values.

  3. Were there issues that resulted in the production DRF, CUF, CEF or MDF being rerun?

    1. This could include assumptions about data that were incorrect and not detected during user acceptance testing.

  4. Were Fraud Detection results made available in time for DRF processing? Were results in the expected format?

  5. Was the NRFU Reinterview (RI) Fail File made available in time for DRF processing? Were results in the expected format?

  6. Was the CUF able to add Protected Identification Keys (PIKs) in time to execute the CEF?


  6. Were all data elements incorporated in the MDR? Were any elements found that should have been included in the MDR?


The MDR contains the list of standardized data elements that should be expected from response data output from any data collection instrument.


We expect to answer this question through production status meetings and debriefings with systems teams, data providers, and stakeholders. In 2017, not all of the expected data elements defined in the MDR were passed by the data collection instruments.


We will be looking for the following items:

  1. Were all MDR items passed as expected?

  2. Were there response elements not defined in the MDR that were passed?


  VI. Risks/Limitations


RPO is participating in periodic risk review meetings and tracking five active project-level risks. RPO has developed risk mitigation plans for two program-level risks. As of September 12, 2017, none of the project-level or program-level risks that RPO is tracking are red.


A risk to this assessment is not having access to the data needed to conduct the quantitative portions of the analysis.


  VII. Measures of Success


Data Elements Coordination Measures


A lesson learned from the 2016 Census Test was that Census Data Elements (DEs), also known as fields or variables, were not well managed. DE names, definitions, and usage varied from collection mode to collection mode. To mitigate this deficiency, DCMD initiated a weekly DE meeting series, starting on June 7, 2016, to create a shared MDR. This work has led to a baseline version of the MDR containing the DE names, definitions, and usages that systems analysts and systems developers will use to build the systems needed to perform the 2017 Census Test, the 2018 E2E Census Test, and the 2020 Census.


The RPO IPT will measure success through analysis of DE names and contents in the XML response data returned from the 2018 E2E Census Test. Incoming response data will be compared to the elements expected in the most recently baselined MDR.
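A minimal sketch of such a comparison is shown below. The MDR element names and the response file name are hypothetical; the actual baselined MDR and the XML schemas define the real expected elements.

    # Minimal sketch of comparing element names in one XML response record against
    # a baselined MDR element list. The element names and file name are hypothetical.
    import xml.etree.ElementTree as ET

    MDR_ELEMENTS = {"MAFID", "RESP_MODE", "POP_COUNT", "TENURE"}  # assumed baseline

    def check_response_elements(xml_path, expected=MDR_ELEMENTS):
        """Return (missing, unexpected) element-name sets for one response record."""
        root = ET.parse(xml_path).getroot()
        found = {element.tag for element in root.iter()} - {root.tag}
        missing = expected - found
        unexpected = found - expected
        return missing, unexpected

    # Example (hypothetical file):
    # missing, unexpected = check_response_elements("response_record.xml")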


Workload Management


The RPO IPT will measure success through the use of daily reports, quantitative analysis, and work with operational stakeholders to monitor daily data collection processes and identify issues that arise during data collection.


Coding (Auto and Residual)

The RPO IPT will measure success through the use of daily reports, quantitative analysis, and work with stakeholders (POP) to determine whether coding completed successfully.


File Creation


The RPO IPT will measure success by monitoring post-data collection reports and working with the systems teams to ensure timely delivery of the files (CUF, CEF, and MDF). RPO will also work alongside stakeholders to ensure the files meet the necessary standards and are approved for delivery and distribution.


Adequacy Measures


RPO will measure the adequacy of requirements and specifications by tracking the number and nature of change requests (CRs) submitted against each deliverable after it is baselined, and by factoring in the impacts of content or design changes made after that baseline.


  VIII. Data Requirements


In order to conduct this assessment, we will need the following data:

  1. Workload management data from Survey Operations Control System (SOCS).

  2. Identification of records that need race and ethnicity coding.

  3. Identification of records that need Administrative Records enumeration.

  4. Identification of records that need alternate address matching and geocoding.

  5. Access to the post-processing files (Final MAFX, DRF, CUF, CEF, MDF).

  6. Access to initial response data to assess whether MDR elements were fulfilled.


  IX. Division Responsibilities


The following divisions will contribute to the assessment of the Response Processing Operation.

Decennial Census Management Division (DCMD)

  • Project management, document review, and providing content for the study plan and assessment report.

  • Reviewing the study plan and reports.

  • Coordinating meetings, statuses, and resolving issues.

Decennial Statistical Studies Division (DSSD)

  • Specifying requirements needed to conduct the 2018 Census Test.

  • Reporting status updates to DCMD.

  • Conducting quantitative analysis as needed.

Census Enterprise Data Collection and Processing Program Management Office (CEDCaP PMO)

  • Specifying requirements needed to conduct the 2018 Census Test.

  • Reporting status updates to DCMD.

Population Division (POP) and Social, Economic, and Housing Statistics Division (SEHSD)

  • Specifying requirements needed to conduct the 2018 Census Test.

  • Reporting status updates to DCMD.

Center for Disclosure Avoidance Research (CDAR)

  • Specifying requirements needed to conduct the 2018 Census Test and apply disclosure avoidance.

  • Reporting status updates to DCMD.

Decennial Information Technology Division (DITD)

  • Providing systems necessary to conduct the 2018 Census Test.

  • Reporting status updates to DCMD.

  X. Milestone Schedule


Activity Name | Orig. Duration (days) | Start | Finish

Response Processing Operation (RPO) Assessment Study Plan

First Draft

Prepare First Draft of Response Processing Operation (RPO) Assessment Study Plan | 5 | 06/06/17 | 06/11/17

Distribute First Draft of Response Processing Operation (RPO) Assessment Study Plan to the Assessment Sponsoring DCMD ADC and Other Reviewers | 1 | 06/12/17 | 06/12/17

Incorporate DCMD ADC and Other Comments to Response Processing Operation (RPO) Assessment Study Plan | 5 | 06/13/17 | 06/19/17

Initial Draft

Prepare Initial Draft Response Processing Operation (RPO) Assessment Study Plan | 5 | 06/20/17 | 06/26/17

Distribute Initial Draft Response Processing Operation (RPO) Assessment Study Plan to Evaluations & Experiments Coordination Branch (EXC) | 1 | 06/27/17 | 06/27/17

EXC Distributes Initial Draft Response Processing Operation (RPO) Assessment Study Plan to the DROM Working Group for Electronic Review | 1 | 06/28/17 | 06/28/17

Receive Comments from the DROM Working Group on the Initial Draft Response Processing Operation (RPO) Assessment Study Plan | 5 | 06/29/17 | 07/05/17

Schedule the RPO Study Plan for the IPT Lead to Meet with the DROM Working Group | 1 | 07/06/17 | 07/06/17

Discuss DROM Comments on Initial Draft Response Processing Operation (RPO) Assessment Study Plan | 10 | 07/07/17 | 07/20/17

Final Draft

Prepare Final Draft of Response Processing Operation (RPO) Assessment Study Plan | 15 | 07/21/17 | 08/10/17

Distribute Final Draft of Response Processing Operation (RPO) Assessment Study Plan to the DPMO and the EXC | 1 | 08/11/17 | 08/11/17

Discuss Final Draft Response Processing Operation (RPO) Assessment Study Plan with the 2020 PMGB | 10 | 08/12/17 | 08/25/17

Incorporate 2020 PMGB Comments for Response Processing Operation (RPO) Assessment Study Plan | 10 | 08/26/17 | 09/08/17

Prepare FINAL Response Processing Operation (RPO) Assessment Study Plan | 10 | 09/11/17 | 09/19/17

Distribute FINAL Response Processing Operation (RPO) Assessment Study Plan to the EXC | 1 | 11/02/17 | 11/02/17

DCCO Staff Process the Draft 2020 Memorandum and the RPO Study Plan to Obtain Clearances (DCMD Chief, Assistant Director, and Associate Director) | 10 | 11/03/17 | 11/13/17

DCCO Staff Formally Release the RPO Study Plan in the 2020 Memorandum Series | 1 | 11/14/17 | 11/14/17

Response Processing Operation (RPO) Assessment Report

First Draft of Assessment Report

Receive, Verify, and Validate RPO Data | 20 | 03/23/18 | 04/12/18

Examine Results and Conduct Analysis | 20 | 04/13/18 | 05/03/18

Prepare First Draft of RPO Report | 15 | 05/04/18 | 05/19/18

Distribute First Draft of RPO Report to the Assessment Sponsoring DCMD ADC and Other Reviewers | 1 | 05/20/18 | 05/20/18

Incorporate DCMD ADC and Other Comments to RPO Report | 5 | 05/21/18 | 05/26/18

Initial Draft of Assessment Report

Prepare Initial Draft Response Processing Operation (RPO) Assessment Report | 10 | 05/27/18 | 06/06/18

Distribute Initial Draft Response Processing Operation (RPO) Assessment Report to Evaluations & Experiments Coordination Branch (EXC) | 1 | 06/07/18 | 06/07/18

EXC Distributes Initial Draft Response Processing Operation (RPO) Assessment Report to the DROM Working Group for Electronic Review | 1 | 06/08/18 | 06/08/18

Receive Comments from the DROM Working Group on the Initial Draft Response Processing Operation (RPO) Assessment Report | 10 | 06/09/18 | 06/19/18

Schedule the RPO Report for the IPT Lead to Meet with the DROM Working Group | 1 | 06/20/18 | 06/20/18

Discuss DROM Comments on Initial Draft Response Processing Operation (RPO) Assessment Report | 11 | 06/21/18 | 07/02/18

Final Draft of Assessment Report

Prepare Final Draft of Response Processing Operation (RPO) Assessment Report | 25 | 07/03/18 | 07/28/18

Distribute Final Draft of Response Processing Operation (RPO) Assessment Report to the DPMO and the EXC | 1 | 07/29/18 | 07/29/18

Discuss Final Draft Response Processing Operation (RPO) Assessment Report with the 2020 PMGB | 10 | 07/30/18 | 08/09/18

Incorporate 2020 PMGB Comments for Response Processing Operation (RPO) Assessment Report | 10 | 08/10/18 | 08/20/18

Final Assessment Report

Prepare FINAL Response Processing Operation (RPO) Assessment Report | 10 | 08/21/18 | 08/31/18

Deliver FINAL Response Processing Operation (RPO) Assessment Report to the EXC | 1 | 09/01/18 | 09/01/18

EXC Staff Distribute the FINAL Response Processing Operation (RPO) Report and 2020 Memorandum to the DCCO | 3 | 09/02/18 | 09/05/18

DCCO Staff Process the Draft 2020 Memorandum and the FINAL Response Processing Operation (RPO) Report to Obtain Clearances (DCMD Chief, Assistant Director, and Associate Director) | 30 | 09/06/18 | 10/09/18

DCCO Staff Formally Release the FINAL Response Processing Operation (RPO) Report in the 2020 Memorandum Series | 1 | 10/10/18 | 10/10/18

EXC Staff Capture Recommendations of the FINAL Response Processing Operation (RPO) Report in the Census Knowledge Management SharePoint Application | 1 | 10/11/18 | 10/11/18


  XI. Review/Approval Table


Role | Electronic Signature | Date

Fact Checker or independent verifier | |

Author’s Division Chief (or designee) | |

DCMD ADC | |

DROM DCMD co-executive sponsor (or designee) | |

DROM DSSD co-executive sponsor (or designee) | |

Associate Director for R&M (or designee) | |

Associate Director for Decennial Census Programs (or designee) | |





  XII. Document Revision and Version Control History


VERSION/EDITOR | DATE | REVISION DESCRIPTION | EAE IPT CHAIR APPROVAL

V0.01 / Charles DeRosa | 12/20/2016 | Initial draft |

V0.02 / Charles DeRosa | 3/7/2017 | Second draft |

V0.03 / Charles DeRosa | 3/23/2017 | Third draft; added material on study focus and purpose; added previous study plan as a reference |

V0.04 / Charles DeRosa | 4/3/2017 | Fourth draft; accepted changes from v0.03; added new material |

V0.05 / Charles DeRosa | 4/6/2017 | Re-ordered and numbered study plan questions; defined the RPO methodology, risks, measures of success, and data requirements |

V0.06 / Charles DeRosa | 5/4/2017 | Added a section titled “Data Elements Coordination Measures” to Section VII, “Measures of Success” |

V0.07 / Charles DeRosa | 5/16/2017 | Incorporated Miranda Chung’s comments |

V0.08 / Tom Thornton | 6/12/2017 | Incorporated RPO IPT comments |

V1.0 / Juan Morales | 9/19/2017 | Incorporated DROM comments |

V1.1 / Ryan King | 10/27/2017 | Incorporated updated methodology following Pat Cantwell’s rejection of V1.0 |

V1.2 / Juan Morales | 11/2/2017 | Updated schedule section |

V1.3 / Mary Frances Zelenak | 11/3/2017 | Reviewed and addressed minor formatting and consistency issues |




  XIII. Glossary of Acronyms


Acronym | Definition

ADC | Assistant Division Chief

BPMs | Business Process Models

CAES | Current Analysis and Estimation System

CaRDS | Control and Response Data System

CDAR | Center for Disclosure Avoidance Research

CEDCaP | Census Enterprise Data Collection and Processing

CEF | Census Edited File

CI | Coverage Improvement

CQA | Census Questionnaire Assistance

CRs | Change Requests

CUF | Census Unedited File

DCCO | Decennial Census Communications Office

DCMD | Decennial Census Management Division

DE | Data Element

DITD | Decennial Information Technology Division

DMD | Decennial Management Division

DPMO | Decennial Program Management Office

DRF | Decennial Response File

DROM | Decennial Research Objectives and Methods Working Group

DRPS | Decennial Response Processing System

DSSD | Decennial Statistical Studies Division

E2E | End-to-End

EAE | Evaluation and Experiments

ECaSE-OCS | Enterprise Census and Survey Enabling - Operational Control System

EXC | Evaluations & Experiments Coordination Branch

FV | Field Verification

GQ | Group Quarters

HU | Housing Unit

IPT | Integrated Project Team

ISR | Internet Self-Response

MAF | Master Address File

MAFX | Master Address File Extract

MDF | Microdata Detail File

MDR | Metadata Registry

NRFU | Nonresponse Followup

PDC | Paper Data Capture

PMGB | Portfolio Management Governance Board

POP | Population Division

PSA | Primary Selection Algorithm

R&M | Research & Methodology Directorate

RI | Reinterview

RPO | Response Processing Operation

RPS | Response Processing System

SDF | Sample Delivery File

SEHSD | Social, Economic, and Housing Statistics Division

SOCS | Survey Operations Control System

UCM | Universe Control and Management

UL | Update Leave

UE | Update Enumerate


  XIV. Reference

U.S. Census Bureau (2011), “2010 Census Universe Control and Management/Response Processing System Assessment Study Plan,” 2010 Census Planning Memoranda Series No. 128, March 9, 2011.



