PROGRAM
FOR THE INTERNATIONAL
ASSESSMENT OF ADULT COMPETENCIES (PIAAC)
2013-2014
NATIONAL SUPPLEMENT DATA COLLECTION
REQUEST FOR OMB CLEARANCE
OMB# 1850-0870 v.3
Supporting Statement Part B
Prepared by:
National Center for Education Statistics
U.S. Department of Education
Washington, DC
February 12, 2013
B. Collection of Information
The PIAAC National Supplement is not designed to yield a nationally representative sample of the entire population; rather, it is intended to provide nationally representative samples of specific subgroups of interest. These subgroups consist of (a) the 66- to 74-year-old general population, (b) adults in prison (ages 16-74), (c) unemployed adults (ages 16-65), and (d) young adults (ages 16-34) who are either employed or not in the labor force. To reach persons in these specific subgroups, two sampling efforts will be deployed: one to select individuals for a household-based sample and the other to select individuals for a prison sample.
Household-based sample
The target population for the National Supplement’s household-based sample consists of non-institutionalized adults, 16 to 74 years old1, who reside in the United States at the time of interview, excluding adults 35-65 who are either employed or not in the labor force. Adults are to be included regardless of citizenship, nationality, or language. The household-based target population includes only persons living in households or non-institutional group quarters; it excludes all other persons (such as persons living in shelters or prisons, military personnel who live in barracks or on bases, and persons who live in institutionalized group quarters, such as hospitals or nursing homes). The household-based target population includes full-time and part-time members of the military who do not reside in military barracks or on military bases; adults in other non-institutional collective dwelling units, such as workers’ quarters or halfway homes; and adults living at school in student group quarters, such as dormitories, fraternities, or sororities. Persons who are temporarily in the country may be eligible depending upon how long they have been in the country. Adults who are unable to complete the assessment because of a hearing impairment, blindness/visual impairment, or physical disability are in-scope; however, because the assessment does not offer accommodations for physical disabilities, they are excluded from response rate computations.
Because the PIAAC National Supplement is meant to augment the PIAAC Main Study with an additional sample of the specific subgroups, the National Supplement will use the same primary sampling units (PSUs), which are counties or groups of counties, and the same secondary sampling units (SSUs or segments), which are groups of Census blocks, that were used for the Main Study. Thus, the design for the National Supplement household sample is a four-stage, stratified area probability sample that originates in the Main Study PSUs and segments. This is an area-clustered sample design, which is the most efficient design for the household-based sample because the survey involves in-person interviews and assessments. (Clustered samples minimize the amount of interviewer travel and thus reduce the cost of the survey.) It is also a stratified sample design2 to ensure the resulting sample is representative in terms of characteristics related to adult literacy and competency, such as education, age, gender, income, and geographic location.
As in the PIAAC Main Study, the National Supplement’s segments consist of at least 60 dwelling units (DUs) in area blocks (as defined by the 2000 decennial census) or combinations of two or more nearby blocks. Within each PSU, the segment sampling frame was formed using the block data from the Census Summary File 1 (SF1), sorted by tract, block group, and block number before creating the segments. Blocks with no DUs and no population were included on the frame so that all areas, some of which contained DUs constructed after the 2000 Census, were involved in the formation process. Once segments were formed, the number of DUs in each segment was compared with counts of residential addresses from the November 2010 Computerized Delivery Sequence File from the United States Postal Service (USPS).
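To make the segment-formation step concrete, the following is a minimal illustrative sketch (in Python) of how sorted Census blocks could be aggregated into segments of at least 60 DUs. The data fields and the greedy aggregation rule shown here are simplifying assumptions for illustration only; the actual Westat segment-formation procedures may differ in detail.

# Illustrative sketch only: greedy aggregation of Census blocks (sorted by
# tract, block group, and block number) into segments of at least 60 dwelling
# units (DUs). Field names and the aggregation rule are assumptions.

MIN_DUS_PER_SEGMENT = 60

def form_segments(blocks):
    """blocks: list of dicts with keys 'tract', 'block_group', 'block', 'dus',
    restricted to one PSU. Returns a list of segments, each a list of nearby
    blocks whose combined DU count reaches the minimum (any undersized
    remainder is folded into the previous segment)."""
    # Sort geographically so combined blocks are near one another.
    blocks = sorted(blocks, key=lambda b: (b["tract"], b["block_group"], b["block"]))

    segments, current, current_dus = [], [], 0
    for block in blocks:
        current.append(block)
        current_dus += block["dus"]          # blocks with 0 DUs stay on the frame
        if current_dus >= MIN_DUS_PER_SEGMENT:
            segments.append(current)
            current, current_dus = [], 0

    # Fold any undersized remainder into the previous segment.
    if current:
        if segments:
            segments[-1].extend(current)
        else:
            segments.append(current)
    return segments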
Within the segments, a sample of DUs not selected for the PIAAC Main Study will be drawn from the existing DU listings that make up the PIAAC Main Study DU sampling frame. The fourth stage of selection is a sample of eligible persons within DUs. The result will be a sample directly linked to the PIAAC Main Study at the PSU and segment level. Because the same PSUs and SSUs selected for the PIAAC Main Study will be used for the National Supplement, no new sampling frames are needed at those stages. A frame of persons will, however, need to be created within each DU selected for the National Supplement. For each selected DU, a screener interview will be used to identify the eligible persons, and a sampling algorithm implemented within the Computer-Assisted Personal Interview (CAPI) system will select persons among those identified as eligible.
Prison sample
The target population for the National Supplement’s prison sample is inmates aged 16 to 74 in eligible state, federal, and private prisons in the United States. A stratified two-stage sample will be used to select inmates. At the first stage, 100 prisons will be selected from the frame, with probabilities proportionate to a measure of size (MOS). At the second sampling stage, an average of 15 inmates will be selected from each sampled facility. This design is similar to that used in the successful implementation of the 2003 National Assessment of Adult Literacy (NAAL) prison study.
Once individual persons are selected, in both the household-based and the prison samples, they will complete the background questionnaire (BQ) interview. Upon completion of the BQ, the selected person will answer the Core Task items. Respondents who pass the Core Task will be given a computer-based assessment; those who do not pass will be given a paper-and-pencil assessment booklet.
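The routing that follows the Core Task can be summarized in a brief sketch; the function name and pass/fail handling below are illustrative only and are not part of the PIAAC instrumentation.

# Minimal routing sketch (illustrative): after the BQ and Core Task items,
# a respondent is assigned one of the two assessment modes.

def assessment_mode(passed_core_task: bool) -> str:
    """Return the assessment mode for a respondent who completed the BQ and Core Task."""
    if passed_core_task:
        return "computer-based assessment"
    return "paper-and-pencil assessment booklet"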
Sample Sizes and Response Rates - Household-based Sample
To ensure that the National Supplement household-based target number of completed cases (3,600) is achieved, the initial sample size must account for (a) ineligibility (i.e., vacant dwelling units and households without an unemployed person 16 to 65 years old, a 66- to 74-year-old adult, or a 16- to 34-year-old person who is not unemployed), (b) screener nonresponse, (c) within-household selection rates, and (d) nonresponse to the BQ and the assessment. Specific within-household selection rates are necessary to arrive at target sample sizes (not too many or too few) for the key subgroups mentioned above. We expect the response rates to be similar to those experienced in the PIAAC Main Study conducted during 2011-2012. If the actual response rates do not meet the NCES standards for response rate goals, a nonresponse bias analysis will be conducted for each stage of data collection that does not meet the standards (see Appendix F for preliminary plans for conducting these analyses). Given the results of the PIAAC Main Study, we expect the overall response rate to be 70 percent if the incentive level is $50, the same level used in the PIAAC Main Study in 2011-2012.
The occupancy rate is expected to be about 85 percent, similar to what was experienced in the PIAAC Main Study. The drop between the number of completed screeners (12,483) and the number of attempted BQs (4,448) is due to households that do not have at least one unemployed person 16 to 65 years old, a 66- to 74-year-old person (regardless of employment status), or a 16- to 34-year-old person who is not unemployed, and to the selection rates applied for each sampling group within a household. The screener and BQ response rates are consistent with the corresponding weighted response rates for the PIAAC Main Study sample, and the assessment response rate is assumed to be similar to the PIAAC Main Study rate. Overall, a 70 percent response rate is expected. Table 4 provides a summary of the sample sizes and the response rate assumptions at each sampling stage for the household-based sample.
Table 4. Sample sizes and response rate assumptions at each sampling stage for the household-based sample1

Survey and sampling stages | Eligibility and response rates | Projected rates | Sample yield
Number of selected PSUs | | | 80
Number of selected segments | | | 896
Number of selected dwelling units | Occupied dwelling unit rate | 85.0% | 16,978
 | Screener response rate | 86.5% |
Number of completed screeners | Eligibility rate | 35.63% | 12,483
Number of attempted BQs | BQ response rate | 82.5% | 4,448
Number of persons with completed BQs | Assessment completion rate | 98.1% | 3,670
Number of completed or partially completed assessments | | | 3,600
1 The screener, BQ and assessment response rates and occupied dwelling unit rate were determined based on the PIAAC Main Study experience.
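The yield chain in Table 4 can be reproduced with simple arithmetic, shown below as an illustrative Python sketch; small differences from the table are due to rounding.

# Back-of-the-envelope check of the Table 4 yield chain using the projected rates.

selected_dus     = 16_978
occupied_rate    = 0.850    # occupied dwelling unit rate
screener_rate    = 0.865    # screener response rate
eligibility_rate = 0.3563   # share of screened households yielding a sampled person
bq_rate          = 0.825    # background questionnaire response rate
assessment_rate  = 0.981    # assessment completion rate

completed_screeners = selected_dus * occupied_rate * screener_rate   # ~12,483
attempted_bqs       = completed_screeners * eligibility_rate         # ~4,448
completed_bqs       = attempted_bqs * bq_rate                        # ~3,670
completed_assess    = completed_bqs * assessment_rate                # ~3,600

overall_response = screener_rate * bq_rate * assessment_rate         # ~0.70
print(round(completed_assess), round(overall_response, 2))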
Sample Sizes and Response Rates - Prison Sample
To ensure that the National Supplement prison sample target number of completed assessments (1,200) can be achieved, the initial prison sample size must account for (a) ineligibility, (b) prison nonresponse, and (c) inmate nonresponse. We expect the response rates to be similar to those experienced in the 2003 NAAL inmate sample. As with the household-based sample, if the actual response rates do not meet the NCES standards for response rate goals, a nonresponse bias analysis will be conducted for each stage of data collection that does not meet the standards. Given the results of the 2003 NAAL inmate sample, we expect the overall response rate to be about 85 percent. Table 5 provides a summary of the sample sizes and the response rate assumptions at each sampling stage for the prison sample.
Table 5. Sample sizes and response rate assumptions at each sampling stage for the prison sample1

Survey and sampling stages | Eligibility and response rates | Projected rates | Sample yield
Number of selected prisons | Prison response rate | 96% | 100
 | Eligibility rate | 97% |
Number of participating prisons | Average number of inmates selected per prison | 14.63 | 93
Number of attempted BQs | BQ response rate | 90% | 1,361
Number of persons with completed BQs | Assessment completion rate | 98% | 1,224
Number of completed or partially completed assessments | | | 1,200
1 The eligibility and response rates are consistent with the 2003 NAAL prison sample.
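Similarly, the Table 5 rates can be used to work backwards from the 1,200 target to the average number of inmates selected per participating prison. The short sketch below is illustrative arithmetic only; small differences from the table are due to rounding.

# Working backwards from the 1,200 target using the Table 5 rates.

selected_prisons   = 100
prison_resp_rate   = 0.96
prison_elig_rate   = 0.97
bq_rate            = 0.90
assessment_rate    = 0.98
target_assessments = 1_200

participating        = selected_prisons * prison_resp_rate * prison_elig_rate   # ~93
attempted_bqs_needed = target_assessments / (bq_rate * assessment_rate)         # ~1,361
inmates_per_prison   = attempted_bqs_needed / round(participating)              # ~14.6

print(round(participating), round(attempted_bqs_needed), round(inmates_per_prison, 2))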
This section describes the sample design for the PIAAC National Supplement. A multi-stage design will be employed for the National Supplement, and the sample selection approach is described for each sampling stage.
Statistical Methodology – Household-based Sample
As stated above, the same PSUs and segments selected in the first two stages of sampling for the PIAAC Main Study will be used for the National Supplement. The third stage of sampling for the PIAAC National Supplement household-based sample will involve selecting an initial sample of about 16,978 DUs from the listings of addresses in the selected segments in order to arrive at 3,600 completed assessments. All DUs within each selected segment were listed by trained Westat listers for the PIAAC Main Study. Given the actual number of listed DUs and derived sampling rates for each segment, dwelling units will be selected from the listing sheets at the Westat home office.
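As an illustration of how DUs might be drawn from an existing segment listing at a derived rate while excluding DUs already in the Main Study sample, the following sketch uses systematic selection with a random start. The mechanics and function names are assumptions for illustration; the operational Westat selection specifications are not reproduced here.

# Illustrative systematic selection of dwelling units from a segment listing
# at a segment-specific rate, excluding DUs already selected for the Main Study.
import random

def select_dus(listing, main_study_ids, rate, seed=None):
    """listing: ordered list of DU ids on the segment listing sheet.
    main_study_ids: set of DU ids already in the Main Study sample.
    rate: segment-specific sampling rate (0 < rate <= 1)."""
    eligible = [du for du in listing if du not in main_study_ids]
    interval = 1.0 / rate
    rng = random.Random(seed)
    k = rng.uniform(0, interval)       # random start within the first interval
    picks = []
    while k < len(eligible):
        picks.append(eligible[int(k)])
        k += interval
    return picks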
The fourth stage of selection involves listing the age-eligible household members (aged 16 to 74) for each selected dwelling unit during the screener interview. The enumeration and selection of persons will be performed using the CAPI system, which will collect information via the screener instrument, including age and gender of persons in the dwelling unit. The employment status of each age-eligible person will then be determined from a short series of questions. Prior to selection, the individuals will be stratified into the following three groups: (Group 1) unemployed, 16-65 years old; (Group 2) not unemployed, 16-34 years old; and (Group 3) 66-74 years old regardless of employment status. Subsequently, households without an individual in one of the above groups will ‘screen out’ based on employment status and age. Selection rates will be assigned for each group such that the overall target sample sizes may be achieved. For Group 1, all persons (up to 4) will be selected. For Group 2, a predetermined rate will be applied to determine if any selection will occur; if so, a second rate will be applied to select 1 person. For Group 3, a predetermined rate will be applied to determine if any selection will occur; if so, a second rate will be applied to select up to 2 persons 66-74 years old.
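The within-household selection logic can be sketched as follows. The group-specific rates shown are placeholders only; the production rates will be derived to meet the subgroup sample-size targets, and the operational CAPI algorithm may differ in detail.

# Hedged sketch of the within-household selection logic described above.
import random

# Placeholder selection rates (not the actual production values).
GROUP2_HH_RATE, GROUP2_PERSON_RATE = 0.30, 1.0   # not unemployed, 16-34
GROUP3_HH_RATE, GROUP3_PERSON_RATE = 0.50, 1.0   # 66-74, any employment status

def classify(person):
    """Assign an age-eligible household member to a sampling group, or None."""
    age, unemployed = person["age"], person["unemployed"]
    if unemployed and 16 <= age <= 65:
        return 1
    if not unemployed and 16 <= age <= 34:
        return 2
    if 66 <= age <= 74:
        return 3
    return None                                   # e.g., employed 35-65: screens out

def select_persons(household, rng=random):
    groups = {1: [], 2: [], 3: []}
    for p in household:
        g = classify(p)
        if g:
            groups[g].append(p)

    selected = groups[1][:4]                      # Group 1: take all, up to 4
    if groups[2] and rng.random() < GROUP2_HH_RATE and rng.random() < GROUP2_PERSON_RATE:
        selected.append(rng.choice(groups[2]))    # Group 2: at most 1 person
    if groups[3] and rng.random() < GROUP3_HH_RATE and rng.random() < GROUP3_PERSON_RATE:
        selected.extend(rng.sample(groups[3], min(2, len(groups[3]))))  # Group 3: up to 2
    return selected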
Household members who are away in college (staying at college dormitories) will be considered to be part of their family’s household. If it is not possible to reach the students at the family homes during the data collection period, an interview will be arranged with them at college, if they reside within or adjacent to one of the 80 PSUs. Westat successfully applied the same procedure for the PIAAC Main Study.
Statistical Methodology – Prison Sample
The target population for the National Supplement prison sample is inmates aged 16 to 74 in eligible state, federal, and private prisons in the United States. To arrive at a minimum of 1,200 completed cases, a two-stage sample will be used to select inmates. The first-stage sampling units (or PSUs) will be state or federal adult correctional facilities. At the first stage, 100 prisons will be selected from the frame, with probabilities proportionate to a measure of size (MOS). At the second sampling stage, an average of 15 inmates will be selected from each sampled facility. Unlike the household-based component of the National Supplement, persons selected for the inmate sample will not be paid a monetary incentive.
The prison sample sampling frame will be created in a manner similar to that for the 2003 NAAL prison sample primarily from two data sources: the most recent Bureau of Justice Statistics Census of State and Federal Adult Correctional Facilities (referred to in the following text as the Prison Census) and the most recent Directory of Adult and Juvenile Correctional Departments, Institutions, Agencies, and Probation and Parole Authorities available from the American Correctional Association (ACA).
The most recent Prison Census was conducted in 2005. The facility universe for that census was developed from the Census of State and Federal Adult Correctional Facilities conducted in 2000. The facility universe for the PIAAC supplement is consistent with the Prison Census. As defined for the Prison Census, the target population includes the following types of state and federal adult correctional facilities: prisons; prison farms; reception, diagnostic, and classification centers; facilities primarily for parole violators and other persons returned to custody; road camps, forestry and conservation camps; youthful offender facilities (except in California); vocational training facilities; drug and alcohol treatment facilities; and state-operated local detention facilities in Alaska, Connecticut, Delaware, Hawaii, Rhode Island, and Vermont. Facilities were included in the enumeration if they (1) were staffed with federal, state, local, or private employees; (2) held inmates primarily for state or federal authorities; (3) were physically, functionally, and administratively separate from other facilities; and (4) were operational on December 30, 2005.
The 2005 Prison Census excluded the following types of institutions:
Private facilities not primarily for state or federal inmates
Military facilities
Immigration and Customs Enforcement (ICE) facilities
Bureau of Indian Affairs facilities
Facilities operated by or for local government, including those housing state prisoners
Facilities operated by the United States Marshals Service
Hospital wings and wards reserved for state prisoners
Facilities that hold only juveniles
Even though they contain inmates up to age 21, juvenile facilities will be excluded from the PIAAC National Supplement prison sample for two reasons: (1) to remain consistent with the facilities listed in the 2005 Prison Census and (2) to promote cost efficiency because it will not be cost effective to visit these facilities to sample the small number of inmates 16 years of age and older.
The 2012 ACA directory contains an updated list of adult and juvenile state correctional departments, institutions, programs, and probation and parole/aftercare services. The directory also includes updated inmate population figures, security level, and gender of the inmates, which were all helpful for sample design purposes.
The Prison Census list of facilities will be compared with the ACA directory list to arrive at a sampling frame of prisons eligible for the study. After comparing the ACA and Prison Census information, some cases may require clarification as to their eligibility status, such as certain facilities in Illinois. Before sample selection, further work may be necessary to separate work camps, annexes, satellites, and boot camps from their main facilities.
After the prison sampling frame is complete, prisons will be stratified by whether the facility houses female inmates to ensure that analyses may be performed by gender. Within strata, the facilities will be further ordered by the following characteristics: census region, security level (supermaximum/maximum, medium, minimum, or other), type (federal, state, or private), and the number of inmates in the facility.
The second-stage units will consist of inmates selected within a sampled prison. Inmates will be selected with a probability inversely proportional to the prison’s population size so that the product of the first- and second-stage selection probabilities will be constant. While this sample design is intended to provide a constant overall probability of selection across all inmates, in practice, the number of sampled inmates will vary within prisons because of differences between the anticipated and actual sizes of the inmate populations and also because of constraints on the sample size per prison.
Inmate sampling frames will be created by interviewers at the time they visit the prisons. The frame will consist of all inmates occupying a bed the night before inmate sampling is conducted.
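The two-stage prison design described above can be illustrated with the following sketch, in which prisons are drawn systematically with probability proportional to the measure of size from the sorted, stratified frame, and a within-prison sampling rate is then set inversely proportional to the prison's first-stage probability so that the overall inmate probability is approximately constant. Frame fields and the handling of very large facilities are simplified assumptions for illustration only.

# Illustrative two-stage prison/inmate sampling sketch.
import random

def pps_systematic(prisons, n_sample, rng=None):
    """prisons: list of dicts with an 'mos' (measure of size) key, already sorted
    by stratum, census region, security level, type, and size. Returns the
    selected prisons with their first-stage selection probabilities attached.
    Very large facilities may receive multiple hits; in practice such certainty
    units would be handled separately."""
    rng = rng or random.Random()
    total_mos = sum(p["mos"] for p in prisons)
    interval = total_mos / n_sample
    next_hit = rng.uniform(0, interval)
    picks, cum = [], 0.0
    for p in prisons:
        cum += p["mos"]
        while next_hit < cum and len(picks) < n_sample:
            prob = min(1.0, n_sample * p["mos"] / total_mos)   # first-stage probability
            picks.append({**p, "pi1": prob})
            next_hit += interval
    return picks

def within_prison_rate(prison, overall_rate):
    """Second-stage rate chosen so that pi1 * pi2 is roughly the constant overall rate."""
    return min(1.0, overall_rate / prison["pi1"])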
Estimation
For the PIAAC National Supplement, sampling weights will be produced to facilitate the estimation of the target population parameters. Replicate weights will be computed to facilitate variance estimation, and will capture the variation due to the sample design and selection, as well as weighting adjustments.
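As an illustration of how the replicate weights support variance estimation, the following sketch shows a generic replication variance estimator. The National Supplement's actual replication scheme (number of replicates, replicate factors, and weighting adjustments) will follow the PIAAC procedures and is not specified here; the jackknife-style form below is only one common choice.

# Hedged illustration of replicate-based variance estimation.

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def replicate_variance(full_estimate, replicate_estimates, factor=1.0):
    """Generic replication variance: v = factor * sum_r (theta_r - theta)^2.
    For a paired jackknife the factor is typically 1.0; other schemes
    (e.g., Fay's BRR) use different factors."""
    return factor * sum((t_r - full_estimate) ** 2 for t_r in replicate_estimates)

# Usage sketch: theta from the full-sample weights, theta_r from each replicate weight set.
# theta = weighted_mean(values, full_weights)
# var   = replicate_variance(theta, [weighted_mean(values, w) for w in replicate_weight_sets])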
The estimation procedures for the PIAAC Main Study data were prescribed by, and were the responsibility of, the international sponsoring agency; however, the United States has reviewed and agrees with these procedures. For the PIAAC National Supplement we will comply with these same procedures and policies by delivering masked data (note that a disclosure analysis will be conducted prior to submitting the data to the international contractor so as to comply with current federal law) and documentation of sampling and weighting variables. All data delivered will be devoid of any information that could lead to the identification of individuals.
In order to meet the PIAAC response rate goals, NCES will rely on procedures and approaches that have been used successfully in past household studies. Building good response rates begins with hiring field staff with the experience and skills that will make them successful in convincing people to cooperate, and training them not only to administer the instruments and follow the study procedures, but also to convince respondents to participate.
Maximizing Response Rates – Household-based Sample
NCES views gaining respondent cooperation as an integral part of a successful data collection effort and will invest the resources necessary to ensure that the procedures are well developed and implemented. We will use an advance contact strategy that has been successfully employed on many large-scale, in-person household studies. An advance letter will be mailed to all households selected for the household-based sample in advance of the data collector’s initial visit. This letter will inform potential respondents of NCES authorizing legislation; the purposes for which the PIAAC data are needed; uses that may be made of the data; and the methods of reporting the data to protect privacy. In addition, an informative brochure (provided in Appendix C) will be given to sampled participants when the interviewer visits the sampled household. All project materials will include the study’s web site address and a toll-free telephone number for respondents to obtain additional information about the study. The materials will also mention the respondent incentive and will include the study logo for legitimacy purposes. It is very important for the data collector to establish legitimacy at the door, which can be accomplished by the use of a strong introductory statement during which the data collector shows their ID badge and a copy of the advance materials.
In addition to the advance contact strategy above, we will add a pre-field activity, not used during the PIAAC Main Study, to identify screening challenges and solutions prior to the start of data collection. Dwelling units (DUs) selected for the PIAAC National Supplement will be drawn from the PIAAC Main Study PSUs and DU listings. Well in advance of the start of interviewing, Westat home office staff will review all information available from the PIAAC Main Study about the DUs selected for the National Supplement. This will include a review of: (1) the lister notes on the Segment Profile Forms in all PIAAC segments and (2) information captured in the PIAAC Main Study Survey Management System (SMS) entered by interviewers as they worked the Main Study cases. From the former, we will obtain general information on segments with unusual characteristics, such as those including locked buildings, gated communities, high-income respondents, high-crime areas, or language minority populations. The SMS will provide us with the information recorded by the interviewers in the electronic record of contacts (EROCs). Together, this information will help us develop new strategies for dealing with challenging segments and DUs, as well as allow us to reuse strategies that worked during the PIAAC Main Study. For example, depending on the segment characteristics or SMS information, we will focus on hiring interviewers with the required experience and/or language skills, conduct an early screening effort via surface mail, obtain telephone numbers for selected addresses, and re-contact building staff in locked buildings who helped us gain access during the PIAAC Main Study.
Once data collection begins, effective contact patterns are another important component of achieving high response rates. Completion rates improve when data collectors attempt contact on different days of the week and at varying times of the day. Data collectors will make four well-timed attempts to contact a household before reviewing the case with the supervisor to identify another pattern of contact. These other contact strategies may include telephone calls, FedEx letters, or messages left with neighbors. We plan to staff each PSU with two data collectors. Having multiple data collectors in a PSU allows better matching between data collectors and respondents and provides coverage in case of data collector illness or unavailability. In carrying out efforts to achieve high response and participation rates, we propose to organize our data collection efforts using a phased approach that allows for refusal conversion.
Each data collector will receive a laptop computer loaded with the Interviewer Management System (IMS). This system allows data collectors to launch all CAPI instruments and permits tracking of their work and time. Data collectors will use the electronic record of call (EROC) feature of the IMS to collect information about each visit to a household that did not result in a completed interview. EROC information will include: contact date and time, contact result or disposition code, appointment information, and general data collector comments. The EROC data are very helpful in documenting the results of contact attempts for nonresponding households, and in helping to design a more directed and effective campaign to convert the nonresponding households. All nonresponse followup and refusal conversion efforts also will be tracked and documented in the IMS.
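For illustration, an EROC entry can be thought of as a simple record containing the fields listed above; the data structure below is hypothetical and does not represent the production IMS data model.

# Hypothetical illustration of the fields an EROC entry captures.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ErocEntry:
    case_id: str
    contact_datetime: datetime
    disposition_code: str                          # contact result (e.g., refusal, no one home)
    appointment_datetime: Optional[datetime] = None
    comments: str = ""                             # general data collector comments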
Whenever a refusal or breakoff is encountered, the data collector will complete an automated noninterview report (NIR) that captures information about the reason for refusal. Automated EROC and NIR information is made available to the supervisors via data transmission from the data collectors to the home office and subsequent transmission to the supervisors. Contact and refusal information will be collected, coded, and included in the biweekly data collection progress report. NCES believes that frequent, open communication among all levels of field staff is required for a successful data collection effort. Supervisors will primarily use email for day-to-day communication with their staff. Scheduled weekly conference calls will also be used at all levels, and all supervisory staff will be available daily via telephone and email for questions or other issues that arise. Other activities that will be considered to increase response rates are:
Enhance interviewer training on screening. Dedicate more training time, in the home study package and during initial interviewer training, to the importance of obtaining high screener completion rates and to tips on completing screeners. Continue to focus on this throughout data collection via supervisor/interviewer conference calls. Finally, hold special conference call training sessions, as necessary during data collection, to focus on this activity.
Design an interviewer incentive program that includes a reward for closing screeners. For most of the PIAAC Main Study, Westat had an interviewer incentive program that rewarded completed cases. Towards the end of the PIAAC Main Study, we added an interviewer incentive for screening that proved very successful in closing more difficult screeners, including those unlikely to yield eligible sample persons. For the National Supplement, we plan to have an incentive program for both screening and interviewing/assessment work from the first day of field work.
Maximizing Response Rates – Prison Sample
The permission and cooperation of federal, state, and correctional facility officials will be required before sampling and interviewing within prisons can begin. Representatives from the Bureau of Justice Statistics and from the Office of Vocational and Adult Education within the U.S. Department of Education will be recruited to provide assistance in gaining cooperation from federal and state correctional agency officials. Letters of endorsement will be obtained from the Correctional Education Association and the American Correctional Association.
To gain cooperation at the sampled facilities, a multilevel approach will be implemented. Initially, letters will be mailed to officials at the Federal Bureau of Prisons and the correctional agencies of all states in which prisons have been selected for the study; these letters will explain the study and ask for permission to contact the selected facilities within the agency’s jurisdiction. The letters will be followed up with telephone calls to answer questions, secure cooperation, and determine prison contact procedures. Based on our experience with NAAL in 2003, some prisons require approval of the instruments and procedures by individual institutional review boards (IRBs). If prison IRB approval is required, materials included in this OMB submission and the related appendixes will be provided to the IRB as needed in response to its questions.
Once approval is received at the state or federal level, we will request that the approving official inform the warden at each sampled facility that the facility has been selected and urge the facility to participate. The data collection contractor will then contact the facility, and the contractor’s prison negotiator will provide additional information about the study and describe the sample selection process. The warden will be asked to designate a prison official to serve as coordinator for the study and to work out details, such as the interviewing procedures within the facility. Prison coordinators will be asked to arrange a secure, private room for each interview to ensure confidentiality during the interview. To minimize misinformation and deter refusals, facilities will be requested to “call out” selected inmates without providing an explanation of the study. The trained PIAAC interviewers will be responsible for introducing the study and gaining inmate cooperation.
Burden estimates for the prison-related activities outlined above are included in Part A, section A.12 and Table 1 of this submission. Letters, scripts, and the brochure used to recruit and gain permission of prison officials, wardens, and inmates are included in Appendix C.
As outlined above for the household-based sample, interviewers will follow similar procedures to document the result of each case using the Interviewer Management System and EROCs. In addition, after each day of interviewing, interviewers will contact their supervisor to discuss any special issues or concerns about the facility or inmate interview process.
The same procedures, instruments, and assessments that were used for the PIAAC Main Study will also be used to conduct the PIAAC National Supplement. To test the minor changes required to the interviewing instrumentation, a cognitive test will be conducted with a small number of volunteers recruited as a purposive sample to ensure the implemented changes work as designed.
The following are responsible for the statistical design of PIAAC:
Leyla Mohadjer, PIAAC Consortium/Westat; and
Kentaro Yamamoto, PIAAC Consortium/Educational Testing Service.
Westat will be the contractor responsible for sampling activities:
Leyla Mohadjer, Vice President; and
Tom Krenzke, Associate Director.
Analysis and reporting will be performed by:
Kentaro Yamamoto, Educational Testing Service.
1 Age is determined during the screener questionnaire.
2 The PSUs were stratified by the following: state-level small area estimates (SAE) of the percentage of the population lacking basic prose literacy skills (from the 2003 National Assessment of Adult Literacy), whether the PSU was part of a metropolitan area, race/ethnicity, poverty, English speaking ability, and education attainment.