Response to OMB comments


Program for the International Assessment of Adult Competencies (PIAAC) 2010 Field Test and 2011-2012 Main Study Data Collection


OMB: 1850-0870


Memorandum United States Department of Education

Institute of Education Sciences

National Center for Education Statistics



DATE: May 14, 2010

TO: Shelly Martinez, OMB

FROM: Eugene Owen and Stephen Provasnik, NCES

THROUGH: Kashka Kubzdela, NCES

RE: Responses to 5-12-10 and 5-20-10 OMB Passbacks



5-12-10 Passback


Below each of the questions from the OMB passback are our responses. In addition, at the end of this document is an appendix (referenced in the responses).


  1. Incentives: We have studied closely your rationale in SS A9 for a proposed $50 incentive.  Interestingly, the surveys cited as evidence for the proposed incentive -- the NHIS, MEPS and NSDUH -- offer $0, $0 and $30 respectively for incentives.  Other major household surveys with similar or higher burden, such as the CPS, Consumer Expenditure, SIPP, and NHES offer between $0 and $40 for incentives.  The higher amounts of these are all at the experimental stages, meaning that OMB has not approved them for use in production activities at this time.  Therefore, we do not see a justification for using an amount above $35 at this time.  Please adjust the amount in the SS and letters accordingly and resubmit them.


RESPONSE: As noted in Section A.9 of our OMB package, many in-person household surveys have found that incentives play an important role in improving response rates, especially in surveys that place a greater burden on respondents. Furthermore, research has shown that incentive amounts should be commensurate with the burden level to effectively improve response rates. Unlike most other national household surveys, PIAAC administers a literacy assessment in addition to a full interview. Based on our research and experience with past literacy surveys, the proposed $50 incentive reflects the added burden of a longer interview/assessment than in past literacy surveys (from 1.5 hours to 2 hours), an inflation adjustment for the years that have elapsed since the 2003 ALL and NAAL, and the increased complexity posed by PIAAC's computer-based assessment.


In order to demonstrate more formally the need for a larger incentive, we propose implementing an incentive experiment during the field test. Ideally such an experiment would include the two amounts in question, $35 and $50. However, we are concerned that the difference between $35 and $50 is not large enough to detect statistically, given the small sample size we have for the field test. To compensate for the small sample size, we propose increasing the upper incentive amount to $60 for this experiment. This will allow us to statistically detect a five percentage point increase in response rates (see the explanation below). We will then use a modeling approach to predict the results for a $50 incentive.


The general design of the experiment will involve pairing field test PSUs with respect to the following characteristics using current county-level estimates from the Census Bureau: the percentage of persons with a college degree, the percentage of persons below the poverty line, and minority distributions. One of the two incentive payments ($35 and $60) will be assigned randomly to each PSU within the matched pair. The design closely follows a “randomized complete block design,” a variation of a classical experimental design. In such a design, comparisons are made within each “block.” In the incentive experiment, a “block” refers to a pair of PSUs, and “randomization” refers to how the incentive is assigned within each pair.
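The pairing-and-randomization scheme described above amounts to a randomized complete block design with PSU pairs as blocks. A minimal sketch follows; the PSU names and seed are hypothetical illustrations, not values from the study:

```python
import random

# Hypothetical matched pairs of field test PSUs. In the actual design,
# pairs are matched on county-level Census estimates of college
# attainment, poverty, and minority distributions.
matched_pairs = [("PSU_01", "PSU_02"), ("PSU_03", "PSU_04"), ("PSU_05", "PSU_06")]

def assign_incentives(pairs, amounts=(35, 60), seed=2010):
    """Randomly assign one incentive amount to each PSU within a pair,
    so every block (pair) contains both treatment levels exactly once."""
    rng = random.Random(seed)
    assignment = {}
    for psu_a, psu_b in pairs:
        amt_a, amt_b = rng.sample(list(amounts), 2)  # randomize within the block
        assignment[psu_a] = amt_a
        assignment[psu_b] = amt_b
    return assignment

assignment = assign_incentives(matched_pairs)
```

Because both incentive levels appear in every pair, the treatment comparison is made within blocks, which removes between-pair differences (education, poverty, minority composition) from the treatment contrast.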


A power analysis was conducted to determine whether the currently planned sample size is sufficient to detect a five percentage point difference in response rates. The analysis assumed 80% power; that is, the probability of rejecting the null hypothesis of no incentive effect on the response rate, given that it is truly false, is 0.80. The resulting sample size is estimated under the assumption of simple random sampling. In effect, we are assuming that the increase in variance due to the clustering of dwelling units within segments, and segments within PSUs, is offset by the predictive power gained from the logistic regression model to be used in the analysis (logistic regression models will be fit to determine whether the incentive amount has a significant impact on response propensity). The results show that a sample size of 1,541 is needed in each incentive group (3,082 overall) to detect, at the 0.05 significance level, a five percentage point difference in response rates between the two incentive groups. Given the total sample of dwelling units (4,478), the expected screener eligibility rate (85%), and the expected dwelling unit occupancy rate (85.8%), the number of eligible dwelling units is expected to be 3,266, slightly larger than the required sample size.
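The arithmetic above can be approximated with the standard normal-approximation formula for comparing two proportions. The baseline response rate is not stated in the memo, so the sketch below assumes a rate near 50 percent (the conservative case, where the required sample is largest); the result is therefore close to, but not exactly, the memo's 1,541. The eligible-DU calculation uses only figures stated in the memo.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size to detect p2 - p1 with a two-sided
    normal-approximation test of two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # 0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Five percentage point difference from an assumed 50% baseline
n = n_per_group(0.50, 0.55)      # roughly 1,560, near the memo's 1,541

# Eligible dwelling units implied by the memo's own figures:
# 4,478 sampled DUs x 85.8% occupancy x 85% screener eligibility
eligible = 4478 * 0.858 * 0.85   # about 3,266
```

The eligible-DU product reproduces the memo's 3,266 exactly, confirming that the expected eligible sample exceeds the required 3,082.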



  2. What are the cost and quality implications in the full scale study of having to rely on the Census 2000 Summary File 1 block data for the PSU segments versus a more current file?


RESPONSE: Our inability to use updated Census data for segment formation and selection will affect the PIAAC sample in terms of both survey cost and the quality of the resulting estimates. At the time of segment selection, the Census 2000 Summary File 1 block data are likely to be unreliable because of growth from new construction and declines resulting from demolition or housing vacancies. Segment measures of size based on these data may therefore differ considerably from what is found during the dwelling unit (DU) frame-building process, producing sample sizes quite different from those anticipated. Further, the measures of size will have increased variability across segments.


Large variations in segment sizes result in large variations in the numbers of ultimate sampling units selected in each segment. This variation decreases interviewer efficiency and reduces the precision of the estimates. Segments found to have many more DUs than anticipated will yield large numbers of sampled persons and highly clustered samples within the PSUs, and many more resources will be required to collect data in them. In such cases it may be necessary to retain only a subsample of DUs from the segment, but doing so deviates from the self-weighting sample design, increasing the variation in the ultimate sample weights. This in turn reduces the precision of the final estimates.


There are methods that may be employed to update the segment measures of size before segment selection, including creating or purchasing updated intercensal estimates for sub-county areas. Estimates may be created using building permit data, digital images of the area, or field staff canvassing the area and counting the number of visible housing units. Address lists from the United States Postal Service are available for purchase, as are other sub-county estimates. All of these methods will increase the cost of the survey, as they require resources that would not be needed if the 2010 Census Summary File 1 block data were available. In addition to the increased monetary costs, the nonmonetary costs described above remain if the updated estimates fail to accurately reflect the segment measures of size.



  3. Questionnaires:

Please explain what cognitive testing of U.S. respondents has been conducted with the background questionnaire.  If none, we want to have a discussion about doing some quickly before PIAAC moves forward.


RESPONSE: Cognitive testing of the background questionnaire (BQ) was conducted for PIAAC in selected countries around the world, including in the United States, between August and October 2008 by the international consortium’s contractor GESIS. The goal of the testing was to identify potential problems with questions, provide insight into how the respondents understand questions or terms used in questions, and improve the quality of questions and their validity. Based on the results of that cognitive testing, many of the BQ questions were either revised or dropped by the international consortium.


      a. Specifically, we are concerned that certain response options are not written in familiar U.S. English. For example, what is the practical difference on page B-61 between “A non-profit organization” and “The private not-for-profit sector?” Or, on page B-92, what is the meaning of “I was made redundant or took voluntary redundancy?”


RESPONSE: Although the original international version of the PIAAC BQ was written in English, NCES requested adaptations or translations for many questions so their phrasing would conform to the conventions of standard American English. Not all requested adaptations, however, were allowed by the consortium. Appendix A contains a list of U.S. requested adaptations. In addition, we would be happy to share the entire pre-final US_PIAAC BQ spreadsheet, which includes comments by the consortium. We have not included it in this response only because it is a massive file.


You will note that the list in appendix A contains no requested adaptation for the response “I was made redundant or took voluntary redundancy.” The United States initially requested that this response be changed to “My job was abolished,” but because this phrase appears on a hand card (not in a question in the BQ), it was deleted from the requested adaptations to questions. It was supposed to be submitted in the request for hand card adaptations, but we overlooked this item when we compiled our requested set of adaptations to responses. We are requesting that this response be revised for the main study BQ.


In regard to the question on page B-61, we agree with you. We see no problem with collapsing the categories “non-profit organization” and “private not-for-profit sector” into a single category, as this item is a U.S. item rather than an international item. The change will be made in the background questionnaire for the main study.


      b. More generally, there appear to be many departures from the way similar topics are asked in well-established U.S. surveys (e.g., educational attainment, health insurance), so it is not clear that these questions are valid and reliable in the U.S. context.  Please provide evidence to reassure us.


RESPONSE: In regard to educational attainment, NCES adopted standard questions used by NCES surveys, and we consulted with the NHES and postsecondary teams. We incorporated all of their recommendations in a concerted effort to maintain consistent U.S. data on this topic. Note that since we submitted our OMB package, we have learned from the postsecondary team that response category 7 in the attainment items should say “certificate” (NOT “certification”), and we had this correction made before the window closed for changes to the U.S. version of the BQ. (See revised Appendix B, which reflects this change since the original submission of the OMB package.) We have also learned that the postsecondary team believes that category 6 should be dropped; however, we learned this too late to change the field test version of the U.S. BQ. We will have this deletion made in the main study version.


As for the items related to health insurance, these questions were included in the 2003 NAAL BQ and we included them to be able to collect comparable data.


    c. Educational attainment questions:

      i. What is the basis for the series of educational attainment questions?  They appear to attempt to include in the “hierarchy” certificates and certifications, which has been widely rejected as an approach by BLS, Census and the postsecondary group at NCES.


RESPONSE: We consulted extensively with the postsecondary group at NCES and wrote the educational attainment questions to match the categories that they specifically requested, including their new hierarchy of certificates and certifications.


      ii. Why isn’t there a response option for “Grades 10-12”?


RESPONSE: The NCES postsecondary group and outside experts we consulted with recommended that we drop the category “grades 10-12,” as it is not an internationally comparable category.


      iii. How was the list of countries chosen?


RESPONSE: The countries offered as responses to the question “In which country did you gain this [educational] qualification?” were selected from among the countries with the highest numbers of legal immigrants coming to the United States from 1999 through 2009. The top four countries were Mexico, China, India, and the Philippines. Colombia (the top South American country) was selected to represent South America; Russia to represent Europe. The data used to compile the list of countries come from the Department of Homeland Security, Office of Immigration Statistics, Yearbook of Immigration Statistics (http://www.dhs.gov/files/statistics/publications/yearbook.shtm), as organized by the Migration Policy Institute (http://www.migrationinformation.org/DataHub/countrydata/country_us.cfm).

Average annual inflow of new legal permanent residents to the United States, by country of birth, Fiscal Years 1999 to 2009

 1. Mexico                         170,456
 2. China (excluding Hong Kong)     60,869
 3. India                           60,487
 4. Philippines                     54,759
 5. Vietnam                         29,307
 6. Cuba                            28,891
 7. Dominican Republic              28,231
 8. El Salvador                     24,650
 9. Colombia                        23,024
10. Haiti                           20,911
11. Korea                           20,715
12. Jamaica                         17,412
13. Canada                          16,336
14. Guatemala                       15,222
15. Pakistan                        15,153
16. Ukraine                         15,130
17. Russian Federation              14,751
18. United Kingdom                  14,690
19. Peru                            13,583
20. Poland                          11,640



    d. What is the purpose of the set of questions starting on B-145 regarding trust, etc.?


RESPONSE: These questions come from Dutch researchers at the University of Maastricht (in charge of the development of the BQ) who lobbied successfully for their inclusion in the PIAAC background questionnaire (BQ). The United States opposed this set of questions and is on record as recommending that they be dropped. We hope that the results from the field test will lead to their deletion from the BQ for the main study.


    e. What is the purpose of asking educational attainment, employment, etc. of a spouse/partner?  Also, is this asked even if the other person is in sample?


RESPONSE: These questions come from Dutch researchers at the University of Maastricht (in charge of the development of the BQ) who have justified them on the grounds that “Research has clearly pointed out that an individual’s educational attainment and occupational outcomes are closely related to the educational attainment of the partner” (quoted from page 29 of the PIAAC list of BQ concepts).


Yes, this question will be asked even if the spouse/partner is also in the sample.


    f. Please clarify that the purpose of asking for phone number in the screener is, as stated, solely for quality assurance purposes.


RESPONSE: The purpose of asking for the telephone number in the screener is to contact households to validate cases. Conducting validations over the telephone saves time and resources, making it important to collect this information from households. Currently the screener notes parenthetically “(in case my office wants to check my work),” and interviewers are trained to read this note aloud. If this statement is not sufficient, we can revise it. However, we have tried to keep the reason worded simply (rather than use technical language like “for quality assurance purposes”).


    g. Why does the background questionnaire ask about race and ethnicity when they also are asked on the screener?


RESPONSE: The screener asks about race and ethnicity because those answers are used for weighting all sampled respondents, including those who do not complete the interview and who, therefore, are not asked the background questionnaire. The BQ asks about race and ethnicity again because we want the sampled respondent to self-report on these two measures. (Keep in mind that the person who responds to the screener may not be the sampled person in the household, and self-reporting is more accurate than proxy reporting.)


5-20-10 Passback


  1. Incentives – we are not willing to approve experimenting with $60.  We therefore recommend either revisiting the experiment design (e.g., random assignment at the HU level versus at the PSU level, etc.) in order to obtain more power or abandoning the experiment.  If the latter, we will approve a $5 increase over NAAL, given the passage of time.  We want to emphasize that this still puts PIAAC at the very upper end of any US household survey incentive and therefore we cannot approve anything higher.


RESPONSE: We have revisited the design and the power analysis for the incentive experiment during the PIAAC field test. We propose that the revised design randomize at the segment level (clusters within PSUs) rather than the HU level. Conducting the experiment at the HU level would increase the chance of errors in administering the incentives and runs the risk of spreading information about the different incentive amounts among respondents in close neighborhoods and in households with two sampled adults. A review of the power analysis has shown that we can relax the associated parameters (specifically, changing the value of alpha from 0.05 to 0.1) to allow the detection of a five percentage point difference in response rates between the $35 and $50 incentive levels. We therefore propose making this change as well.
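The effect of relaxing alpha from 0.05 to 0.1 can be illustrated with the usual normal-approximation formula for comparing two proportions; the 50 percent baseline response rate below is an assumption for illustration, not a figure from the memo.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha, power=0.80):
    """Per-group sample size to detect p2 - p1 with a two-sided
    normal-approximation test of two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Assumed (illustrative) 50% baseline, five percentage point difference
n_strict = n_per_group(0.50, 0.55, alpha=0.05)   # roughly 1,560 per group
n_relaxed = n_per_group(0.50, 0.55, alpha=0.10)  # roughly 1,230 per group
```

Under these assumptions, relaxing alpha cuts the per-group requirement by about a fifth, bringing it comfortably within the roughly 1,633 eligible dwelling units available per group (half of the 3,266 expected eligible DUs cited earlier).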


  2. Related, we would like to know whether any other country is offering incentives in PIAAC and at what level.


RESPONSE:  To the best of our knowledge, 21 of the other 26 countries participating in PIAAC are offering a respondent incentive of some sort.


Seven countries are offering monetary or voucher incentives of varying amounts. The amounts are as follows (US dollar amount provided in parentheses):

Austria: 50 EUR (USD 62);

Czech Republic: 500CZK (USD 24);

Ireland: 40 EUR voucher (USD 49);

Korea: 100,000 KRW (USD 87);

Netherlands: 20 EUR (USD 25);

Sweden: 10 EUR (USD 12);

United Kingdom: 20 to 30 GBP (USD 29 to 43).


Three countries are conducting experiments with monetary and non-monetary incentives.  The amounts offered are as follows:

Denmark: 0/10 DKK (USD 0/2);

Germany: 25/50 EUR (USD 31/62);

Norway: 25/600/1000 NOK (USD 4/96/160).


Eleven countries are offering a non-monetary incentive (but we do not have details on what these are).



  3. We also would like to understand how close the cutoff for the PIAAC full-scale study is to the projected release date of the Census block data file, and whether NCES has talked with the Census Bureau about this matter.


RESPONSE: The block-level Census data needed for forming and selecting segments during the second stage of selection will be released in March 2011. This gives us only four months before the start of PIAAC data collection. The listing operation usually begins six months prior to data collection; therefore, there is not enough time to form and select the segments. A delay to the start of data collection is being discussed by the OECD, and NCES is awaiting a decision by the PIAAC Consortium, expected in fall 2010. If there is no delay, or the delay is not long enough to accommodate the use of the new Census data, NCES will consider contacting the Census Bureau to ask whether the data could be delivered earlier than scheduled.


  4. Please be advised that OMB does not plan to approve the “trust” questions for the full-scale study without substantial justification, which has not yet been provided.  Please be sure that the international sponsors understand OMB’s position.


RESPONSE:  We share OMB’s misgivings about these questions, but we cannot delete these items from the BQ field test instrument, as it has already been prepared by the international consortium and delivered to the US and all other participating countries.  Should these questions survive the winnowing of BQ items that will occur after the field test and appear on the full scale BQ, and should adequate justification not be forthcoming, we will notify OMB to discuss the situation.  In the meantime, we will continue to advocate dropping these items and we will inform the consortium of OMB’s position.


  5. We remain unconvinced of the utility of burdening individuals with questions about their spouses or partners if the spouse or partner is known to be in sample.  Unless there is a specific justification submitted, we will not approve asking these questions twice.


RESPONSE: We have contacted the International Consortium about this issue, and we concur with them that a situation in which both spouses in a household are sampled by PIAAC will be very rare; if it occurs, it will affect no more than a few cases. Moreover, given the few questions asked about one’s spouse, this is a very small burden. The alternative (not asking these questions twice in this special situation) would require major changes to the routing of the BQ, using a criterion that is independent of the content of the BQ. And because the screener is the responsibility of each country, information from the screener cannot be easily incorporated into the internationally programmed BQ routing path. Furthermore, for the data analysis to meet the goals of PIAAC, the International Consortium wants all countries to ask the entire set of questions of everyone. Having countries cherry-pick the questions they ask undermines the data analysis and the spirit of cooperation that undergirds and makes possible such an international assessment.


  6. Finally, we apologize for not raising this earlier, but please provide a justification for the battery of questions on health and health conditions.


RESPONSE: The health-related questions were taken from the 2003 National Assessment of Adult Literacy (NAAL) BQ. The decision to include the battery of questions on health and health conditions was based on the tremendous interest in health literacy, and in the questions covered in the 2003 NAAL BQ, among state and federal government health agencies as well as among health practitioners and researchers. In response to a request from the U.S. Department of Health and Human Services, NCES in 2006 analyzed the NAAL health data and published a comprehensive report on the health literacy of America’s adults (http://www.nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2006483), which devotes a whole chapter (chapter 3) to examining the relationship between health literacy and self-reported overall health, health insurance coverage, and sources of information about health issues. Based on these data, the U.S. Surgeon General convened a Workshop on Improving Health Literacy on September 7, 2006, where Grover Whitehurst, then-Director of the Institute of Education Sciences, presented results from the health literacy data.


In addition, secondary analyses of the NAAL health BQ data have been conducted by researchers more widely than analyses of any other NAAL BQ data. The following is a list of a few published studies that have drawn upon the NAAL data on health-related issues.



  • America's Health Literacy: Why We Need Accessible Health Information. An Issue Brief From the U.S. Department of Health and Human Services. 2008 (http://www.health.gov/communication/literacy/issuebrief/). This issue brief discusses health insurance and sources of health information.


  • The Contribution of Health Literacy to Disparities in Self-Rated Health Status and Preventive Health Behaviors in Older Adults (http://www.annfammed.org/cgi/content/full/7/3/204).  This article made use of the data on health status and preventive health measures.



  • Relationship of preventive health practices and health literacy: a national study (http://www.ncbi.nlm.nih.gov/pubmed/18067463).  This article used the data on preventive health measures, insurance and health status.


The American version of the PIAAC BQ thus included the most widely studied questions on health:



  • I_Q10bUSX1: health insurance status


  • I_R10bUSX2a to I_R10bUSX2h: sources of health information


  • I_R10bUSX3a to I_R10bUSX3a: preventive health measures


Gathering data on these questions in PIAAC will yield updated information on the variables above.

APPENDIX A.


PIAAC BQ V5.0 – National Adaptations


B: START Education and Training

  • Throughout this section, and the assessment, PIAAC uses the term qualification when referring to educational attainment. In the U.S., we typically use the terms degree or certificate or level of education, depending on the context. The national adaptation guidelines in PIAAC allow country-specific modifications to educational terminology wherever necessary. Some changes require simply substituting the word qualification for degree or certificate, and some items need to be reworded. Please make the word substitutions wherever they are necessary (e.g. B_Q03c1: When you stopped studying for this qualification, how old were you or what year was it?

When you stopped studying for this degree or certificate, how old were you or what year was it?). Listed below are the items that need to be reworded. Items in bold are the PIAAC questions that use the term qualification; our suggested changes are written in italics.


B_Q01a: Which of the following qualifications is the highest you have obtained?

What is the highest level of education you have completed?

B_Q02b: What is the level of the qualification you are currently studying for?

What type of degree or certificate are you currently studying for?

B_Q03a: Did you ever start studying for any formal qualification, but leave before completing it?

Did you ever begin a program of study for a degree or certificate, but leave before completing it?

B_Q03b: What was the level of the qualification you started studying for? If more than one, please report the one with the highest level.

What was the type of degree or certificate you began studying for but did not complete? If more than one, please report the one with the highest level.

B_Q05a: What was the level of this qualification?

What type of degree or certificate was this?

B_Q05c: Were the main reasons for studying for this qualification job related?

If PIAAC allows, we suggest replacing this question with item AD6 from the NHES in order to preserve trend data with the NHES assessment.

AD6. Did you take the (DEGREE/CERTIFICATE) program (in (MAJOR)) mainly for work-related reasons

or mainly for personal interest?

CRREAS1- WORK-RELATED........................................................................1

CRREAS3 PERSONAL INTEREST ................................................................2

BOTH EQUALLY.........................................................................3


B_Q11: Did an employer or prospective employer pay partly or totally for tuition or registration, exam fees, expenses for books or other costs associated with your study for this qualification?

Did an employer or prospective employer pay partly or fully for tuition or registration, exam fees, expenses for books or other costs associated with your participation in this degree program?

B_Q12a: During the last 12 months, have you participated in courses or private lessons?

INTERVIEWER:

1. Courses are typically subject oriented and taught by persons specialised in the field(s)

concerned. They can take the form of classroom instruction (sometimes in combination with

practice in real or simulated situations) or lectures.

Interviewer:

  1. Courses are typically subject oriented and taught by persons specialised in the field(s) concerned. They can take the form of classroom instruction (sometimes in combination with practice in real or simulated situations) or lectures, and are conducted face-to-face.


B_Q12b: How many of these activities did you participate in?


After this item, and after B_Q12d, B_Q12f, and B_Q12h, we suggest adding NHES items AG3 and AG4 in order to preserve trend data with the NHES assessment.


AG3. With your help, I’m going to make a list of the courses you took where there was an instructor.

(Again, not counting the programs we talked about earlier,) please tell me the name and

subject matter for each course you have taken in the past 12 months. [MAY RECORD UP TO 20

COURSES.]

FCNAME1/R- COURSE NAME _____________ SUBJECT ___________________

FCNAME20/R COURSE NAME _____________ SUBJECT ___________________

FCSUBJ1/R- COURSE NAME _____________ SUBJECT ___________________

FCSUBJ20/R COURSE NAME _____________ SUBJECT ___________________

COURSE NAME _____________ SUBJECT ___________________

COURSE NAME _____________ SUBJECT ___________________


For each course listed in AG3, ask AG4.


AG4. Did you take the (COURSE NAME) course mainly for work-related reasons or mainly for personal

interest?

FCREAS1/R- WORK-RELATED........................................................................1

FCREAS20/R PERSONAL INTEREST ................................................................2

BOTH EQUALLY.........................................................................3


B_Q12c: During the last 12 months, have you participated in courses conducted through open

or distance education?

During the last 12 months, have you participated in distance education courses?


C: Current status and Work history

  • PIAAC allows national adaptations for items C_Q04a-j. Our suggestions are listed below.

C_Q04a: In the four weeks ending last Sunday, did you do any of these things... get in contact with a public employment office to find work?

In the four weeks ending last Sunday, did you do any of these things... contact a state, county, or local government employment office to find work?

C_Q04b: get in contact with a private agency (temporary work agency, firm specialising in recruitment, etc.) to find work?

contact a private agency (temporary work agency, firm specializing in recruitment etc.) to find work?

C_Q04c: apply to employers directly?

apply directly for a job?

C_Q04d: ask among friends, relatives, unions, etc. to find work?

ask among friends, relatives, or at a union office to find work?

C_Q04e: place or answer job advertisements?

place or answer a job advertisement?

C_Q04f: study job advertisements?

look through job advertisements?

C_Q04g: take a recruitment test or examination or undergo an interview?

take a recruitment test or go for an interview?

C_Q04h: look for land, premises or equipment for work?

look for property or equipment for a business venture?

apply for permits, licences or financial resources for work? (NEW)

C_Q04j: do anything else to find work?

do anything else to be employed?

D: Current Work

  • If the consortium accepts, we suggest adding a category for private, not-for-profit organizations as part of our national adaptation. This change will be advantageous for all groups involved because PIAAC will still receive all the data needed for cross-national comparisons of public, private, and non-profit organizations, and the U.S. will have the option of further analyzing the data for not-for-profit organizations.


D_Q03: In which sector of the economy do you work? Is it .. 1 The private sector (for example a company) 2 The public sector (for example the local government or a state school) 3 A non-profit organisation (for example a charity, professional association or religious organisation)

In which sector of the economy do you work? 1. The private sector (for example a company), 2. The public sector (for example the local government or a state school) 3. A non-profit organization (for example a charity, professional association or religious organization) or 4. The private not-for-profit sector

D_Q09: What kind of employment contract do you have? Is that …

ADAPTATION INSTRUCTION: Countries should use the appropriate national wording of

response categories, e.g. permanent/temporary contract or contract of unlimited/limited

duration. Some countries may add the phrase ‘or probationary period’ to the specification

of indefinite contracts.

INTERVIEWER: Read categories to respondent

1 An indefinite contract (go to D_Q10)

2 A fixed term contract (go to D_Q10)

3 A temporary employment agency contract (go to D_Q10)

4 An apprenticeship or other training scheme (go to D_Q10)

5 No contract (go to D_Q10)

6 Other

DK (go to D_Q10)

RF (go to D_Q10)

As part of our national adaptation we would like to exclude this question from the assessment. We do not feel it will produce meaningful results in the United States.

D_Q12a: Still talking about your current job: If applying today, what would be the usual qualifications, if any, that someone would need to GET this type of job?

If someone were applying for your current job today, what level of education, if any, would that someone need to GET this type of job?

E: Last Job

  • For item E_Q03, please make the same changes that are suggested for item D_Q03.


F: Skills used at work

  • In this section, we wonder if we can change the wording of a few items to make them more colloquial and therefore easier for respondents to understand.

F_Q01b: cooperating or collaborating with co-workers?

working cooperatively or collaboratively with co-workers?

F_Q02e: advising people?

providing advice?

F_Q04a: persuading or influencing people?

working to persuade or influence people?

G: Skill Use Literacy, Numeracy and ICT at work

  • In this section we suggest changing undertake/undertook to do/did, as it is more colloquial. (e.g. G_R02: The following questions are about writing activities that you ^UndertookUndertake as part of your ^JobLastjob. Include electronic writing.

The following questions are about writing activities that you ^DoDid as part of your ^JobLastjob. Include electronic writing.)

H: Skill Use Literacy, Numeracy and ICT at work

  • Please make the same changes that are suggested for Section G.

J: Background information

  • PIAAC allows using national adaptations where appropriate in this section. So we suggest modifying item J_Q04c1 because immigration is a politically sensitive issue in the U.S. We also suggest modifying J_Q08 because the metric system is not typically used in the U.S.

J_Q04c1: At what age or in which year did you first immigrate to #CountryName?

When did you first come to the United States?

J_Q08: About how many books ^AreWere2 there in your home ^When? Do not include magazines, newspapers or schoolbooks. To give an estimation, one meter of shelving is about 40 books.

Please change the end of the statement to say, ‘one foot is about 10 books.’

J_Q09d: early retirement benefits?

As part of our national adaptation we would like to exclude this item from the assessment because we feel it is redundant with item J_Q09e, which asks about retirement benefits.


