
PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT 2012 (PISA:2012) Validation Study 2015 Field Test and Main Study



REQUEST FOR OMB CLEARANCE

OMB# 1850-0900 v.2



SUPPORTING STATEMENT PART B





Submitted by:


National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC



December 8, 2014

Revised July 22, 2015





Table of Contents

B. Collections of Information Employing Statistical Methods
B.1 Potential Respondent Universe and Sampling
B.2 Information Collection Procedures
B.3 Methods for Maximizing Response Rates
B.4 Tests of Procedures and Methods
B.5 Individuals Consulted on Study Design


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Potential Respondent Universe and Sampling

The sample for the main PISA Validation Study will consist of students from the national PISA 2012 sample who took PISA mathematics, reading, and science assessments and completed a Student Information Form providing their contact information. PISA 2012 recruitment materials, including materials for parents, stated that students supplying contact information may be contacted by NCES for a future study.

PISA assesses students nearing the "end of their compulsory school experience." For international comparability, the PISA target population is defined as students who are 15 years old and enrolled in grade 7 or higher. A range of exact birthdates is specified by the international coordinating committees based on the months in which the data will be collected; in general, students must be between the ages of 15 years and 3 completed months and 16 years and 2 completed months at the beginning of the test period. The universe for the selection of schools in the PISA 2012 administration was all types of schools in all states of the United States and the District of Columbia. Within sampled schools, students were selected for participation by drawing a random sample of 50 students from among the age-eligible students (42 of these students were assigned to the mathematics, science, and reading assessment, and 8 to financial literacy).

A total of 6,116 students were assessed in the national administration of PISA in 2012. Of these, approximately 5,810 students (95 percent) completed a Student Information Form; the remaining 306 did not. NCES will not pursue those 306 students as part of this survey; by not completing the form, they indicated that they do not wish to be contacted for future surveys. Among the students who completed the form, 1,081 were assessed in financial literacy, leaving 4,729 students to serve as the starting sample for the main PISA Validation Study. The students who participated in the financial literacy assessment will serve as the population from which field test respondents will be selected, and the students who participated only in the main PISA assessment will serve as the population from which the main Validation Study respondents will be selected.

B.2 Information Collection Procedures

1. Tracing

To date, the tracing activities described in the previous clearance package have located more than 90 percent of the 5,810 students from the national PISA 2012 sample who took the PISA mathematics, reading, and science assessments in fall 2012 and completed the Student Information Form providing their contact information. Locating 90 percent of those students a year later drew in part on address updates returned by a portion of the sample, but it also relied on intensive tracing, such as calling directory assistance, calling contacts provided by the respondent, and using publicly available Internet searches. In line with other studies that locate participants, such as the High School Longitudinal Study (HSLS) and the Beginning Postsecondary Students (BPS) Longitudinal Study, we found that 12 percent of the 5,810 students responded to the first follow-up request for address verification, and an additional 19 percent responded to a second hard copy mailing. A limited number of schools have been contacted to request assistance in locating students who could not be located by any other method.

For study planning purposes, a 10 percent loss of sample each year is expected, resulting in an estimated 3,447 students for the 2015 main study. The 1,081 students assessed in financial literacy were traced, and a subsample of 200 students was selected from this group for the field test. The remaining located students, that is, those who did not take the financial literacy assessment in PISA 2012, will serve as the sample for the main study.
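As a rough check on these planning figures, the sketch below reproduces the sample counts from sections B.1 and B.2, assuming that the 10 percent annual loss compounds over the three years between the fall 2012 assessment and the 2015 main study data collection.

```python
# Worked check of the sample planning figures (assumes three years of
# compounded 10 percent annual loss between fall 2012 and the 2015 main study).
assessed_2012 = 6116        # students assessed in the U.S. PISA 2012 administration
no_contact_form = 306       # did not complete the Student Information Form
financial_literacy = 1081   # completed the form but took financial literacy (field test pool)

starting_sample = assessed_2012 - no_contact_form - financial_literacy
print(starting_sample)        # 4729 students in the main study starting sample

projected_2015 = starting_sample * 0.9 ** 3
print(round(projected_2015))  # ~3447 students expected for the 2015 main study
```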

2. Nonresponse Bias Analysis

Nonresponse bias can occur when respondents and nonrespondents differ systematically. A nonresponse bias analysis will compare participating and nonparticipating students on key characteristics that are sensitive to nonresponse bias (e.g., age/grade and sex). The analysis will have two aspects: (1) bivariate analyses that test whether there are significant differences between the participating and nonparticipating groups on key characteristics, and (2) a logistic regression analysis in which the key characteristics predict participation and the significance of the regression coefficients is tested. Significant regression coefficients indicate possible nonresponse bias. We plan to conduct this analysis on two sets of characteristics: school-level characteristics from the PISA 2012 sampling frame and student-level characteristics collected during the PISA 2012 assessment. Since the nonrespondents will be among students who participated in PISA 2012, we will have a very rich set of characteristics for both respondents and nonrespondents.
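As an illustration of the second aspect, a minimal sketch of such a participation model is shown below. The file name and variable names (participated, age, sex, race_ethnicity, school_region) are hypothetical placeholders, not the actual PISA data layout.

```python
# Minimal sketch of the logistic regression nonresponse analysis.
# File and variable names are hypothetical, not the actual PISA layout.
import pandas as pd
import statsmodels.formula.api as smf

# One row per eligible sample member; participated = 1 if the student
# completed the follow-up assessment and questionnaire, 0 otherwise.
sample = pd.read_csv("pisa2012_followup_sample.csv")

# Key school- and student-level characteristics predict participation;
# statistically significant coefficients flag possible nonresponse bias.
model = smf.logit(
    "participated ~ age + C(sex) + C(race_ethnicity) + C(school_region)",
    data=sample,
).fit()

print(model.summary())
```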

3. Weighting

The PISA 2012 sample data file contains survey weights and replicate weights appropriate for the analysis of PISA 2012 data. However, these weights will not be fully appropriate for the analysis of the longitudinal data, since the follow-up data collection will introduce additional nonresponse. Therefore, a new set of weights that adjusts for this additional nonresponse will be derived.

Since the additional nonrespondents will be among students who participated in PISA 2012, we will have a rich set of data on their characteristics. This information can be used to make nonresponse adjustments that we expect to be quite effective in limiting any additional bias arising from nonresponse to the longitudinal component.

A calibration, or generalized regression, approach to the nonresponse adjustments will be used. Because we will have the PISA 2012 data for all of the students in the follow-up study, we can use such methods to adjust the PISA sampling weights to account for the additional nonresponse. In doing this, however, the replicate weighting procedures must include adjustments that account for the fact that results tabulated from the full PISA sample are themselves subject to sampling error; they are not population census figures but estimates obtained from the PISA sample. We will therefore implement weighting adjustments that match the respondent sample to the full PISA responding sample on a variety of student characteristics, such as gender, race/ethnicity, region, urban/rural status, grade, socioeconomic status, and PISA achievement. This will provide replicate weights that reflect the fact that the distributions of these characteristics have been obtained from the PISA sample and not from a census of 15-year-old students.
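The sketch below illustrates the basic idea with a simple raking (iterative proportional fitting) adjustment; the actual study will use a generalized regression (calibration) estimator, and the adjustment would be repeated for each set of replicate weights. Column names and control totals are hypothetical.

```python
# Simple raking-style calibration sketch (hypothetical column names).
# Control totals come from the full PISA 2012 responding sample, so they are
# themselves estimates, not census counts; repeating the adjustment for each
# replicate weight propagates that sampling error into the variance estimates.
import numpy as np
import pandas as pd

def rake(respondents, base_weight, margins, max_iter=50, tol=1e-8):
    """Adjust base weights so weighted category totals match control totals.

    margins maps a categorical column (e.g., 'sex', 'region') to a Series of
    control totals indexed by category, estimated from the full PISA sample.
    """
    w = respondents[base_weight].astype(float).copy()
    for _ in range(max_iter):
        max_change = 0.0
        for col, targets in margins.items():
            current = w.groupby(respondents[col]).sum()
            factors = (targets / current).reindex(respondents[col]).to_numpy()
            w *= factors
            max_change = max(max_change, float(np.abs(factors - 1.0).max()))
        if max_change < tol:
            break
    return w

# Usage (hypothetical): respondent-only file with PISA 2012 final weights.
# resp["adj_weight"] = rake(resp, "pisa_final_weight",
#                           {"sex": sex_totals, "region": region_totals})
```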

B.3 Methods for Maximizing Response Rates

1. Tracing and Tracking

Under the previous clearance, approved in November 2013, we completed the tracing activity to locate the respondents in the PISA Validation Study sample (OMB# 1850-0900 v.1). To date, more than 90 percent of the sample has been located using the multi-modal tracing approach described in that clearance. The challenge at this stage of the study is to maintain contact with potential respondents.

Respondent contact is being maintained using well-established tracking procedures until the main study data collection period begins in fall 2015. We are soliciting contact updates twice a year (OMB# 1850-0900 v.1). Participants have been able to update their contact information through a secure participant website since November 2014. Birthday cards are also being sent to all sampled respondents in their birthday month, and the envelopes are stamped “Address Correction Requested.” These multiple mailings serve to update address information for the study and to identify cases that need tracing and locating activities before the trail grows cold.

2. Recruitment

Though tracing and tracking procedures will continue throughout the study, special emphasis and procedures will apply to the period immediately preceding and during data collection. This period can be thought of as the functional recruitment period: the point at which respondents will be formally asked to participate and complete the survey.

The recruitment phase for the follow-up study is centered on three specific contacts with potential respondents. The first contact is a formal, hard copy letter from NCES restating the purpose and importance of the study and formally asking sample members to participate. This mailing also contains an unloaded cash card. The second contact will remind respondents that the survey is upcoming and explain how to obtain a replacement cash card if the first one was lost or never received, giving the contact both a reminder function and a practical purpose. This contact will be made primarily via email and secondarily by hard copy for those potential respondents without a functioning email address. The final contact will occur immediately before the survey opens; it will ask potential respondents to complete the survey and will provide the logon credentials needed to access the online survey. Again, this contact will be made primarily via email and secondarily via hard copy. Appendix A provides examples of these contacts.

Another task specific to recruitment will be refusal or nonresponse conversion. The voluntary nature of participation in this study is protected, and participants may drop out of the study for any reason at any time. Though tracing and tracking procedures will minimize attrition to the greatest degree possible prior to data collection, getting respondents to actually complete the assessment and questionnaire will pose unique challenges. For example, participants may begin but not complete the instruments. To address this challenge, we will use a monitoring system so that timely follow-ups and interventions are possible. Email contact and telephone contact by trained field staff will be used to prompt participants to respond.

A different foreseeable challenge is motivating respondents to begin the instruments. Based on experience in PISA and other volunteer studies, a monetary incentive is effective in providing this motivation. As in the OECD’s Program for the International Assessment of Adult Competencies (PIAAC), a $50 incentive (cash card) will be offered to respondents for completing the assessment and questionnaire (see Part A for more detail). Respondents will receive these cards with an official letter from NCES formally asking them to participate and complete the survey, but the cards will be loaded with the incentive only after the respondent has completed the assessment and questionnaire.

3. Data Collection Procedures

The web-based version of the PIAAC data collection instruments, called the Education and Skills Online (ESO) assessment and questionnaire and described in more detail in Part A, will be administered first, in September 2015, to a field test sample of 200 students drawn from the students who took the financial literacy assessment, and again during a 12-week period in winter 2015/2016 to the larger main study sample. Study participants will be invited to log on to a secure website during the data collection window to complete the assessment and questionnaire. Email and telephone contact will be used to prompt respondents to log on to take the assessment and questionnaire or to remind respondents to complete the instruments. Telephone interviewers will also be able to facilitate the completion of the questionnaire. Data collection quality will be maintained through standardized processes: evaluating data as they are collected, generating and reviewing weekly sample production reports, and applying the processes Westat uses to build and maintain quality.

The field test data collection window is planned to be 6 weeks. For the field test, we will use a predetermined contact interval and evaluate its effectiveness in converting nonrespondents and partial completes. Beginning in the third week after the launch of the survey, we will contact those who have not completed the survey, by email/mail followed by telephone, to determine whether they have received information about the survey, provide any necessary information, and urge them to complete the survey. If the person is not available to discuss the survey, follow-up contacts will be made. Respondents who have only partially completed the survey will be prompted via email/mail to complete the survey and informed that completion is necessary to receive the loaded cash card. This second phase of data collection is planned to operate for 3 weeks in the field test, with weekly review of status reports and follow-up with cases. After a respondent has completed the survey, they will be sent a letter thanking them for their participation, and their cash card will be loaded. Appendix A provides email/letter text and call scripts for these contacts.

The main study will follow a similar contact schedule, in which nonrespondents will be prompted beginning in the third week of the survey. However, the data collection window will be 12 weeks, allowing a longer effort to reach nonrespondents and partial completes. The results of the follow-up efforts in the field test will be reviewed to determine whether changes to the planned schedule are necessary for the main study.

B.4 Tests of Procedures and Methods

The ESO assessment and questionnaire were piloted by the OECD in 2013 and will, therefore, have established psychometric properties when they are used in this study. The non-cognitive modules were developed from items available in the public domain as well as licensed items used by permission from third parties in the ESO field test. Field test sample sizes ranged from about 600 in the United States (Spanish) and Canada (French) samples to about 1,500 in the United States (English) sample; they varied by country and language according to the availability of item parameter information from PIAAC’s existing language versions. Two countries, the United States and Canada, had two-language samples: English and Spanish in the U.S. and English and French in Canada. In the other five countries (Czech Republic, Ireland, Italy, Japan, and Spain), only one language sample was drawn. The ESO field test was administered from the first half of 2013 to the first half of 2014. The primary purpose of the PISA Young Adult Follow-up Study field test is to test data collection procedures in anticipation of the main study data collection. We intend to administer the field test to 200 students over a four-week period in summer of 2015.

B.5 Individuals Consulted on Study Design

Many people have been involved in the design of PISA and PIAAC. Some of the lead people are listed in section A8. Overall direction for the PISA Validation Study is provided by Dr. Dana Kelly of NCES, U.S. Department of Education.
