
2020 CPS School Enrollment

Cognitive Testing



Volume I




OMB# 1850-0803 v.269




National Center for Education Statistics (NCES)



May 2020




Contents


Attachment 1 – Communication Materials and Consent

Attachment 2 – Recruitment Screener

Attachment 3 – Interview Protocols


Justification

The Current Population Survey (CPS) is one of the oldest, largest, and most well-recognized surveys in the United States. It is immensely important, providing information on many of the things that define us as individuals and as a society – our work, our earnings, and our education.

In addition to being the primary source of monthly labor force statistics, the CPS is used to collect data for a variety of other studies that keep the nation informed of the economic and social well-being of its people. This is done by adding a set of supplemental questions to the monthly basic CPS questions. Supplemental inquiries vary month to month and cover a wide variety of topics such as child support, volunteerism, health insurance coverage, and school enrollment. Supplements are usually conducted annually or biannually, but the frequency and recurrence of a supplement depend completely on what best meets the needs of the supplement’s sponsor.

The School Enrollment Supplement (OMB #0607-0464) is one such supplement. The intent of this supplement is to provide information on school enrollment, junior or regular college attendance, and high school graduation for the population 3 years old and older. It was fielded annually from October 2005 through October 2019.

CPS Cognitive Testing

This request is to conduct cognitive testing of potential additions to the School Enrollment Supplement. If the new questions work well during testing, those revisions will be considered for incorporation into the data collection instrument to be used in October 2020. Specific goals include testing how the coronavirus (COVID-19) pandemic affected schooling at the individual child level, such as:

    • how children received instruction in a distance format, if at all;

    • how much digital and internet access children had, if at all;

    • how much live interaction children had with teachers, if at all; and

    • how much time was spent on distance learning.


This testing will be conducted on a subset of the English-language questions. We will conduct these interviews over the phone because the actual survey is also interviewer-administered.

During testing, we will keep track of spontaneous comments about the questions in the questionnaire. We will also note any response errors (i.e., either missing data or incorrect responses) based on participant verbalizations. We will debrief participants at the end of each session, focusing on the new question wordings.

The survey questions will be evaluated in terms of respondents' understanding of the questions and their ability to answer them. The primary deliverable from this study will be a report on the participant results.

Design

An interviewer-administered paper protocol will be used for the cognitive interviews. We will conduct these user sessions over the phone.

Participants will electronically sign a consent form through a link emailed to them. They will also give oral consent at the start of the interview.

During testing, each participant will be asked some of the CPS demographic questions and the entire School Enrollment Supplement, with the new questions added. Participants may be asked to think aloud (verbalizing what they are thinking) as they complete the survey. Think-aloud data help identify problems and their causes.

Interviewers will ask probing questions as needed. After the completion of the survey, participants will be debriefed about their experience answering the survey. Each session is expected to last sixty minutes. See Attachment 3 for the full interview protocol.

We will recruit a total of no more than 20 English-speaking participants.

The following methods will be used to collect participants' performance data:

  • think-aloud protocol with minimal probing such as "Keep talking," "What are you thinking?" and acknowledgement tokens (linguists refer to these as backchannels) such as "Um-hum";

  • real-time verbal observation by the researcher;

  • targeted probes when necessary (e.g., “What were you thinking when you answered that question?”);

  • retrospective debriefing; and

  • audio recording.

The interviews will be audio-recorded. Analysis of the data will include qualitative analysis of behavioral observations, spontaneous verbalizations, and answers to debriefing questions in order to identify problems.

Recruiting, Interview Protocol, Instrument Specifications, and Paying Respondents

To ensure that we can recruit participants from all desired populations and to thank them for completing the interview, each respondent will be offered $40 for participation in a sixty-minute interview.

We will attempt to recruit households with at least one child who is enrolled in grades K-12 in a private, public, or charter school.

We will attempt to recruit participants from a variety of educational backgrounds and from different states. There appears to be substantial geographic variation in how schools responded to the pandemic and in how they attempted to implement distance learning.

Participants will be recruited by the U.S. Census Bureau using multiple sources. We will post an invitation on online neighborhood listservs, Craigslist, and social media. We will also send that invitation to personal contacts. Recruitment contact materials, including information about remote sessions being conducted by phone, information about cash payment, and signed vouchers for payment, are included in Attachment 1. The questions used to screen respondents for participation are included in Attachment 2.

The cognitive interview protocol is included in Attachment 3. The cognitive interviews will begin by walking participants through the School Enrollment Supplement as it currently stands. The questions being tested in this project begin on p. 13 of Attachment 3.¹

The phone sessions will occur with the Center for Behavioral Science Methods (CBSM) interviewer in his/her home and the test participant in his/her home. For phone sessions, the test participant will only need access to a phone. The consent and voucher forms can be signed before the session through a link that we will email; the link requires only minimal internet access and can be opened and signed on a smartphone or computer. If a participant does not sign before the session, we will obtain consent orally on the recording and ask them to sign the voucher before we mail the $40. If a participant has no internet access to sign the consent/voucher, we will obtain oral consent by phone and then mail a voucher form for them to sign. Prior to all remote sessions, we will email participants the link to the consent and voucher forms. Each session will be conducted one-on-one, i.e., one participant and one researcher.

Assurance of Confidentiality

Participation in this cognitive study is voluntary. For remote sessions, respondents will be emailed the consent form and will provide oral consent, which will be recorded. If contact and recruitment are all by phone, consent will be requested at the outset of the interview session and recorded. The confidentiality statement and consent form are provided in Attachment 1. Data entered into the survey will be stored in GovCloud in a secured FedRAMP Moderate environment with a Census Authority to Operate (ATO)² to collect PII.

The interviews will be audio-recorded. Participants will be assigned a unique identifier (ID), which will be created solely for data file management. The participant ID will not be linked to the participant's name. The audio-recorded files will be secured for the duration of the study, with access limited to key U.S. Census Bureau and NCES project staff. Key project staff may also listen in on or otherwise observe interviews. Participants will be informed when observers attend.

The Paperwork Reduction Act requires respondents to be informed about the authorities under which the information is collected and protected, even if they are not selected to participate. To meet this requirement, everyone who calls in to be screened will be sent an email with the text below. If a caller does not have an email address, a CBSM employee will read the text to them after screening.

Thanks for taking some time to answer my questions today. If you qualify, someone will be back in touch with you soon to schedule an interview.

The U.S. Census Bureau is required by law to protect your information. We are conducting this voluntary survey on behalf of the National Center for Education Statistics under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. § 9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. § 9573 and 6 U.S.C. § 151).

Your privacy is also protected by the Privacy Act, Title 5 U.S. Code. Routine uses of these data are limited to those identified in the Privacy Act System of Record Notice titled, “SORN COMMERCE/Census-7, Demographic Survey Collection (non-Census Bureau Sampling Frame).”

We estimate that completing these screening questions will take 10 minutes on average. This information collection has been approved by the Office of Management and Budget (OMB). You can validate that this survey is a legitimate, federally approved information collection using OMB approval number 1850-0803, which expires June 30, 2022. We are required to tell you this number to conduct this survey. Send comments regarding this estimate or any other aspect of this survey, including suggestions for reducing the time it takes to complete this survey, to [email protected].

Estimate of Hour Burden

Screening potential participants will require 10 minutes per screening, and we anticipate needing to conduct 80 screening interviews to yield 20 participants for sessions. We expect the emailed instructions to sign the consent/voucher form to take approximately 5 minutes, and the same time will be needed for cases where email is not possible and communications are by phone only. Finally, we expect each cognitive interview to last approximately sixty minutes. The total burden on participants is 36 hours, as shown in the table below.
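
As a check on the table, the 36-hour total can be reproduced from the per-activity estimates above, rounding each activity up to the nearest whole hour:

    Screening:            80 screenings  × 10 minutes = 800 minutes   ≈ 14 hours
    Consent procedure:    20 respondents ×  5 minutes = 100 minutes   ≈  2 hours
    Cognitive interviews: 20 interviews  × 60 minutes = 1,200 minutes = 20 hours
    Total burden:         14 + 2 + 20 = 36 hours

The per-respondent figures in the table (0.17 and 0.08 hours) are the 10-minute and 5-minute estimates expressed in hours.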



Estimated response burden for 2020 CPS School Enrollment Cognitive Testing

Respondents             Number of Respondents   Number of Responses   Burden Hours per Respondent   Total Burden Hours
Recruitment Screener    80                      80                    0.17                          14
Consent Procedure       20*                     20                    0.08                          2
Cognitive Interview     20*                     20                    1                             20
Total                   80                      120                                                 36

*A subset of all recruited; does not contribute to the total number of respondents.



Estimate of Cost Burden

There is no direct cost to respondents.

Project Schedule

Recruitment will begin upon OMB approval. Interviewing is expected to be completed within 6 weeks of OMB approval. Please see the estimated project schedule below.

Estimated Project Schedule for 2020 CPS School Enrollment Cognitive Testing

Start     Activity                                                           End
5/14/20   OMB approves package                                               5/15/20
5/14/20   Begin recruiting                                                   6/18/20
5/18/20   Cognitive testing                                                  6/26/20
6/1/20    Draft and deliver findings and recommendations; interim briefing   6/26/20
6/1/20    NCES and ADDP review findings and recommendations                  6/26/20
7/7/20    Prepare final report (6 weeks)                                     8/10/20

Cost to the Federal Government

The estimated cost to prepare for, administer, and report the results of these cognitive interviews is approximately $50,000. The cost includes salaried labor for staff and other direct costs associated with the organization of the interviews.

¹ An earlier draft of the questions being tested was submitted for OMB approval on May 12 and received generous feedback from OMB. The questions submitted here reflect edits to better align the items being tested for inclusion in the October supplement with those already being used by Census and other agencies. The differences between the language presented here and already approved language (for example, references to "spring of 2020" rather than tighter time frames) have been deliberately retained, as we think they will be more useful to respondents by the time the final items are used in October 2020.

² This authorization means that the environment where the survey data will be stored has passed Census Bureau security standards.

