
National Center for Education Statistics (NCES)



Volume I

Supporting Statement



2019-20 National Postsecondary Student Aid Study (NPSAS:20) Pretesting


Survey Tryouts and Focus Groups

Focus Groups to Assess Recruitment Materials

Opinion Survey Using Online Crowdsourcing Platform

Institution Focus Groups and Usability Testing





OMB# 1850-0803 v. 243







December 2018






Attachments


Attachment I – Student Survey Tryouts and Focus Groups: Recruitment Procedures and Materials

Attachment II – Student Survey Tryouts and Focus Groups: Eligibility Screening Questions

Attachment III – Student Survey Tryouts and Focus Groups: Consent to Participate in Research

Attachment IV – Student Survey Tryouts and Focus Groups: Focus Group Protocol

Attachment V – Student Survey Tryouts and Focus Groups: Survey Facsimile

Attachment VI – Student Focus Group/Audience Assessment Recruitment Materials

Attachment VII – Student Focus Group/Audience Assessment Protocol

Attachment VIII – Student Focus Group/Audience Assessment Materials

Attachment IX – Student Opinion Survey Using Crowdsourcing Platform

Attachment X – Focus Groups with Institution Staff: Recruitment Procedures and Materials

Attachment XI – Focus Groups with Institution Staff: Screening Questions and Intake Form

Attachment XII – Focus Groups with Institution Staff: Consent to Participate in Research

Attachment XIII – Focus Groups with Institution Staff: Focus Group Protocol

Attachment XIV – Usability Testing of the PDP: Recruitment Procedures and Materials

Attachment XV – Usability Testing of the PDP: Screening Questions and Intake Form

Attachment XVI – Usability Testing of the PDP: Consent to Participate in Research

Attachment XVII – Usability Testing of the PDP: Usability Testing Protocol

Attachment XVIII – Focus Groups with Institution Staff & Usability Testing: Student Records Instrument

Attachment XIX – Focus Groups with Institution Staff & Usability Testing: PDP Website Content


Submittal-Related Information

The following material is being submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB# 1850-0803), which provides NCES the capability to improve data collection instruments by conducting testing, such as usability tests, focus groups, and cognitive interviews to improve methodologies, survey questions, and/or delivery methods.

This request is to conduct testing of components of the 2019-20 National Postsecondary Student Aid Study (NPSAS:20) full-scale data collection, including:

  • Survey tryouts and follow-up focus group sessions with participants who are students enrolled in postsecondary education in the 2018-19 academic year;

  • Focus groups to assess study recruitment materials with participants who are students enrolled in postsecondary education in the 2018-19 academic year;

  • An online opinion survey of study recruitment materials and incentive plans using a crowdsourcing platform with participants currently or recently enrolled in postsecondary education; and

  • Focus groups and usability testing with staff from postsecondary institutions.

Each of these is discussed below. Testing will begin in January 2019, in preparation for the NPSAS:20 full-scale data collection (OMB# 1850-0666), which will begin in October 2019. RTI International will collect NPSAS:20 data from institutions and students on behalf of NCES under contract to the U.S. Department of Education. EurekaFacts is RTI’s subcontractor for the survey tryouts and focus group follow-up with students and the focus groups with institution staff; StratComm is the RTI subcontractor conducting the focus groups with students to assess recruitment materials; and Amazon’s Mechanical Turk (MTurk) will provide a convenience sample for an online opinion survey about recruitment materials and incentive plans. RTI will coordinate the MTurk work.

The overarching purpose of NPSAS is to collect data on how students pay for their postsecondary education in a specific academic year. Most of the content of the student and institution data collection instruments to be used for the upcoming NPSAS:20 full-scale data collection has been previously tested or was included in prior NPSAS surveys, other NCES studies, or other surveys of postsecondary students. The testing described in this submission allows NCES to evaluate selected components that are either new to NPSAS or revised for its use, before their inclusion in the NPSAS:20 full-scale data collection. The results of the online opinion survey will also be used to inform the approach to recruitment materials and incentives for the 2016/20 Baccalaureate and Beyond Longitudinal Study (B&B:16/20; OMB# 1850-0926).

This submission describes all aspects of the NPSAS:20 pretesting, including recruitment and screening of participants, participant consent forms, pretesting protocols, and the procedures we will use to ensure quality, performance, and reliability of testing results. The results – which will inform potential survey and design modifications intended to refine the full-scale data collection, improve institution and student response rates, and reduce burden – will be submitted for clearance in March and June 2019 as part of the NPSAS:20 full-scale institution and student data collection requests respectively (OMB# 1850-0666).

Background

NPSAS:20, conducted by NCES, is a nationally representative study of how students and their families finance education beyond high school. NPSAS:20, the tenth cycle in the series, will be conducted from October 2019 to December 2020 to capture information pertaining to the 2019-20 academic year. NPSAS:20 will also serve as the base year data collection for the Beginning Postsecondary Students Longitudinal Study (BPS) – a study of first-time beginning college students at two years (BPS:20/22) and five years (BPS:20/25) after their first year of college enrollment.

Survey Tryouts and Focus Groups with Students

The requested survey tryouts and focus groups, conducted with participants who are students enrolled in a college, university, vocational, or trade school during the 2018-19 academic year, will be used to refine the survey questions planned for NPSAS:20 sample members in order to maximize the quality of data collected, and provide information on issues with important implications for the survey design, such as the following:

  • The extent to which terms in questions are comprehended, including updated and added terminology;

  • The thought processes used to arrive at answers to survey questions;

  • Appropriate response categories to questions;

  • Sources of burden and respondent stress;

  • How users interact with the survey, which has been optimized to adjust to different screen sizes, including smaller mobile devices;

  • Ease of survey navigation on all devices, including desktop, laptop, and mobile devices (tablet or smartphone); and

  • The appeal of nonmonetary incentives.

To address prior public comments submitted during the review of the NPSAS:18-AC forms clearance package (OMB# 1850-0666 v.20), measures of food security for postsecondary students will be included in the NPSAS:20 survey. The measures will be tested in the upcoming tryouts and focus groups before being administered in the full-scale national data collection. Staff from the U.S. Department of Agriculture (USDA) and other external content experts were consulted to identify which U.S. Household Food Security items would be appropriate to include. This will be the first federal survey to collect food security data on a national postsecondary student population.

Because the USDA items are anchored to a 30-day or 12-month timeframe while most constructs measured in the NPSAS survey are anchored to the academic year of interest (e.g. 2018-2019 for the pretesting), NPSAS:20 qualitative testing will evaluate the USDA items using two different anchor timeframes: “in the last 30 days” and “in the 2018-2019 academic year.” Tryout respondents will be randomly selected to receive either the validated 30-day or the experimental academic-year timeframe questions, followed by an embedded debriefing question and specific focus group probes. The results will inform a decision on which timeframe to use in the NPSAS:20 full-scale survey. See Attachment V, questions N20F2USDAHH through N20FUSDAAD3, for more detail.
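To illustrate the random assignment described above, the following minimal Python sketch splits tryout respondents between the two timeframe conditions. The even 50/50 split, the variable names, and the respondent IDs are illustrative assumptions, not the NPSAS:20 specification.

import random

# Anchor timeframes being compared for the USDA food security items
TIMEFRAMES = ["last 30 days", "2018-19 academic year"]

def assign_timeframes(respondent_ids, seed=2020):
    """Randomly split tryout respondents between the two timeframe conditions.

    Assumes an even 50/50 split; the actual allocation rule used for the
    tryouts may differ.
    """
    rng = random.Random(seed)
    ids = list(respondent_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    assignments = {}
    for rid in ids[:half]:
        assignments[rid] = TIMEFRAMES[0]  # validated 30-day anchor
    for rid in ids[half:]:
        assignments[rid] = TIMEFRAMES[1]  # experimental academic-year anchor
    return assignments

# Example: assign the 300 planned survey tryout respondents
assignments = assign_timeframes(range(1, 301))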

Focus Groups with Students to Assess Recruitment Materials

Focus groups will be conducted to provide information on aspects of the survey design and supply respondent feedback on the effectiveness of the recruitment materials and the appeal of monetary and nonmonetary incentives in encouraging participation in research. Participants will be asked to evaluate the appeal of envelope, brochure, and infographic designs as well as various incentive options and to assess how likely they would be to respond during data collection based on the characteristics of these factors. The results will help to determine appropriate design features for NPSAS:20 recruitment materials.

Opinion Survey with Students Using Online Crowdsourcing Platform

The opinion survey using Amazon’s MTurk will allow NCES to quickly recruit participants for a survey intended to explore the appeal of various aspects of the NPSAS:20 study design before inclusion in the full-scale data collection. Questions will focus on the relative appeal of mailing designs and nonmonetary versus monetary incentive offers. The results, together with the results of the qualitative evaluations described above, will be used to frame the NPSAS:20 design for student data collection.

Upon reviewing crowdsourcing platforms and online survey sample providers in 2017 and 2018, the National Center for Science and Engineering Statistics (NCSES) at the National Science Foundation (NSF) selected MTurk as one of the most promising options because it has a larger available sample than other crowdsourcing platforms and the most extensive features for managing a convenience sample. Besides NCSES, the Bureau of Labor Statistics (BLS) and the National Cancer Institute (NCI) at the National Institutes of Health (NIH) also use MTurk successfully to recruit participants to complete online surveys.

Focus Groups and Usability Testing with Institution Staff

The focus groups with institution staff are intended to improve the procedures and resources used for NPSAS:20 institution data collection. Prior rounds of focus groups with institution staff focused on instrument terminology and definitions. Testing for NPSAS:20 will focus on further improving the quality of the data collected and on minimizing burden on participating institution staff. Testing participants will be asked for feedback on the timing of the data request, the information and instructions provided to institution staff, and the function of the instrument used to collect student records data. The results of this testing will inform changes to the instrument administered for NPSAS:20 student records collection.

Usability testing will center around the Postsecondary Data Portal (PDP) website, the primary data collection site for institution staff, to improve its user interface and function. Participants will be asked for feedback on the content and navigability of the website, the ease with which users can find needed information, and the mode options for providing data. The results of this testing will inform changes to the PDP and to the NPSAS:20 student records collection instrument.

Design and Context

The purpose of this study is to conduct four different instrumentation and study design evaluations. The design of each is discussed separately below.

Survey Tryouts and Focus Groups with Students (Attachments I-V)

Survey tryouts and focus groups with individuals will be conducted with participants who are similar to the student sample that will be selected for NPSAS:20. EurekaFacts staff have extensive experience in cognitive and usability testing methodologies and focus group moderation. They will recruit participants, conduct the focus group sessions, compile audio and video recordings of each session, and report the results. Participants will be offered a $90 gift card for completing both a survey tryout and a focus group in person and a $30 gift card for completing the stand-alone survey tryout via web.

Survey tryouts will be conducted using (a) a subset of items proposed for inclusion in the NPSAS:20 full-scale student survey and (b) debriefing questions embedded after critical survey items to elicit item-level information about cognition, comprehension, and usability. The survey tryout will also contain targeted questions about incentive and communication preferences (see the survey facsimile, Attachment V, for all debriefing questions and survey items to be tested). Respondents will complete the tryout as a self-administered web survey. About 85% of NPSAS:16 full-scale survey respondents participated via web. We expect an even higher percentage of web respondents for the NPSAS:20 full-scale study, so asking tryout respondents to complete the survey via web will closely replicate the national survey experience. To help guide respondents through the self-administered survey, on-screen instructions and definitions will be provided on some questions, and help text will be available on all survey forms (see the survey facsimile, Attachment V, for the help text on each survey item). A subsample of survey tryout participants will complete the self-administered survey at EurekaFacts’ offices and will participate in a focus group debriefing session, during which they will be asked to discuss general survey topics and the usability of the survey and to react to questions about incentive plans for full-scale data collection (see Attachment VIII for the recruitment material designs to be tested; see the focus group protocol, Attachment IV, for a full list of debriefing probes).

In order to achieve 300 completed survey tryouts with the NPSAS:20 target population, including first-time college students, we anticipate that up to 4,000 individuals will need to be invited to participate in the screening. Of the 300 self-administered web survey respondents, 250 will complete the survey remotely, with high-speed internet access, on their choice of device, such as a desktop or laptop computer, tablet, or smartphone. Remote testing is convenient and flexible for respondents because they can complete the survey at a time and location most convenient to them, and it allows the 250 remote respondents to use the survey in a real-world environment rather than in a lab setting. There are no limits on the number of surveys that can be completed on each type of device. With 300 survey respondents, we anticipate that a sufficient number of surveys will be completed on each type of device to identify potential usability concerns by device type and screen size. The self-administered web survey will take approximately 25 minutes to complete.

The remaining 50 self-administered web survey respondents will complete the survey in person at EurekaFacts’ offices on a tablet provided by EurekaFacts. They will then participate in a focus group in which they will debrief on the survey experience and discuss incentive options for full-scale data collection. There will be five focus group debriefing sessions with approximately 10 participants in each. EurekaFacts will schedule each session at a time that is convenient for students who work. Each session will last 90 minutes, including approximately 30 minutes for completing the survey and 60 minutes for focus group participation. Focus groups will be designated for specific groups (e.g., first-time beginning students, graduate students).

Remote participants will be directed to a landing page, hosted by EurekaFacts, where they will log in with a provided passcode. Survey respondents who also participate in a focus group will complete the survey and focus group in person; at the beginning of the session, the moderator will give them a link that launches them directly into the survey. All in-person sessions will be audio and video recorded, with capabilities for live remote observation by NPSAS staff. Observers can log on to watch the respondent’s face and body language and listen to the debriefing.

The target sample will include individuals who have been enrolled in a college, university, or trade school between July 1, 2018 and the time of testing, including those who are first-time beginners and graduate students (as identified in the eligibility screener; see Attachment II for specific eligibility screener questions). While there is not a required number of these respondents, recruitment efforts will ensure there is adequate representation of each key group.

EurekaFacts will conduct all recruitment of potentially eligible participants by cold-calling from purchased lists; sending e-mails and letters to, and posting flyers at, organizations where those enrolled in postsecondary education might work or socialize; posting advertisements on social media; and using snowball sampling methods such as referrals and word-of-mouth from other participants. All recruitment of potential survey tryout respondents will be conducted using an online or telephone recruitment screener containing eligibility criteria questions specific to this study to ensure that testing participants qualify for the study (see Attachment II for the eligibility screening questions). Participants completing the survey remotely can be recruited from across the United States, while in-person participants will be recruited from the greater Washington, D.C. area, where the EurekaFacts office is located.

Audio and video recordings of each focus group session will be available to NCES and NPSAS:20 staff at RTI for review. Following the conclusion of each session, EurekaFacts will organize their observations and summarize the common themes and insights from the focus groups to date.

Attachment I in this submission provides the procedures and materials that will be used for recruitment of survey tryout and focus group participants; Attachment II includes the screening questions that will be used to determine eligibility for the survey tryouts and focus groups; Attachment III is the consent form to participate in the research; Attachment IV provides the focus group protocol; Attachment V provides a facsimile of the survey and a table including all survey items with the respective debriefing items embedded in the survey; and Attachment VIII includes the recruitment material designs to be discussed in focus groups.

Focus Groups with Students to Assess Recruitment Materials (Attachments VI-VIII)

StratComm will conduct all recruitment of potentially eligible participants by sending e-mails and letters to and posting flyers at postsecondary institutions, posting advertisements on social media, calling from purchased lists, and using snowball sampling methods such as referrals and word-of-mouth from other participants. The recruitment of potential focus group participants will be conducted using an online or telephone recruitment screener containing eligibility criteria questions specific to this study to ensure that testing participants qualify for the study (see Attachment VI for the eligibility screening questions). Once the applicants have been screened, selected individuals will be invited to participate in one of three focus group sessions. Each of the three focus group sessions will include up to 17 participants, for a total of up to 50 participants, as a result of screening 250 potential participants for eligibility. To the extent possible, participants will be assigned to one of the focus group sessions based on their enrollment status. Focus group assignments will be prioritized to include first-time beginning students, students from 4-year institutions, and students from 2-year and less-than-2-year institutions.

Recruiters will contact participants primarily via e-mail. Attachment VI provides the recruitment procedures and materials that will be used for recruiting potential participants, including the screening questions that will be used to determine eligibility for participating in focus groups, as well as the consent form, to which all participants will be asked to agree before the session begins.

Focus groups will be conducted using StratComm’s Illumination Lab (iLab) online platform. Because the sessions will be conducted remotely, participants will need a computer with high-speed internet access and a phone connection. Participants will log into the iLab system using the instructions provided and join the conference call with a moderator and other participants. The moderator will provide additional instructions during the session. The platform will be used to gather written responses submitted simultaneously by participants. Following the focus group discussion, a complete transcript will be exported from the iLab platform.

Trained staff from StratComm will moderate the focus group sessions. The moderator will guide the group discussion, following a list of pre-determined discussion topics and encouraging responses. Participants will be asked about the methods by which they would prefer to be contacted and about the effectiveness of the type and content of proposed recruitment materials. The moderator’s guide for conducting the focus group sessions, including the list of discussion topics, is provided in Attachment VII, and the materials to be reviewed are included in Attachment VIII. Each focus group session will last for a maximum of 90 minutes, and participants will be offered a $50 check or Amazon e-gift card to thank them for their time and participation.

Immediately following the conclusion of each session, StratComm staff will review the session transcript and notes, highlighting potential themes. A complete transcript will be archived for qualitative analysis. StratComm will organize session observations and summarize the common themes, insights, and ideas emerging from each of the sessions into a report.

Opinion Survey with Students Using Online Crowdsourcing Platform (Attachment IX)

An opinion survey to evaluate the appeal of data collection materials and various incentive options will be conducted as a web survey administered to participants recruited through Amazon’s MTurk online crowdsourcing platform and referred to as “workers.” The MTurk recruitment post to be used is provided in Attachment IX. Workers who meet MTurk’s premium qualifications, such as education level, will be able to view the request upon logging into their worker accounts. We will use the premium qualifications to recruit respondents based on reported education level – currently enrolled in a college, university, or trade school, or at least some college – and a prior approval rating of 95% (see Attachment IX). MTurk worker approval ratings are based on the percent of work accepted by other requesters. Approximately 1,200 of those determined eligible will be surveyed via a web instrument (the survey facsimile is provided in Attachment V) that can be completed from any location on any web-capable desktop or mobile device. The survey will require up to 15 minutes to complete. Common practice on MTurk is to set the payment rate based on the federal minimum wage of $7.25/hour; accordingly, survey respondents will be paid $2.75 for a completed survey. Attachment V contains the survey questions; Attachment IX contains the MTurk information.
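As a rough check of the payment described above, the short sketch below pro-rates the federal minimum wage over the expected survey length and compares it with the planned payment. The 15-minute length and $2.75 payment are taken from this section; the straight pro-ration is an illustrative assumption.

# Compare the planned MTurk payment with a straight pro-ration of the
# federal minimum wage over the expected survey length. Illustrative only.
FEDERAL_MINIMUM_WAGE = 7.25  # dollars per hour
SURVEY_MINUTES = 15          # maximum expected completion time
PLANNED_PAYMENT = 2.75       # dollars per completed survey

prorated = FEDERAL_MINIMUM_WAGE * (SURVEY_MINUTES / 60)
print(f"Minimum-wage pro-ration for {SURVEY_MINUTES} minutes: ${prorated:.2f}")  # $1.81
print(f"Planned payment per completed survey: ${PLANNED_PAYMENT:.2f}")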

The content planned for the survey tryouts, focus group follow-ups, and focus group assessments of recruitment materials will be used for the opinion survey. The results of the opinion survey will be used to augment the survey tryouts and focus groups qualitative research described above.

Focus Groups and Usability Testing with Institution Staff (Attachments X-XIX)

Focus Groups

Focus group participants will be recruited from institutions that completed the NPSAS:18-AC student records collection, and will be the individuals responsible for collecting and providing the requested data for their institution. Potential participants will be contacted by telephone and screened to ensure they are responsible for completing or overseeing completion of the student records collection. Once the individuals have been screened, they will be invited to participate in one of the focus group sessions. Up to five focus group sessions will be scheduled, with up to 10 participants in each session, for a total of 50 institution staff. To achieve that number, up to 250 institutions will be contacted. Each session is expected to last approximately 90 minutes. To the extent possible, participants will be selected so as to ensure that a range of institution sizes and types are represented.

Recruiters will collect the e-mail addresses of those who agree to participate and will send a confirmation e-mail with information about the study and the date and time of the participant’s focus group. A link to the online video interface for the focus group and log-in information will be provided in another e-mail the day before the focus group session. Attachment X provides the recruitment procedures and materials that will be used for contacting institution staff; Attachment XI the screening questions that will be used to determine eligibility for participating in focus groups; and Attachment XII the consent form that all participants will be asked to sign before their focus group session begins.

Participants will need to have access to a computer with high-speed internet at the location from which they will be participating in the focus group. Because all sessions will be conducted remotely through an online video interface, participants’ computers must have a webcam. Participants who do not have access to a webcam will be provided with one for use during the focus group session. Immediately before the focus group begins, a technical check will be performed to ensure that participants can log into the focus group interface and that all equipment, including the webcam, is functioning properly. At the conclusion of the focus group session, participants will return the webcam using a postage-paid package provided by EurekaFacts.

Trained staff from EurekaFacts will remotely moderate the focus group sessions. The moderator will guide the group discussion, following a list of pre-determined discussion topics. Participants will be asked about the timing of the data requests, the instructions for providing data, item definitions, and their experiences using the PDP and student records instrument for NPSAS:18-AC. The protocol for conducting the focus group sessions, including the list of discussion topics, is provided in Attachment XIII. Each focus group session will last for up to 90 minutes, and participants will be offered a $50 Amazon gift card to thank them for their time and participation.

Audio and video recordings of each interview will be available to NCES for review. Immediately following the conclusion of each session, methodologists will review the session recordings and notes, highlighting any themes that have arisen, and archive the digital audio recording for qualitative analysis. EurekaFacts will organize the observations and summarize the common themes, insights, and ideas emerging from each of the sessions into a report that will be submitted to NCES and to RTI’s NPSAS:20 project staff.

Usability Testing

For usability testing of the PDP, up to 30 sessions will be scheduled. Recruiting and screening procedures for usability testing of the PDP will be the same as those described above for focus groups with institution staff. Up to 150 institutions will be contacted to recruit 30 participants. Each session is expected to last approximately 90 minutes and participants will be provided with $50 to thank them for their time and participation. The PDP usability testing sessions will be conducted remotely while participants use their own computers to navigate the PDP website. EurekaFacts’ usability experts will observe participants’ ease of navigation through the PDP site and will probe about their experiences providing data and finding the information they need.

The usability testing will be conducted using webcams. Participants who do not have access to a webcam will be provided one for use during the testing session. Each session will be conducted through an audio and video connection as both the participant and session facilitator view the PDP website. Observers will be able to listen to the session, watch the participant’s reactions, follow the participant’s screen as they navigate the PDP website, and listen to the debriefing.

Attachment XIV provides the recruitment procedures and materials that will be used for recruiting usability testing participants; Attachment XV the screening questions that will be used to determine eligibility for participating in the sessions; and Attachment XVI the consent form that all participants will be asked to sign before the sessions begin. The protocol for conducting the usability testing sessions, including the list of discussion topics, is provided in Attachment XVII. The draft NPSAS:20 student records instrument is provided for reference in Attachment XVIII. The programmed instrument may be presented as a visual aid to facilitate discussion, but the instrument will not be administered in its entirety during the session. The content of the PDP website is provided in Attachment XIX.

Estimated Respondent Burden

The estimated respondent burden for each testing method described above is provided in table 1.

Table 1. Estimated respondent burden


                                                           Number of     Number of     Minutes per    Total burden
Activity                                                   respondents   responses     respondent     hours
Survey tryouts and focus groups with students
   Screening                                               4,000         4,000         3              200
   Remote survey tryout                                    250*          250           25             104
   In-person survey tryout with focus group                50*           50            90             75
   Testing total                                           4,000         4,300         -              379
Focus groups with students to assess study recruitment materials
   Recruitment/screening                                   250           250           4              17
   Focus group participation                               50*           50            90             75
   Testing total                                           250           300           -              92
Opinion survey with current/recent students using MTurk
   Survey                                                  1,200         1,200         15             300
   Testing total                                           1,200         1,200         -              300
Focus groups with institution staff
   Recruitment                                             250           250           4              17
   Focus group participation                               50*           50            90             75
   Testing total                                           250           300           -              92
Usability testing of the PDP with institution staff
   Recruitment                                             150           150           4              10
   Usability testing participation                         30*           30            90             45
   Testing total                                           150           180           -              55
Study total                                                5,850         6,280         -              918

* Subset of the screened group.
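The burden-hour figures in table 1 follow directly from the counts and per-response minutes shown; the short sketch below reproduces them, rounding each row to the nearest whole hour, which is consistent with the table's entries.

# Reproduce the burden-hour estimates in table 1:
# burden hours = number of responses x minutes per response / 60.
ACTIVITIES = [
    # (activity, number of responses, minutes per response)
    ("Student screening", 4000, 3),
    ("Remote survey tryout", 250, 25),
    ("In-person survey tryout with focus group", 50, 90),
    ("Recruitment materials focus group screening", 250, 4),
    ("Recruitment materials focus group participation", 50, 90),
    ("MTurk opinion survey", 1200, 15),
    ("Institution focus group recruitment", 250, 4),
    ("Institution focus group participation", 50, 90),
    ("PDP usability testing recruitment", 150, 4),
    ("PDP usability testing participation", 30, 90),
]

total_hours = 0
for name, responses, minutes in ACTIVITIES:
    hours = round(responses * minutes / 60)  # round each row to whole hours
    total_hours += hours
    print(f"{name}: {hours} hours")

print(f"Study total: {total_hours} hours")  # 918 hours, matching table 1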

Estimate of Costs for Recruiting and Paying Respondents

To recruit a representative range of respondents (both students and institution staff with characteristics similar to those who will be part of the NPSAS:20 national data collection) and to thank them for their time and participation, we will offer the incentives shown in table 2 for each testing method. The amounts shown are consistent with amounts offered in similar studies. The “worker” payment indicated for the MTurk survey is considered a typical rate for similar tasks.

Table 2. Incentive offered per respondent, by testing method

Method                                                        Amount
Survey tryout with students                                   $30.00
Survey tryout with follow-up focus group with students        $90.00
Focus group with students assessing recruitment materials     $50.00
Opinion survey with students using MTurk                      $2.75
Focus group with institution staff                            $50.00
Usability testing of the PDP with institution staff           $50.00

Estimate of Cost Burden

There are no direct costs for respondents.

Cost to Federal Government

The cost to the federal government for conducting the pretesting activities requested in this submission is approximately $526,213. This cost includes recruitment and screening, administering focus groups, analyses, report writing, and participant incentives.

Assurance of Confidentiality

Survey tryout, focus group, and usability testing participants will be informed that their participation is voluntary and that:

RTI International and [its subcontractors] are carrying out this research for the National Center for Education Statistics (NCES), part of the U.S. Department of Education. NCES is authorized to conduct this study by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

All respondents will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all materials for each respondent together. The respondent ID will not be linked to the respondent’s name. Respondents will be given a consent form via e-mail or in person, which they will need to sign and return in order to participate. The signed consent forms will be kept separately from all survey data and focus group files for the duration of the study, and records will be destroyed after the final report is completed.

Schedule for NPSAS:20 OMB Requests and Related Activities

Recruiting for the survey tryouts, focus groups, and usability testing will begin upon OMB approval.

Activity                                                Dates
Recruit participants                                    January – May 2019
Conduct survey tryouts and focus groups                 January – June 2019
Conduct data collection materials focus groups          January – June 2019
Conduct online opinion survey                           January – March 2019
Conduct focus groups with institution staff             January – June 2019
Conduct usability testing with institution staff        January – June 2019

