
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes




Survey of Higher Education Instructors to Support the Development of Teaching Materials Based on OPRE’s Research and Evaluation



Formative Data Collections for Program Support


0970-0531





Supporting Statement

Part B

JUNE 2020


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:
Alysia Blandon

Shirley Adelstein

Wendy DeCourcey

Part B


B1. Objectives

Study Objectives

This information collection, consisting of a one-time web survey of postsecondary instructors, will support the creation of teaching materials based on research and evaluation conducted by the Administration for Children and Families' Office of Planning, Research, and Evaluation (OPRE). The information collection has the following objectives:

  1. To contribute to OPRE’s understanding of the academic audience for teaching materials;

  2. To describe the instructional needs of this audience; and

  3. To inform how to meet the instructional needs of this audience through developing and disseminating teaching materials based on OPRE’s work.


Generalizability of Results

This study is intended to provide descriptive information about potential users of OPRE teaching materials (postsecondary instructors teaching in a set of relevant disciplines). This information collection is not intended to promote statistical generalization within or beyond the target population.


Appropriateness of Study Design and Methods for Planned Uses

We will not use the data to make statistical inferences or generalize findings beyond the study sample. However, the information gathered from the nonrepresentative sample will provide a snapshot of instructor characteristics and needs that will guide plans for creating and disseminating the teaching materials.


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.  


B2. Methods and Design

Target Population

We will collect information from higher education instructors with a range of experiences to ensure that diverse perspectives are represented. We will recruit a minimum of 180 higher education instructors:

  • across nine relevant academic disciplines (public policy, social work, public health, economics, sociology, human development, education, psychology, and research methods/evaluation)

  • teaching at different types of institutions (e.g., those that do and do not have graduate programs)

  • at different career stages

For each discipline, we set a minimum recruitment target of 20 participants. Within each discipline, we have minimum targets for respondents from graduate and baccalaureate institutions, as well as for early career academics (i.e., assistant professors, postdoctoral fellows, graduate students, and non-tenure-track faculty with less than seven years of teaching experience) and late career academics (i.e., associate professors, full professors, and non-tenure-track faculty with more than seven years of teaching experience). Minimum response targets for institution type and career stage within each discipline are presented in Exhibit 1.



Because the survey uses open recruitment, it is possible we will exceed our minimum targets.


Exhibit 1. Minimum response targets for key characteristics within each discipline

  Characteristic                   Minimum responses
  Institution Classification
    Baccalaureate                  ≥5
    Graduate                       ≥8
  Career Stage
    Early Career                   ≥8
    Late Career                    ≥5


Recruitment

Recruitment will be done in two phases:


Recruitment Phase 1

Open Recruitment. We will solicit survey responses through online communication channels using an open survey link. We plan to post requests via electronic mailing lists and discussion forums; informal Facebook group pages (for example, the Shared Resources for Teaching Sociology page); and relevant Twitter hashtags and chats likely to reach the target audience (for example, #APPAM2020, related to the annual meeting of the Association for Public Policy Analysis and Management).


To prepare for Phase 1, we will build a comprehensive list of online communication channels, including electronic mailing lists, discussion forums, and social media sources. We will tailor the channels on this list to reach a wide range of instructors (for example, instructors at varied career stages, instructors at minority-serving institutions, and instructors across the target disciplines). To ensure that we capture a variety of responses, we will focus our efforts on channels intended for general audiences within disciplines (such as the American Psychological Association's listserv), those that cater to identified subdisciplines (such as the listserv for the Society for the Psychological Study of Social Issues), and those that are likely to reach scholars of color. See Appendix B for a detailed list of examples of recruitment sources and online communication channels.


We will post the web survey link on the identified online communication channels twice: once at the beginning of data collection (estimated August 2020) and again about a month later (estimated late September 2020). We will use real-time data to monitor progress on meeting our response targets.


Newsletter. OPRE will also include an announcement about the survey in the Office's regular newsletter; the announcement will clearly describe who is eligible to complete the survey and provide a link to it.


Recruitment language for Phase 1 communications, including e-mail text, social media messages, and an OPRE newsletter item, is included in Appendix A.


Recruitment Phase 2

One week after the second Phase 1 posting to online communication channels, if we are not on track to meet the Phase 1 targets, we will draw a purposive sample in preparation for Phase 2 of recruitment. The following week (two weeks after the second Phase 1 posting), we will assess whether we are likely to fall short of the minimum number of responses in any category (e.g., discipline). We will also examine the racial and ethnic breakdown of respondents to confirm that participants are racially and ethnically diverse.


For most disciplines where shortfalls are observed, we will randomly select institutions from the Carnegie Classification database, which categorizes colleges and universities according to the degrees they confer (e.g., doctoral, master's, baccalaureate, associate's). Depending on where shortfalls are observed in institution classifications within disciplines (see Exhibit 1), we will stratify by classification. For education (which, for the purposes of this project, focuses on early education), we will use a modified approach, randomly selecting programs from the National Association for the Education of Young Children degree database rather than the Carnegie Classification database.


For each selected institution, we will identify departments for up to two disciplines for which we have not met our minimum number of responses. For each selected department, we will select up to four faculty members listed in the departmental directory to receive an invitation to complete the survey. If necessary, we will stratify individuals by academic rank (see Exhibit 1) and select individuals only from the strata needed to meet the minimum number of responses for early and late career academics. We will select faculty from specific subdisciplines when appropriate and available (e.g., in larger departments).
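
To make this selection logic concrete, the following is a minimal sketch in Python. The data structures, field names, and the select_institutions helper are illustrative assumptions, not the project's actual sample-management system; records would in practice come from the Carnegie Classification database.

```python
import random

def select_institutions(institutions, shortfalls, n_per_cell=5, seed=2020):
    """Randomly select institutions for each (discipline, classification)
    cell that has not met its minimum response target.

    institutions: list of dicts with 'name', 'classification'
    ('baccalaureate' or 'graduate'), and 'disciplines' (a set of
    discipline names). shortfalls: dict mapping a discipline to the list
    of classifications still short of their targets. All structures are
    hypothetical stand-ins for Carnegie Classification records.
    """
    rng = random.Random(seed)  # fixed seed keeps the random draw reproducible
    selected = []
    for discipline, classifications in shortfalls.items():
        for classification in classifications:
            pool = [inst for inst in institutions
                    if inst['classification'] == classification
                    and discipline in inst['disciplines']]
            rng.shuffle(pool)  # random order, then take the first n_per_cell
            selected.extend((discipline, inst['name']) for inst in pool[:n_per_cell])
    return selected
```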

Potential participants will receive two invitation e-mails sent one week apart (Appendix A includes language for Phase 2 recruitment e-mails).


We expect that recruitment in Phase 2 will be an iterative process. We will send a first round of invitations to two individuals from each selected department. If the first round of Phase 2 invitations does not result in meeting recruitment targets, we will contact additional selected participants until targets are met: in each of the two following weeks, we will release one additional case from each selected department for which quotas have not been met.
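
The weekly case-release rule could be expressed along these lines; a minimal sketch, assuming hypothetical reserve and quota structures:

```python
def release_weekly_cases(reserves, quota_met):
    """Release one additional reserved case from each selected department
    whose quota has not yet been met (one case per department per week).

    reserves: dict mapping department IDs to lists of not-yet-invited
    faculty; quota_met: dict mapping department IDs to booleans. Both are
    hypothetical stand-ins for the project's sample-management files.
    """
    released = []
    for dept, reserve in reserves.items():
        if not quota_met.get(dept, False) and reserve:
            released.append((dept, reserve.pop(0)))  # one case per department
    return released
```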


When we reach the targeted number of responses within each discipline, by institution type and career stage, we will end all recruitment efforts. We will consider a survey complete if the respondent answered at least 80 percent of the questionnaire items, regardless of which specific items are missing. After the survey fielding period has ended, any instructors who attempt to access the survey link will see a notice that the survey is closed.
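
As a concrete illustration of the 80-percent completeness rule, here is a minimal sketch; the representation of item responses is an assumption for illustration only:

```python
def is_complete(responses, total_items, threshold=0.80):
    """Return True if the respondent answered at least 80 percent of all
    questionnaire items, regardless of which specific items are missing.

    responses: dict mapping item IDs to answers, with unanswered items
    absent or None (an assumed representation, for illustration only).
    """
    answered = sum(1 for value in responses.values() if value is not None)
    return answered / total_items >= threshold

# Example: 42 of 50 items answered -> 84 percent, counted as complete.
assert is_complete({f"q{i}": 1 for i in range(42)}, total_items=50)
```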


B3. Design of Data Collection Instruments

Development of Data Collection Instrument

The survey instrument (Instrument 1) was developed by the project team to meet project goals and answer the identified research questions. We first identified constructs we wanted to capture and scanned existing survey instruments, such as the Harvard Faculty Experiences Survey, for measures to adopt or adapt. When relevant items were not available, we created items to capture information specific to the project. The items are primarily closed-ended.


To reduce measurement error from a variety of sources (e.g., leading or unclear questions and missing response options on closed-ended items), we revised the survey based on feedback from academic experts who reviewed the draft and on a pilot test. We pilot tested the survey with eight instructors, representing eight of the nine disciplines in our target population. These instructors completed the survey, noted how long it took, annotated the instrument with any concerns or suggestions, and provided more general comments on a feedback form.


B4. Collection of Data and Quality Control

This web survey will be fielded through an installation of Confirmit hosted on-site at Mathematica (a subcontractor on the study team). Confirmit is a survey software platform that facilitates efficiently building and launching Section 508-compliant surveys while minimizing respondent burden, and it is optimized for mobile devices. When the instrument is programmed, Mathematica staff will use automated testing to validate skip patterns and response options. Staff will also provide quality assurance, test functionality across devices, and retest the survey using predefined scenarios (for example, one path for respondents who are familiar with OPRE and another for those who are not) to validate survey length.


The survey is a self-administered (web-only) data collection, so interviewer training is unnecessary. The quality of the instrument was evaluated through three rounds of review by Mathematica and MEF project staff, internal quality assurance staff assigned to the project, and OPRE. It was then pilot tested for comprehension with a sample of eight faculty.


As part of data processing, we will examine respondents' IP addresses and exclude responses from non-U.S. IP addresses, as well as multiple responses originating from the same non-institutional IP address. We will also identify and remove respondents who provide unlikely or incoherent responses, or whose open-text responses do not pertain to the topics of the questions.
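
These exclusion rules could be implemented along the following lines. This is a sketch only: is_us_ip stands in for whatever IP-geolocation lookup the team uses, institutional_ips is a hypothetical reference set, and retaining the first of a set of duplicates (rather than dropping all of them) is an assumption.

```python
from collections import Counter

def filter_by_ip(responses, is_us_ip, institutional_ips):
    """Drop responses from non-U.S. IP addresses, and keep only the first
    of multiple responses sharing a non-institutional IP address.

    responses: list of dicts with an 'ip' key; is_us_ip: a geolocation
    callable; institutional_ips: a set of known institutional addresses.
    All three inputs are hypothetical.
    """
    us_only = [r for r in responses if is_us_ip(r['ip'])]
    counts = Counter(r['ip'] for r in us_only)
    kept, seen = [], set()
    for r in us_only:
        ip = r['ip']
        if counts[ip] > 1 and ip not in institutional_ips:
            if ip in seen:
                continue  # later duplicates from the same non-institutional IP are excluded
            seen.add(ip)
        kept.append(r)
    return kept
```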


All recruitment communications (i.e., listserv, forum, and social media postings; e-mails) will include a direct link to the Confirmit survey. Respondents must complete the survey in one sitting; if they exit, they cannot resume where they left off.


B5. Response Rates and Potential Nonresponse Bias

Response Rates

The survey is not designed to produce statistically generalizable findings, and participation is wholly at the respondent's discretion. Response rates will not be calculated or reported.


Nonresponse

Because participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. Respondent demographics will be documented and reported in internal materials associated with the data collection.


B6. Production of Estimates and Projections

Findings from the instructor survey are for internal OPRE use only. The data will not be used to generate population estimates, either for internal use or for dissemination. We will produce only descriptive statistics for the sample and, potentially, for subgroups of interest to OPRE; this information will be used for internal informational purposes only.


B7. Data Handling and Analysis

Data Handling

Confirmit provides real-time logic checks and access to data frequencies that will make data review efficient. After the first phase of data collection, project staff will review data frequencies to ensure the responses remain internally consistent. We will also use survey metadata (such as user agent strings) to identify and remove duplicate, mostly incomplete, or obviously malicious responses. After data collection has ended, we will further check data frequencies and make logical edits when necessary. We will examine whether there are systematic patterns in nonresponse, and we will recode open-ended survey responses into valid response options as appropriate.
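
A minimal sketch of the duplicate and incompleteness screening described above. The field names are illustrative rather than the actual Confirmit export schema, and the 20-percent cutoff for "mostly incomplete" is a placeholder, not a threshold specified by the project:

```python
def flag_for_removal(records, min_answered=0.20):
    """Flag records that look like duplicates (same IP address and user
    agent string as an earlier record) or that are mostly incomplete.

    Each record is a dict with 'ip', 'user_agent', 'answered', and
    'total_items' keys -- illustrative field names only.
    """
    seen = set()
    flagged = []
    for record in records:
        fingerprint = (record['ip'], record['user_agent'])
        mostly_incomplete = record['answered'] / record['total_items'] < min_answered
        if fingerprint in seen or mostly_incomplete:
            flagged.append(record)
        seen.add(fingerprint)
    return flagged
```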


Data Analysis

For closed-ended survey questions, we will conduct descriptive analysis, calculating frequencies and means as appropriate. We will report descriptive statistics for recruitment source (the specific website, Phase 1 e-mail, or Phase 2 e-mail), respondent characteristics (including demographics), and users' behaviors, needs, and motivations related to the development of teaching materials. Subgroup analysis may be informative for this project because overall means will be influenced by the relative sizes of subgroups (for example, the number of instructors who complete the survey from each discipline or the number who are familiar with OPRE). We may identify select key subgroups of interest, such as discipline, instructor career stage, course level(s) taught (undergraduate or graduate), or familiarity with OPRE. For example, we could examine the kinds of teaching materials preferred by those who report teaching graduate courses, which would inform how to tailor teaching materials.
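
For instance, subgroup descriptive statistics of the kind described here could be tabulated with pandas; the data frame below is a hypothetical extract, and its column names are not drawn from the actual instrument:

```python
import pandas as pd

# Hypothetical survey extract; columns and values are illustrative only.
df = pd.DataFrame({
    'discipline':   ['sociology', 'sociology', 'public health', 'economics'],
    'career_stage': ['early', 'late', 'early', 'early'],
    'teaches_grad': [1, 0, 1, 1],
})

# Report the overall mean alongside subgroup means, since overall figures
# will be influenced by the relative sizes of subgroups.
print(df['teaches_grad'].mean())
print(df.groupby('discipline')['teaches_grad'].agg(['count', 'mean']))
print(df.groupby('career_stage')['teaches_grad'].agg(['count', 'mean']))
```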


We will code open-ended responses (that is, other-specify and open-text responses) for up to 180 respondents. A trained analyst will code the qualitative responses, starting with an initial coding scheme aligned with the research questions (A2) that will evolve to capture the major categories and subcategories observed in the data. Other-specify responses will be examined and recoded into existing categories where appropriate. The names of sources of teaching materials listed by respondents will be standardized so they can be tabulated. Needs or gaps in teaching materials will be organized by theme. Because this is an exploratory analysis, the specific themes will depend on participant responses, but they could include types of content, methods of content delivery, target audiences, or other design features. The analyst will be trained in the initial coding scheme; thereafter, if additional codes are warranted, the task lead and the analyst will refine them. We will assess inter-rater reliability based on dual coding of 10 percent of surveys, resolving any disagreements and updating the coding scheme as needed. If the agreement rate is below 85 percent (assuming binary variables), we will conduct additional training and dual code an additional 10 percent of surveys. Once coded, responses will be counted and analyzed descriptively. In addition, anecdotes illustrative of themes may be included in the report.
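
The agreement check reduces to simple percent agreement over the dually coded subset; a minimal sketch, assuming binary codes as the text notes:

```python
def percent_agreement(coder_a, coder_b):
    """Simple percent agreement between two coders' binary codes over the
    dually coded 10 percent subset (equal-length lists of 0/1 values)."""
    if not coder_a or len(coder_a) != len(coder_b):
        raise ValueError("code lists must be non-empty and equal length")
    return sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

# Below 85 percent agreement, the plan calls for additional training and
# dual coding a further 10 percent of surveys.
rate = percent_agreement([1, 0, 1, 1, 0], [1, 0, 1, 0, 0])
needs_more_training = rate < 0.85  # 0.80 < 0.85 -> True here
```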


Data Use

The data and all reports based on them will be for use by the study team, including the contractors and OPRE. Selected findings may be shared publicly to demonstrate how the project's teaching materials were developed in response to information collected from instructors. Any data shared publicly will be aggregated across all respondents or subgroups of respondents.


B8. Contact Person(s)

Principal Investigator

Shannon Monahan

Deputy Director for Surveys and Data Collection (Children, Youth, and Families)

Mathematica

[email protected]


Project Director

Asaph Glosser

Principal Associate

MEF Associates

[email protected]


Project Manager

Kate Stepleton

Senior Research Associate

MEF Associates

[email protected]


Attachments

Instrument 1: Develop Teaching Materials: Faculty and Instructor Survey

Appendix A: Survey Outreach and Invitation Language

Appendix B: Examples of Open Recruitment Sources and Online Communication Channels

Appendix C: IRB Approval
