2017 Integrated Postsecondary Education Data System Time Use and Burden Cognitive Interviews: Final Report

Prepared for the National Center for Education Statistics by

Sidney Wilkinson-Flicker, Kathryn Low, Sarah Hein, Aida Aliyeva, and Christopher A. Cody






May 2017























1000 Thomas Jefferson Street NW Washington, DC 20007-3835

202.403.5000 | TTY 877.334.3499

www.air.org



Introduction

American Institutes for Research (AIR) was contracted by the National Center for Education Statistics (NCES) to interview administrators from postsecondary institutions in order to better understand the time use and burden for institutions participating in Integrated Postsecondary Education Data System (IPEDS) data collections.


Data collected through IPEDS are used in numerous public-facing data tools that facilitate the identification and comparison of postsecondary institutions for policy makers, researchers, and prospective students. As such, over time, data gathered through IPEDS have become increasingly important to and ubiquitous within the postsecondary industry. This may have resulted in institutions taking additional time to review and consider their data before submitting them to IPEDS for the latest round of data collection. To improve the accuracy of its time use and burden estimates for IPEDS data collections, NCES is considering developing a revised question, or a short set of questions, on the time and burden involved in reporting data to IPEDS.


In an effort to better understand institutions’ time use and burden, AIR conducted two rounds of cognitive interviews with administrators from 48 postsecondary institutions. The first round of interviews was designed to gain a better understanding of respondents’ time use and burden for completing the 11 IPEDS survey components. Questions in the Round 1 interviews were intended to explore respondents’ understanding of the current IPEDS time use and burden question—particularly, what respondents include and exclude from their calculation when answering the question—in order to determine whether they are providing the information that the question was designed to extract.


Results from the Round 1 interviews and observations suggest that respondents are not consistent in how they report time use and burden related to completing IPEDS survey components. Based on these findings, a new series of time use and burden questions was drafted and tested during a second round of cognitive interviews. Round 2 interviews included questions regarding the initial collection of the data that are reported to IPEDS, the number of people involved in the data collection and reporting process, the steps making up the data collection and reporting process, and the initial purpose of the data collection.


This report documents the key findings from both rounds of cognitive interviews as well as recommendations for improving IPEDS collection of institutional time use and burden estimates.


Methods

In both rounds, each cognitive interview lasted approximately one hour and was conducted using a structured protocol. AIR developed the methodology in consultation with NCES, drawing on best practices and methods from cognitive science. The interviews during Round 1 were designed to explore respondents’ understanding of the current IPEDS time use and burden question and what factors determine their answers to the question, in order to assess whether respondents provide the information that the question is intended to gather. See Appendix A for the protocol used in Round 1. In addition, interviews in Round 2 included questions designed to identify problems of ambiguity or misunderstanding in possible alternative time use and burden questions. See Appendix B for the protocol used in Round 2.


Respondents were invited to participate in the study either in person or remotely via the web conferencing service GoToMeeting. The interviews were conducted using two key methods: think-aloud interviewing and verbal probing techniques. In think-aloud interviews, respondents were explicitly instructed to think aloud (i.e., describe what they were thinking) as they worked through question items presented by the interviewer. In verbal probing, the interviewers asked questions to clarify, as necessary, responses from the “think-aloud” process that may not have been clear, or to explore additional issues identified as being of particular interest. The verbal probes included a combination of pre-planned and ad hoc questions. Pre-planned questions were item-specific questions identified before the interview session as important. Ad hoc questions were those identified as important by the interviewer as a result of observations during the interview (e.g., clarification or expansion on points raised by the participant).


Sample

Cognitive interview participants were recruited from a list of possible institutions provided by NCES. Due to the convenience of the remote interviews, the majority of respondents chose to have their interviews conducted remotely. However, three respondents agreed to be interviewed in person since their institutions were located in the Washington, D.C., metro area. Respondents did not receive any incentive for their participation.


Participants were recruited based on their institutional level and size. Between February and April 2017, AIR staff conducted a total of 48 cognitive interviews with administrators who were either directors or other staff members of their respective institution’s institutional research office (or the equivalent). The interviews were conducted in two rounds to facilitate the creation of alternative time use and burden questions. For additional details regarding the institutions involved in the study, see Appendix C.


Table 1. Characteristics of Participants in the 2017 IPEDS Time Use and Burden Study


Characteristic | Round 1 | Round 2
Total | 24 | 24
Level of Institution | |
  Less than 2-year | 5 | 6
  2-year | 8 | 11
  4-year | 11 | 7
Size of Institution | |
  Small | 6 | 7
  Medium | 13 | 11
  Large | 5 | 6

Source: NCES, IPEDS Data Center, 2015–16



Key Findings from Round 1

Key findings from Round 1 included insights into institutions’ processes for submitting IPEDS data, how respondents approach answering the current time use and burden question, and respondents’ thoughts on the burden of the IPEDS data collection. See Table C-1 in Appendix C for characteristics of institutions that participated in Round 1.

Due to questions raised during Round 1, AIR asked NCES to clarify which staff should be considered when calculating the answer to the time and burden question. NCES indicated that institutions should count the time and burden of institutional staff only; state or central office support should not be included. AIR kept this guidance in mind when rewording the alternate questions for Round 2.


Data Submission Process


For the majority of respondents, the data submission process began when IPEDS sent notification to the keyholder that the collection for a particular component was open. Most respondents indicated that the majority of the survey components are handled within their department. However, certain survey components—Finance, Academic Libraries, Student Financial Aid, and Human Resources were the most commonly cited—require content-specific knowledge or restricted access to specific data. In these cases, the keyholder contacts staff at the relevant department within their institution, who in turn either complete the component independently or return the required data to the keyholder for entry.

Centralized Systems and State Assistance


Approximately 10 respondents noted that they were members of a centralized system or that their state uploaded institutionally provided data into IPEDS when appropriate. Many institutions located in Virginia and North Carolina were part of centralized community college systems that would provide the necessary data for the institution and would at times even upload the data into the IPEDS data collection website. For institutions located in certain states, the state would upload data for certain components based on institutionally provided state data reports. For example, Maryland uploads data for the Fall Enrollment and Completions survey components, while Florida uploads data for the 12-month Enrollment and Completions survey components.


Respondents noted that while they may have needed to make minor revisions to their data in order to comply with IPEDS requirements, in these situations their main IPEDS responsibility was limited to verifying the data the centralized office or state uploaded to the system and addressing any errors or questions that emerged. As a result, the burden on these institutions was lower than the burden on institutions with no centralized system or state assistance. In these cases, the institutions tended to understand that they should exclude time spent compiling data by centralized system or state staff outside their institution.

IPEDS and Other Reporting Opportunities

Throughout Round 1 of the study, respondents noted that there are times when data reported to IPEDS have already been used to meet other reporting requirements for state agencies or accrediting organizations. One respondent noted that if she were to include the time spent preparing the data for the state with her IPEDS estimate, her time use and burden estimate would increase by approximately 200 hours, from 680 hours (IPEDS reporting only) to 880 hours (IPEDS reporting plus state reporting). Respondents noted that in such situations they would include only the time spent exclusively on IPEDS in their time use and burden estimate. This decision results in a lower IPEDS estimate, since much of the time spent preparing the data is counted toward the earlier non-IPEDS reporting burden. Additionally, while a few respondents noted that some data were used across collections, they also noted that the data required modifications to meet IPEDS requirements. As a result, respondents would not include the time spent on the initial report when estimating the time and burden for the IPEDS collection.


Reporting Time Use and Burden

Current Time Use and Burden Question and Instructions

Figure 1. Current Time Use and Burden Question


Very few respondents misinterpreted the current time use and burden question. The majority of respondents noted the question was asking how much time it took them to complete the survey component from start to finish. Activities mentioned by respondents included examining the component and its instructions for any revisions or additions, as well as gathering, reviewing, and locking/submitting their data.


Immediately following the current time use and burden question is text that explains why IPEDS is asking the question and what respondents should include in their estimate. The activities respondents described in the prior question aligned with those listed in this text (shown below):

“The time it took to prepare this component is being collected so that we can continue to improve our estimate of the reporting burden associated with IPEDS. Please include in your estimate the time it took for you to review instructions, query and search data sources, complete and review the component, and submit the data through the Data Collection System.”


When asked about the instructions, 10 out of 21¹ respondents stated that they recognized the text. However, 8 of those 10 respondents either could not recall the details or could not remember reading the instructions. Many respondents noted that by the time they reach this section of the component they are driven to finish and, as a result, do not always read every word.


When respondents were asked about the use of the word “you” in the second sentence of the instructions, about half of the respondents took the word at face value and stated that it meant “me,” which is not the correct interpretation under the parameters set by NCES; the remaining respondents correctly interpreted it to mean their institution, university, or a wider group of staff. In order to clarify that “you” was intended to mean the more inclusive “your institution,” AIR staff explored alternatives to “you” during Round 2 of this study (see the Round 2 Current Time Use and Burden Question section).

Reporting Time and Burden


When asked about their reporting methods for the time use and burden question, the majority of respondents noted they were unable to report the exact amount of time spent on the IPEDS data collection process and that their responses were estimates. One respondent noted that he kept a spreadsheet to record the time he spent on each IPEDS component and stated he was confident that his estimate would be within 5–15 minutes of the actual time spent. Most respondents noted that since IPEDS is not their only task, time spent on the IPEDS components is frequently interrupted.


Only 3 out of 24 respondents noted they include time of other institutional staff involved when reporting time and burden for a component. In general, those who include time of other staff in their estimates directly asked staff how much time they spent working on the component and added that time to the overall estimate for their institution.


Burden of the IPEDS Collection

In the concluding section of the Round 1 protocol, respondents were asked to report how burdensome they found reporting for the last full round of IPEDS (2015–16) to be, to explain their reasoning, and to estimate the total time it took them to complete IPEDS 2015–16. See table C-3 in Appendix C for a summary of respondents’ answers.

The first question in the Conclusion section of the Round 1 interview protocol asked respondents how burdensome it was to respond to the IPEDS collection, on a scale of “not at all burdensome,” “a little burdensome,” “somewhat burdensome,” or “very burdensome.” Fifteen respondents indicated that the IPEDS collection was “somewhat burdensome” or “very burdensome,” 8 out of 23 respondents indicated that the IPEDS collection was “not at all burdensome” or “a little burdensome,” and one respondent was not asked this question.

A follow-up probe asked respondents to explain their response, which provided insights into how respondents’ burden is affected by factors both inside and outside NCES’s control.

  • Among the respondents who indicated higher burden (i.e., “somewhat burdensome” or “very burdensome”), the reasons provided tended to include the recently added Outcome Measures survey component, a lack of resources within their institution dedicated to the IPEDS collection (e.g., the number of staff, computer software that cannot generate IPEDS-ready reports), or instructions that were initially unclear or did not apply to the respondent’s institution.

  • Among respondents who reported lower burden (i.e., “not at all burdensome” or “a little burdensome”), the reasons provided tended to include the availability of resources (e.g., commercial or in-house computer systems able to produce IPEDS-ready reports internally) or support from a centralized system office or the state.

The second question in this section asked respondents to estimate how much time they spent on all of the IPEDS survey components throughout the year. Responses ranged from less than an hour to 880 hours. Common factors that respondents volunteered were whether the respondent

  • included the hours of others who assisted in the IPEDS collection process

  • had electronic or automatic processes already in place to assist with the IPEDS collection process

  • reported for multiple campuses

  • included certain parts of the data collection process (e.g., data cleaning)

  • had familiarity with IPEDS

It is important to note that a response to the first question on how burdensome it was to respond to the IPEDS collection did not necessarily predict an answer to the second question on how much time it took to complete all IPEDS survey components. For example, responses to the second question from those who reported that the IPEDS collection was “somewhat burdensome” included less than an hour, 2–3 weeks, 150 hours, and 175–200 hours for a single institution.

Many respondents also volunteered near the conclusion of the interview that they did not know why reporting time use and burden to IPEDS should matter to them. Some respondents felt that the time use and burden question was not important since the survey is mandatory and considered standard in the postsecondary industry; it must be completed regardless of how long it takes. Consequently, they did not keep track of their efforts to complete the IPEDS collections and so were unable to answer the question accurately. Some also skipped the time and burden question because it was not required and their primary focus was to report accurate data.


Recommendations from Round 1

Results from Round 1 interviews and observations revealed that respondents were confused about what exactly NCES would like included in their answers to the current time use and burden question. Respondents were unsure which activities should be included (e.g., time spent on other reports) and who should be included (e.g., staff in their department, staff in other departments, staff in centralized system offices). During Round 2 interviews, AIR staff tested adding “your institution” to the current time use and burden question and replacing “you” with “your institution” in the supporting text, in order to see whether this reduced the confusion and broadened respondents’ understanding of who should be included.

Additionally, AIR staff drafted a question that explicitly asked whether data used for IPEDS were also used in other reports. Asking this question could help NCES interpret the reported time estimates, which may exclude the burden allocated to preparing non-IPEDS reports.

In order to address the issue of respondents not including time spent by other staff in their time use and burden answers, AIR staff drafted a set of two questions intended to remind and lead respondents to include other staff’s time use and burden when answering the question. These questions ask whether the respondents worked with other departments as well as how many staff were involved in the IPEDS data collection and reporting process.

Lastly, AIR staff drafted a question intended to elicit more accurate answers to the time use and burden question. This question breaks out the different steps required to complete an IPEDS component from start to finish and provides space for reporting time spent by both the respondent’s office and other offices involved in the process.

See Appendix B for the full protocol used during Round 2. Drafted alternative questions can be found there and in the discussion of findings below.


Key Findings from Round 2

Key findings from Round 2 are described below. For this round, a PowerPoint presentation was prepared in order to share visual depictions of the current and drafted alternative time and burden questions from the Round 2 protocol. During the interviews, this presentation was shown to respondents using screen sharing, and respondents were asked to read and think aloud as each question was reviewed.

See Table C-2 in Appendix C for characteristics of institutions that participated in Round 2.


Current Time Use and Burden Question


In Round 2, as in Round 1, all respondents were asked to provide answers to the current time use and burden question for one of the components they were personally responsible for. Additionally, respondents were asked to describe what processes they included in their time use and burden calculation. Respondents were then presented with a modified version of the current time use and burden question: “How long did it take your institution to prepare this survey component?” Respondents were asked if they would respond to the modified question differently than they would to the current question. The majority of respondents replied that they would interpret whose time and burden should be reported more broadly if presented with the modified question. Respondents who currently do not report any burden other than their own commented that this modification would increase the reported burden estimates for their institution, as they would include other institutional staff who provide assistance for IPEDS reporting. This revision would lead respondents to interpret the question accurately, in line with the NCES guideline of including all institutional staff who work on the IPEDS collections.


Additionally, respondents were asked if they recalled seeing and reading two sentences currently found below the time use and burden question:

“The time it took to prepare this component is being collected so that we can continue to improve our estimate of the reporting burden associated with IPEDS. Please include in your estimate the time it took for you to review instructions, query and search data sources, complete and review the component, and submit the data through the Data Collection System.”


While several respondents indicated that they had noticed these sentences during the collection, the majority responded that they did not read them prior to completing the time use and burden question. A follow-up question by interviewers asked if respondents would be more inclined to read the additional two sentences if they were placed above the time use and burden question. The majority stated they would be more likely to read the statement if it came before the question. Several respondents even suggested adding this text to the beginning of the survey to remind respondents of what they should include in their time use and burden tracking and reporting.


Respondents were also probed about their interpretation of this text. As with the current time use and burden question, many interpreted “you” to mean “me” and no one else, while others indicated that their estimates included time use and burden for other staff who participated in the IPEDS data collection and reporting. Respondents were asked if they would interpret the sentences differently if “your institution” replaced “you” in the second sentence. The majority responded, as with the revised time use and burden question, that they would be more inclusive in their reported calculations and would ask other staff who assisted with the component for their time use and burden estimates.


Alternative Time Use and Burden Reporting


Four alternative time use and burden questions were drafted to address feedback from respondents in Round 1 regarding who and what is included when answering the time use and burden question. Respondents in Round 2 of the study provided feedback on these alternative questions.

Question 1

Question 1: Were any data used in this survey component also collected for the following reporting purposes? Please mark “Yes” or “No” for each item below.

  1. State required reporting ......................... ○ Yes   ○ No
  2. Internal institutional reporting .................. ○ Yes   ○ No
  3. Other required reporting .......................... ○ Yes   ○ No




Alternative question 1 stemmed from the first round of interviews, as some respondents reported that they use the same data for IPEDS and various other reporting requirements. Respondents from Round 2 had varying responses to this question. Some were confused by the question because they do not reuse IPEDS data for any other type of reporting due to differences in definitions between other reports and IPEDS, while other respondents had little difficulty understanding the question because they reuse reported data to cut down on burden.


Institutions that were part of a centralized system were more likely to respond “Yes” to one or more of the options in alternative question 1; however, most respondents answered “No” since they did not use identical data across different collections. Because certain institutions participate in a centralized system, portions of the data reported to IPEDS have previously been reported to college systems such as the Virginia Community College System. This reuse of data did not always result in double reporting of burden. Many respondents who were part of a centralized system were unsure whether they should also include the time and burden of staff from the centralized system office who assisted in the reporting of IPEDS data. These respondents also noted that they currently have no way of knowing how much time the staff from the centralized office spent preparing the data for upload. Based on guidelines set by NCES after Round 1, respondents are not to include external staff in their time and burden estimates.


Question 2


Question 2: Did you need to coordinate with staff in other offices to complete this survey component?

Yes

No





Alternative question 2 was developed to prompt respondents to include time use and burden of everyone who assisted them in the completion of a component, as findings from Round 1 provided mixed results.


During Round 2, respondents reacted positively to this question. For some, it clarified that they should include time spent by other staff in their time use and burden estimates. For others, it confirmed that their inclusion of other staff’s time in their time use and burden estimates was the correct course of action. Again, respondents from institutions that are part of a centralized system questioned whether they should include the time use and burden of those at the central office who assisted in IPEDS reporting, because it was not clear whether “other offices” referred only to offices within the institution or also included offices outside the institution.

Question 3

Question 3: Including yourself, how many staff from your institution were involved in the data collection and reporting process of this survey component?

_______ Number of Staff



Similar to alternative question 2, the purpose of alternative question 3 was to prompt respondents to include the time use and burden of any additional staff who assisted in the completion of the IPEDS component. Based on feedback from a few respondents, this question could be used to expand on alternative question 2 or in place of it.


Respondents reacted to alternative question 3 much as they did to alternative question 2. The majority of respondents interpreted this question to mean they should include the time use and burden of additional staff who assisted with the completion of IPEDS components. A few respondents from institutions that are part of a centralized system continued to wonder whether they should still include the time use and burden of staff at the central office in their count, even though this question specifically says “staff from your institution.” These respondents noted that their own time and burden is minimized by the time that central office staff contribute to completing the IPEDS collections, and they felt those contributions should be counted in some way but were not sure this was the appropriate avenue. As a result of this uncertainty, as well as not knowing the actual time and burden of the centralized offices, these respondents did not include them in their time and burden estimates.


Question 4


Question 4: Excluding the hours spent collecting data for state and other reporting purposes, how many hours did you and others spend on each of the steps below when responding to this survey component?


Staff member | Collecting Data Needed | Revising Data to Match IPEDS Requirements | Entering Data | Revising and Locking Data
Your office | ______ hours | ______ hours | ______ hours | ______ hours
Other offices | ______ hours | ______ hours | ______ hours | ______ hours




Alternative question 4 was developed based on indications that keyholders and other IPEDS completers were not always including the time spent on all steps actually needed to complete an IPEDS survey component. Alternative question 4 breaks down the steps needed to complete a survey component in order to guide respondents to include the collective burden of all staff involved, from the time they begin collecting the data through when they successfully lock their data. Based on responses from Round 1, some respondents noted that they filled in only the hour box of the current time and burden question since they could not accurately estimate the exact number of minutes spent. As a result, alternative question 4 asks for hours but omits minutes.


Feedback from the majority of respondents on alternative question 4 was positive. The breakdown into steps reminded keyholders and other IPEDS survey completers to include the time of all staff involved in the completion of the IPEDS component. Several respondents suggested rewording the question to varying degrees. Of the respondents who had a negative reaction to this question, all cited complexity and/or length as an issue. Most of these respondents also noted that the exclusion clause at the start of the original question was their primary concern: they felt that the situation of using data pulled for state and other reporting purposes did not apply to them and therefore should be removed entirely. However, since it may apply to some institutions, AIR decided it was best to keep the language for those institutions where it does apply. As a result, the suggestion of moving the main question to the front and the qualifier to the end was adopted to help with readability. The modified question is shown below:


How many hours did you and others spend on each of the steps below when responding to this survey component?

Please exclude hours spent collecting data for the state and other reporting purposes.


Additionally, based on the processes carried out by the various keyholders and other IPEDS survey completers, respondents made several suggestions to combine or reword the steps in alternative question 4. However, the majority of respondents felt the steps and division of offices were clear as currently laid out. Of those who were part of a centralized system, a few still wanted a way to indicate the burden of those who assisted them from the central office, but by this point in the interviews they had realized they should exclude that time.


With regard to providing units of time smaller than hours, some respondents did provide fractions of hours when answering the questions (e.g., 1.5 hours). Additional instructions can be added depending upon how precise NCES would like keyholders to be (e.g., to the nearest quarter of an hour); however, AIR believes that adding a minutes option would make the question appear overly busy and could deter respondents from answering.


Question 5


Question 5: How would you rate your knowledge and skills in the following areas?


Mark (X) one box on each line.

Areas | Very poor | Poor | Fair | Good | Very good
Computer programming skills |  |  |  |  |
IT Knowledge |  |  |  |  |





The final question asked respondents to rate their knowledge and skills in two specific areas. Following probing on the meaning of this question, respondents seemed to have a consistent understanding that the question was asking about the respondent’s level of skill in certain areas. However, some respondents wondered whether they should take other involved staff into consideration while answering this question, likely due to the emphasis placed on other staff throughout the rest of the interview. Additionally, respondents had differing interpretations of the “Areas” currently included in the question. Many asked why a keyholder or other IPEDS survey completer would need computer programming skills to complete IPEDS survey components. All respondents felt these two areas alone were not representative of the skills required to be a keyholder or other IPEDS survey completer. Suggested additions to the “Areas” included institutional knowledge, time management, analytical skills, computer application skills, and critical thinking skills, among many others.


Reactions to Alternative Questions


Overall, the majority of respondents reported that alternative questions 1–4 would require little to no burden to answer. The majority also felt these alternative questions clarified the current version of the time use and burden question, enabling a better understanding of what to include in and exclude from the calculation of time spent completing the survey component. Respondents were hopeful that the slightly more detailed reporting of time use and burden would provide NCES with better estimates of the burden institutions bear when reporting for IPEDS.


With regard to alternative question 5, many respondents wondered why NCES wanted to record this type of information. They did not see the connection between this question and the other time use and burden questions.


Other Comments and Respondent Recommendations from Both Rounds of Interviews

Throughout the study, respondents provided additional comments and feedback that were tangentially related to questions and probes included in the protocols. At the end of the interviews, respondents were asked if they had any additional comments or feedback about the time use and burden questions, IPEDS, or the data collection process that had not already been covered.

In these situations during Round 1, a few respondents noted that they do not always see the time use and burden question, most often citing the burden question in the Completions component as the one they miss. The Completions component is set up differently from the other components: the burden section is found at the bottom of the cover page instead of at the end of the component, so respondents do not see items in the order they are used to.

Additionally, a Round 2 respondent mentioned that because they complete the IPEDS components by direct uploads of the data, they do not see the current time use and burden question unless they click through the entire component manually. The respondent indicated that if they were able to jump to the end of the component, they would be more likely to complete the time use and burden question.

A few respondents from both rounds also mentioned that since the time use and burden question is not required, they would generally skip the question and move on to the next step in submitting their data. Other respondents noted that they preferred skipping the question because they cannot accurately estimate the time spent completing the survey component.

Many respondents were also concerned about IPEDS adding more survey components or more questions to current survey components, but this was generally discussed in isolation from the current time use and burden question; there appeared to be a disconnect between the responses to the current time use and burden question and the addition of questions in future IPEDS collections.


Summary and Final Recommendations


Throughout the interviews, it became clear that many respondents did not include time of other staff in their time use and burden estimates. Below is a range of possible revisions that could be made to improve respondents’ understanding of the current time use and burden question.


Recommendations for Revision

The immediate solution for improving respondents’ understanding of the time use and burden question involves two modifications to the current question and layout:

  1. Reword the existing question from “How long did it take to prepare this survey component?” to “How long did it take for staff from your institution only to complete this survey component?” The expectation is that replacing “you” with “staff from your institution only” will reinforce that respondents should exclude time spent by staff who contribute to the institution’s estimates but are not from within the institution itself (e.g., centralized offices).

  2. Move the current description of why IPEDS collects the amount of time it takes to prepare the survey component to appear above the current time use and burden question. Placing the instructions of what to include in their estimates in front of the question will remind respondents of the steps (i.e., reviewing instructions, querying and searching data sources, completing and reviewing the components, and submitting the data) that should be included in calculating their estimate.

The revised time use and burden estimate page would appear as follows:

“The time it took to prepare this component is being collected so that we can continue to improve our estimate of the reporting burden associated with IPEDS. Please include in your estimate the time it took for staff at your institution only to review instructions, query and search data sources, complete and review the component, and submit the data through the Data Collection System.”


How long did it take for staff from your institution only to complete this survey component?

_______ Hours

_______ Minutes


These minor revisions would result in greater respondent understanding of whom to include in their time use and burden estimates. However, AIR would recommend further clarifying exactly whose burden should be included in those estimates. Throughout the study, respondents from institutions that were part of a centralized system expressed confusion about whether the time of non-institutional staff within the centralized system should be included. For example, one respondent who estimated the time use of the centralized non-institutional staff stated that including their time would triple her overall reported burden.

Additionally, some respondents noted that they were not always given the opportunity to answer the time use and burden question. Multiple respondents noted that they miss the time and burden question in the Completions component, where it appears at the bottom of the cover page rather than at the end of the component. In another case, a respondent noted that since she imports her data in one step, she does not encounter the time use and burden question unless she clicks through the entire component on the IPEDS data collection website; otherwise, she goes straight to the data lock step. AIR recommends revising the webpage structure so that all IPEDS respondents encounter the time use and burden question regardless of the component or data submission process they employ.


Recommendations for Further Revisions

If a decision is made to replace the current time use and burden question with a variation of the tested alternative questions, AIR recommends the following series of questions, based on respondents’ feedback:


Question 1: How many staff from your institution only were involved in the data collection and reporting process of this survey component?

_______ Number of Staff (including yourself)


Question 2: How many hours did you and others from your institution only spend on each of the steps below when responding to this survey component?

Exclude the hours spent collecting data for state and other reporting purposes


Staff member | Collecting Data Needed | Revising Data to Match IPEDS Requirements | Entering Data | Revising and Locking Data
Your office | ______ hours | ______ hours | ______ hours | ______ hours
Other offices | ______ hours | ______ hours | ______ hours | ______ hours




These two questions address the main concern of collecting complete and accurate institutional time use and burden by reminding respondents to include time spent by all of the staff involved in the process as well as time spent on all of the steps of the process. For Question 2, some respondents noted that the question was long and confusing and suggested shortening the question and adding the italicized qualification at the end.

Appendix A. Round 1 Cognitive Interview Protocol


Name(s)/Title(s):

Institution:

Phone:

Email:

Address:

[Confirm above respondent(s) contact information, if necessary]


Thank you for taking the time to speak with us today. My name is __________, and my colleague, ___________, and I are members of the American Institutes for Research (AIR) project team conducting interviews to improve the questions collecting time and burden reports in the Integrated Postsecondary Education Data System (IPEDS). AIR, a research firm headquartered in Washington, D.C., has been contracted by the National Center for Education Statistics (NCES) to learn about your experiences reporting time and burden for IPEDS.


NCES would like to better collect and record time and burden from individuals who report data for IPEDS. You were chosen to participate in this interview because we value your opinion and would like to hear your perspective about time and burden reporting for IPEDS. We recognize that everyone has different experiences with IPEDS and want to stress that there are no right or wrong answers. In order for us to advise NCES on how to improve the collection of time and burden for IPEDS, we need to hear your thoughtful and honest feedback.


The interview should take about 1 hour. I will be asking the questions, and _______________ will be taking notes. We would like to record our conversation to make sure that we catch all of the important information that you will share with us. The recording will serve as our backup to the notes that _______________ will take today. NCES will have access to the recording. Do you consent to participate, and is it okay for me to record you? Do you have any questions before we get started?



Background Information:

  • Before we begin, we’d like to get some background information about your role(s) at [institution name].

  • How would you describe your overall role(s) in the institution?

  • What are your day-to-day activities?

  • How would you describe your responsibilities for IPEDS?

  • How were you selected to report your institution’s data through IPEDS?

  • Did you complete any special training or certification for IPEDS data collection and entry?

  • How long have you been involved with IPEDS?


Data Submission Process:

  • NCES is interested in how you track and report time and burden for IPEDS.

  • Can you explain, in your own words, what the data collection process looks like at your institution, from start to finish for the IPEDS collections?

Additional probes as needed:

        1. Start with the communication you receive about the collection before the submission period opens. Who receives the communications?

        2. How do you gather the information required to complete the surveys?

        3. Do you complete the surveys yourself or are you assisted by other staff members?

        4. Do you interact with any other department during the IPEDS collection?

        5. How far in advance of a collection do you begin to prepare?

        6. When and how often do you collect IPEDS related data?

Reporting Time and Burden:

  • Currently there is one question regarding burden at the end of each of the 11 IPEDS surveys.

Example:

      • In your own words, what is this question asking you?

  • How would you answer this question as if you had just completed the Student Financial Aid component?

      • How did you arrive at your answer?

      • Tell me more about why you answered ## hours and ## minutes.

      • How certain are you of your response?

      • What does “prepare” mean to you?

      • What activities come to mind when you answer this question?



  • Is this the same process that you use for this question for all of the components?

[REPEAT ABOVE QUESTIONS IF THE ANSWER IS NO]


  • Additionally, there are sentences in the _________________ that describe why IPEDS collects the amount of time it takes to prepare the survey:

“The time it took to prepare this component is being collected so that we can continue to improve our estimate of the reporting burden associated with IPEDS. Please include in your estimate the time it took for you to review instructions, query and search data sources, complete and review the component, and submit the data through the Data Collection System.”

      • What does “you” mean to you in the context of this question?

Conclusion:

  • How burdensome was it to respond to the IPEDS collection? Not at all burdensome, a little burdensome, somewhat burdensome, or very burdensome.

  • Based on your answer to the previous question, please explain why you thought the IPEDS collection was not at all burdensome, a little burdensome, somewhat burdensome, or very burdensome?

  • If you had to estimate how much time you spend on all of the IPEDS surveys, throughout the year, what would you estimate?

Appendix B. Round 2 Cognitive Interview Protocol


Name(s)/Title(s):

Institution:

Email:

Phone:


Thank you for taking the time to speak with us today. My name is __________, and my colleague, ___________, and I are members of the American Institutes for Research (AIR) project team conducting interviews to improve upon the question collecting time and burden reports in the Integrated Postsecondary Education Data System (IPEDS). AIR, a research firm headquartered in Washington, D.C., has been contracted by the National Center for Education Statistics (NCES) to learn about your experiences reporting time and burden for IPEDS.


NCES would like to better collect and record time and burden from individuals who report data for IPEDS. You were chosen to participate in this second round interview because we value your opinion and would like to hear your perspective about time and burden reporting for IPEDS. We recognize that everyone has different experiences with IPEDS and want to stress that there are no right or wrong answers. In order for us to advise NCES on how to improve the collection of time and burden for IPEDS, we need to hear your thoughtful and honest feedback.


The interview should take about 60 minutes. I will be asking the questions, and _______________ will be taking notes. As part of the interview we would like to share screens with you, so I’d like to check – are you in front of a screen? [walk R through clicking on the GoToMeeting link]


Additionally, we would like to record our conversation to make sure that we catch all of the important information that you will share with us. The recording will serve as our backup to the notes that _______________ will take today. NCES will have access to the recording. Is it okay for me to record you?


[Once recording starts] Now that we are recording, could you answer these 2 questions: Do you consent to be part of this study? Is it okay for us to record our session today?


Please let us know at any time if you would like to take a break or if you feel uncomfortable. Additionally we would like you to know that there are no risks associated with this study. Do you have any questions before we get started?



Background Information:

  • Before we begin, we’d like to get some background information about your role(s) at [institution name].

  • How would you describe your overall role(s) in the institution?

  • What are your day-to-day activities?

  • How would you describe your responsibilities for IPEDS?

  • How were you selected to report your institution’s data through IPEDS?

  • Did you complete any special training or certification for IPEDS data collection and entry?

  • How long have you been involved with IPEDS?


Currently Reporting Time and Burden:

  • Currently there is one question regarding burden at the end of each of the 11 IPEDS surveys.

Example:

      • How would you answer this question for your institution?

      • In your own words, what is this question asking you?

      • How did you arrive at your answer?

      • How certain are you of your response?

      • What activities come to mind when you answer this question?

      • Are these activities different for other components?

      • What does “prepare” mean to you?

      • If you were to replace “prepare” in this question, what word would you use instead?

      • How would you interpret this question if “institution” was added? Revised statement would read, “How long did it take your institution to prepare this survey component?”



  • Additionally, there are currently two sentences below the time and burden question that describe why IPEDS collects the amount of time it takes to prepare the survey:

“The time it took to prepare this component is being collected so that we can continue to improve our estimate of the reporting burden associated with IPEDS. Please include in your estimate the time it took for you to review instructions, query and search data sources, complete and review the component, and submit the data through the Data Collection System.”

      • Do you recall reading these sentences when you last completed IPEDS?

      • Would you find it more helpful to have this statement above the burden question?

      • What does “you” mean to you in the context of this question?

      • If you were to replace “you” in this question, what word would you use instead?

      • Would you respond to the burden question differently if “institution” replaced “you”?

Alternate Time and Burden Reporting:

  • The IPEDS team at AIR has drafted alternate time and burden questions for your consideration. During this section of the interview, I’d like you to think aloud as you read through each question, since we would like to hear your thoughts about it.



[Depending on Respondent, use either Libraries, Enrollment, HR, Finance, or Financial Aid as the initial hypothetical survey component]

Question 1: Were any data used in this survey component also collected for the following reporting purposes? Please mark “Yes” or “No” for each item below.

  1. State required reporting ......................... □ Yes   □ No
  2. Internal institutional reporting .................. □ Yes   □ No
  3. Other required reporting .......................... □ Yes   □ No



    • How would you answer this question for your institution?

    • In your own words, what is this question asking you?

    • What does “collected” mean to you in the context of this question?

    • What does “report” mean to you in the context of this question?

    • What are you including when you answered [A, B, and/or C]?

Additional probes as necessary:

    1. How do you gather the information required to complete the surveys?

    2. How far in advance of a collection do you begin to prepare?

    3. How would you rewrite this question?

Question 2: Did you need to coordinate with staff in other offices to complete this survey component?

Yes

No




    • How would you answer this question for your institution?

    • In your own words, what is this question asking you?

    • What does “staff” mean to you in the context of this question?

    • What does “coordinate” mean to you in the context of this question?


Question 3: Including yourself, how many staff from your institution were involved in the data collection and reporting process of this survey component?

_______ Number of Staff


    • How would you answer this question for your institution?

    • In your own words, what is this question asking you?

    • What does “staff” mean to you in the context of this question?

    • What does “involved” mean to you in the context of this question?

    • How did you arrive at your answer?

    • Who are you including in your count?

Additional probes as necessary:

  1. Do you complete the surveys yourself or are you assisted by other staff members?

  2. Do you interact with any other department during the IPEDS collection?

  3. Was there any particular staff member you questioned whether you should include or not? If yes, please tell me more.

  4. How would you rewrite this question?

Question 4. Excluding the hours spent collecting data for state and other reporting purposes, how many hours did you and others spend on each of the steps below when responding to this survey component?


Staff member | Collecting Data Needed | Revising Data to Match IPEDS Requirements | Entering Data | Revising and Locking Data
Your office | ______ hours | ______ hours | ______ hours | ______ hours
Other offices | ______ hours | ______ hours | ______ hours | ______ hours



    • How would you answer this question for your institution?

    • In your own words, what is this question asking you?

    • How did you arrive at your answer?

    • How certain are you of your responses?

    • Who are you including in your count?

    • What steps, if any, would you add or combine?

Additional probes as necessary:

  1. How would you rewrite this question?



  • How would your answers to the above questions differ with other components?

  • What are your thoughts about these alternate time and burden questions?

  • Do you feel the alternative burden questions clarify the information NCES would like you to submit in the burden question?

Burden Reporting:

  • How burdensome would it be to respond to the alternate time and burden questions? [Not at all burdensome, a little burdensome, somewhat burdensome, very burdensome]

Last Question:

Question 5: How would you rate your knowledge and skills in the following areas?


Mark (X) one box on each line.

Areas | Very poor | Poor | Fair | Good | Very good
Computer programming skills |  |  |  |  |
IT Knowledge |  |  |  |  |




    • In your own words, what is this question asking you?

    • Are there any additional knowledge and skills that you believe contribute to your ability to complete the IPEDS components?

Appendix C. Summary Tables


Table C-1. Characteristics of cognitive interview participants for Round 1 of the IPEDS Time and Burden Study

Cognitive interview | Size | Level | State | Part of a system | Years of IPEDS experience
Interview #1 | Medium | 2-year | VA |  | Less than 3 years
Interview #2 | Large | 4-year | VA |  | 6 to less than 12 years
Interview #3 | Small | Less than 2-year | MD | Yes¹ | 6 to less than 12 years
Interview #4 | Large | 2-year | MD | Yes² | 6 to less than 12 years
Interview #5 | Large | 4-year | TX |  | 12 to less than 20 years
Interview #6 | Small | Less than 2-year | IL | Yes³ | Less than 3 years
Interview #7 | Medium | 4-year | MD |  | Less than 3 years
Interview #8 | Medium | 4-year | VA |  | 6 to less than 12 years
Interview #9 | Medium | Less than 2-year | VA | Yes³ | 3 to less than 6 years
Interview #10 | Medium | 4-year | MD | Yes² | 20 years or more
Interview #11 | Medium | 2-year | MD |  | Less than 3 years
Interview #12 | Medium | 2-year | VA | Yes² | 6 to less than 12 years
Interview #13 | Medium | 4-year | MD |  | 3 to less than 6 years
Interview #14 | Medium | 4-year | MD | Yes² | 20 years or more
Interview #15 | Large | 2-year | NC |  | R1: 12 to 20 years; R2: 6 to less than 12 years; R3: 6 to less than 12 years; R4: Less than 3 years
Interview #16 | Medium | 4-year | VA | Yes² | 20 years or more
Interview #17 | Small | 4-year | NC |  | 3 to less than 6 years
Interview #18 | Medium | 2-year | NC | Yes² | R1: 6 to less than 12 years; R2: 3 to less than 6 years
Interview #19 | Medium | 2-year | NC | Yes² | 6 to less than 12 years
Interview #20 | Small | 2-year | FL | Yes² | 6 to less than 12 years
Interview #21 | Large | 4-year | NC | Yes² | 12 to less than 20 years
Interview #22 | Medium | 4-year | NC |  | Less than 3 years
Interview #23 | Small | Less than 2-year | NC |  | 6 to less than 12 years
Interview #24 | Small | Less than 2-year | FL |  | 3 to less than 6 years

1 Keyholder volunteered that the institution is part of a system, but reports IPEDS data independently.

2 Keyholder volunteered that the institution is affiliated with a system office which imports data directly into IPEDS on behalf of the institution, provides an IPEDS-ready file for the institution to upload, requires the IPEDS data be submitted to the system office for review before the IPEDS deadline, or otherwise assists with IPEDS survey completion. Examples of these system offices include the Virginia Community College System, the state of Maryland, the state of Florida, and the North Carolina Community Colleges System.

3 Keyholder volunteered that they work on IPEDS for multiple campuses within a system.

4 Keyholder volunteered that they typically do not provide time and burden estimates to IPEDS.

5 Keyholder was not asked this question.

Source: NCES, IPEDS Data Center, 2015–16 and interviews, 2017


Table C-2. Characteristics of cognitive interview participants for Round 2 of the IPEDS Time and Burden Study

Cognitive Interview | Size | Level | State | Part of a system | Years of IPEDS Experience
Interview #1 | Small | 4-year | FL |  | Less than 3 years
Interview #2 | Large | 4-year | FL | Yes³ | 20 years or more
Interview #3 | Large | 2-year | NC |  | Less than 3 years
Interview #4 | Small | 2-year | VA |  | Less than 3 years
Interview #5 | Large | 2-year | VA | Yes² | 6 to less than 12 years
Interview #6 | Large | 2-year | VA | Yes² | Less than 3 years
Interview #7 | Medium | 2-year | CA | Yes³ | 6 to less than 12 years
Interview #8 | Medium | 2-year | CA |  | 12 to less than 20 years
Interview #9 | Medium | 4-year | VA |  | 3 to less than 6 years
Interview #10 | Medium | 4-year | NC |  | 3 to less than 6 years
Interview #11 | Medium | 2-year | CA |  | Less than 3 years
Interview #12⁴ | Medium | 2-year | CA | Yes³ | 12 to less than 20 years
Interview #13 | Small | Less than 2-year | CA |  | 3 to less than 6 years
Interview #14 | Large | 4-year | VA |  | 12 to less than 20 years
Interview #15 | Small | Less than 2-year | CA |  | 6 to less than 12 years
Interview #16 | Large | 2-year | VA | Yes² | 12 to less than 20 years
Interview #17 | Small | Less than 2-year | CA |  | 6 to less than 12 years
Interview #18 | Medium | Less than 2-year | CA | Yes³ | 12 to less than 20 years
Interview #19⁴ | Medium | 2-year | CA |  | —⁵
Interview #20⁴ | Medium | 4-year | CA | Yes³ | 3 to less than 6 years
Interview #21 | Small | 4-year | VA |  | 3 to less than 6 years
Interview #22 | Small | Less than 2-year | TX | Yes³ | 3 to less than 6 years
Interview #23 | Medium | Less than 2-year | TX |  | 6 to less than 12 years
Interview #24 | Medium | 2-year | TX |  | R1: 3 to less than 6 years; R2: 6 to less than 12 years

1 Keyholder volunteered that the institution is part of a system, but reports IPEDS data independently.

2 Keyholder volunteered that the institution is affiliated with a system office which imports data directly into IPEDS on behalf of the institution, provides an IPEDS-ready file for the institution to upload, requires the IPEDS data be submitted to the system office for review before the IPEDS deadline, or otherwise assists with IPEDS survey completion. Examples of these system offices include the Virginia Community College System, the state of Maryland, the state of Florida, and the North Carolina Community Colleges System.

3 Keyholder volunteered that they work on IPEDS for multiple campuses within a system.

4 Keyholder volunteered that they typically do not provide time and burden estimates to IPEDS.

5 Keyholder was not asked this question.

Source: NCES, IPEDS Data Center, 2015–16 and interviews, 2017

Table C-3. Selected characteristics, perception of current time use and burden reporting, and attention to additional time and burden question instructions of cognitive interview participants for Round 1 of the IPEDS Time and Burden Study

Cognitive Interview | Size | Level | Measure of perceived burden | Noticed the instructions
Interview #1 | Medium | 2-year | Somewhat burdensome¹ | —²
Interview #2 | Large | 4-year | Somewhat burdensome | —²
Interview #3 | Small | Less than 2-year | Somewhat burdensome | —²
Interview #4 | Large | 2-year | Very burdensome | Yes
Interview #5 | Large | 4-year | A little burdensome | Yes³
Interview #6 | Small | Less than 2-year | Somewhat burdensome | Yes³
Interview #7 | Medium | 4-year | Somewhat burdensome | No
Interview #8 | Medium | 4-year | —² | No
Interview #9 | Medium | Less than 2-year | Somewhat burdensome | No
Interview #10 | Medium | 4-year | A little burdensome | Yes³
Interview #11 | Medium | 2-year | Very burdensome¹ | Yes³
Interview #12 | Medium | 2-year | Not at all burdensome | Yes³
Interview #13 | Medium | 4-year | Somewhat burdensome | —²
Interview #14 | Medium | 4-year | A little burdensome | Yes³
Interview #15 | Large | 2-year | A little burdensome⁴ | —²
Interview #16 | Medium | 4-year | Somewhat burdensome | No
Interview #17 | Small | 4-year | Somewhat burdensome | No
Interview #18 | Medium | 2-year | Very burdensome | —²
Interview #19 | Medium | 2-year | A little burdensome | Yes³
Interview #20 | Small | 2-year | Very burdensome for two components; somewhat burdensome for all others | Yes
Interview #21 | Large | 4-year | Not at all burdensome | Yes²
Interview #22 | Medium | 4-year | Somewhat burdensome | No
Interview #23 | Small | Less than 2-year | Not at all burdensome | No
Interview #24 | Small | Less than 2-year | Very burdensome¹ | No

1 Keyholder volunteered that their institution is making changes to the way they report data to IPEDS, and that they expect time and burden to be reduced in the future as a result.

2 Keyholder was not asked this question.

3 Eight keyholders of the ten who responded “Yes” indicated that they were familiar with these or similar instructions, but either couldn’t recall the details or could not remember reading these exact sentences.

4 Interview was conducted with four respondents. All agreed on “A little burdensome.”

Source: NCES, IPEDS Data Center, 2015–16 and interviews, 2017



1 The question regarding the instructions was not added to the protocol until after the third interview. As a result, the first 3 respondents were not asked questions regarding the instructions.

