SLDS Survey 2017-2019 Response to Passback

OMB: 1850-0933

Memorandum

United States Department of Education

Institute of Education Sciences

National Center for Education Statistics


DATE: October 5, 2016, revised January 24, 2017


TO: Robert Sivinski and E. Ann Carson, OMB


THROUGH: Kashka Kubzdela, OMB Liaison, NCES


FROM: Nancy Sharkey, SLDS Program Officer, NCES

Kristen King, SLDS Program Officer, NCES


SUBJECT: Statewide Longitudinal Data System (SLDS) Survey 2017-2019 – Responses to OMB Passback (OMB# 1850-New v.1)



The first portion of this memorandum responds to OMB’s passback of October 28, 2015 (regarding a draft of the proposed Statewide Longitudinal Data System (SLDS) survey instrument), which requested that the draft instrument be reviewed by a methodological expert within NCES to ensure that the questions and instructions are well formulated and provide sufficient guidance to encourage accurate and repeatable results. In late 2015, the SLDS staff asked Dr. Andy Zukerberg, of NCES, to assess the instrument.

Dr. Zukerberg’s Comments: Consider piloting the SLDS Survey with grantee States to obtain feedback and identify any concerns, and make updates accordingly; review and revise the skip pattern to ensure that question associations and conditional responses follow a logical format; provide more comprehensive (but concise) definitions of key concepts referenced in the SLDS Survey, which might or might not be familiar to survey respondents; and consider shortening the instrument.

SLDS Program Office’s Actions: In response, the SLDS staff piloted the SLDS survey in 2016 with the Kentucky, Minnesota, and Washington State Project Teams. Each participating state education agency was given approximately two weeks to complete the survey, with notification that survey completion might require collaboration with other SLDS stakeholders outside the immediate project team. Once the surveys were completed, a debrief teleconference was held to discuss possible improvements, suggestions, and other feedback. In general, pilot participants indicated that they preferred the SLDS Survey over the leading non-government survey designed to measure States’ progress toward SLDS development and implementation, which was conducted by the Data Quality Campaign (DQC) in the past but has not been administered in the past five years. Based on Dr. Zukerberg’s comments, the SLDS staff revised the skip pattern in the draft instrument before piloting it, and then again based on the pilot States’ feedback. Also before piloting, the SLDS team revised the overall structure, content, instructions, concept definitions, and language of the draft survey. The pilot States did not have questions about the terminology used in the survey and indicated that they understood what was being asked. Based on their feedback, we added a comment box at the end of the survey to make it easier for future State respondents to submit suggestions and feedback.

The primary SLDS Survey respondents will most often be current or past SLDS Program Directors responsible for the management and oversight of SLDS grants and/or work within their state. Through monthly and ongoing communication, annual reporting, completion of the Annual Performance Report, and participation in technical assistance activities, as well as numerous interactions with the broader SLDS community, active grantees are quite familiar with the concepts the Survey asks about. The large majority of States that do not have active grants have either had one in the past, continue to engage in the SLDS community despite the absence of SLDS funding, or receive some component of technical assistance from the SLDS Program Office. Our understanding is that current and past SLDS Program Directors are knowledgeable enough about the Survey content to communicate and consult with constituents outside their agency (in the broader P-20W world) as needed to complete the SLDS survey that emerged from the pilot.

State pilot participants were satisfied with the length of the SLDS survey, stating that, while it is somewhat extensive, it is comprehensive in assessing the current state and robustness of States’ SLDS and P-20W capacities.

The second portion of this memorandum addresses the second OMB passback, provided on January 19, 2017, which noted that the survey includes questions about data use and uses the term “stakeholders,” and asked whether stakeholders vary across states; whether it is important to know exactly who is using the system in each state and, if so, whether a question should be added allowing the types of individuals to be identified; and whether the definition of stakeholder is understood to encompass a fixed group of people regardless of state.

To clarify this issue, NCES made a few revisions to the instrument, as outlined below:

The term “stakeholder” appeared four times in the instrument, in three places:

  1. First, in the instructions:

The feature status options are:

  • Not Planned - The state is currently not planning to include that element/capability in its SLDS. “Not Planned” should also be marked for items that are not applicable to your state SLDS at this time (legislative prohibitions, “unadopted” interest, etc.);

  • Planned - The state intends to include this element/capability in its SLDS and has a documented plan and funding source to implement, but implementation work has not begun;

  • In Progress - The state is currently building or implementing this element/capability as part of its SLDS, but it is not yet fully operational; and

  • Operational - This element/capability is fully functional and available for use by its intended stakeholders.

We revised the last bullet point to read:

  • Operational - This element/capability is fully functional and available for its intended users.

  2. Then, in Data Use question 41c:

41) Additional federal and state reports produced by the SLDS include: (If no additional federal and state reports are planned, skip to 42)

c) Usage statistics by stakeholder groups (Teachers, Administrators, SEA, Public, etc.)

where we revised bullet point (c) to read:

c) Usage statistics by user role (Teachers, Administrators, SEA, Public, etc.)

  3. And lastly, in Data Use question 42:

42) How does the state find out how critical stakeholders are using the SLDS dashboards/reports/tools? (If stakeholder engagement is not planned, survey complete)

where we revised question 42 to read:

42) How does the state find out how critical stakeholders and stakeholder groups are using the SLDS dashboards/reports/tools²?

² Critical stakeholders and stakeholder groups, sometimes referred to as user roles, are identified by and unique to each State. They include individuals and groups ranging from the public to the State’s senior government officials, and often depend on the data sources included within the State’s SLDS, investment in SLDS initiatives and programs, and overall State objectives and priorities.

Other questions in the instrument already ask about the types of uses the State’s SLDS is designed to support (questions 4, 9, 16, 22, 28, and 36), so questions 41 and 42 do not focus on who the users are or on their roles as stakeholders. Thus, question 41 does not ask about the user roles for which the State has built its SLDS, but whether the State is collecting and producing statistics on usage. Question 42, in turn, focuses on how the State is keeping track of usage (if this is taking place).
