Evaluation of the Innovative Assessment Demonstration Authority Pilot Program: Survey Data Collections

Supporting Statement for Paperwork Reduction Act Submission

PART B: Collection of Information Employing Statistical Methods

Revised June 2024

Original 60-day Package Submission: August 2020

Contract # 91990019C0059







Submitted to:

Institute of Education Sciences

U.S. Department of Education

Submitted by:

Westat

An Employee-Owned Research Corporation®

1600 Research Boulevard

Rockville, Maryland 20850-3129

(301) 251-1500

Contents

B.1. Respondent Universe and Sample Design

B.2. Information Collection Procedures

B.2.1. Notification of the Sample and Data Collection

B.2.2. Statistical Methodology for Stratification and Sample Selection

B.2.3. Estimation Procedures

B.2.4. Degree of Accuracy Needed

B.2.5. Unusual Problems Requiring Specialized Sampling Procedures

B.2.6. Use of Periodic (less than annual) Data Collection to Reduce Burden

B.3. Methods for Maximizing the Response Rate

B.4. Test of Procedures

B.5. Individuals Consulted on Statistical Aspects of Design

Appendix A. IADA System Director Data Collection Instruments

Appendix B. IADA System Technical Advisory Committee Member Data Collection Instruments

Appendix C. IADA System Assessment Vendor Data Collection Instruments

Appendix D. CGSA Project Director Data Collection Instruments

Appendix E. Notification Materials and Follow-up Emails

Part B. Collection of Information Employing Statistical Methods

The Institute of Education Sciences (IES) is requesting clearance to administer surveys and follow-up interviews for a Congressionally mandated evaluation of the Innovative Assessment Demonstration Authority (IADA) program. Congress created the program to improve the quality and usefulness of assessments required under the Every Student Succeeds Act of 2015 (ESSA) by allowing the U.S. Department of Education (the Department) to exempt states that agree to pilot new assessments from certain testing requirements.1 IES initially requested clearance in 2020 to survey participating IADA districts, schools, and teachers on the implementation of these new assessments.2 However, the COVID-19 pandemic and other challenges substantially slowed implementation, and two states have since withdrawn from IADA.3,4 As a result, few districts, schools, and teachers actually administered an IADA assessment and could share experiences that would meaningfully contribute to this evaluation. IES is instead requesting to survey and conduct follow-up interviews with state assessment officials, technical advisors, and assessment vendors. Their perspectives will be more informative at this stage of the program, and collecting them will impose a much smaller overall burden on respondents.

B.1. Respondent Universe and Sample Design

The Department approved two states in the first IADA cohort (Louisiana and New Hampshire, 2018), two in the second cohort (North Carolina and Georgia, 2019), and one state in the third cohort (Massachusetts, 2020). Each state was approved for one assessment system, except for Georgia, which had two innovative assessments.

The evaluation will include summer 2024 surveys and interviews with the assessment director, Technical Advisory Committee (TAC) members, and assessment vendor representatives for all IADA systems in the first three cohorts. Including all IADA systems will allow the evaluation to capture the full variation in the approved assessment systems. In addition, although New Hampshire and Georgia withdrew from IADA in 2022 and 2023, respectively, they provide a unique perspective on the challenges of developing and implementing an IADA assessment system.

The requirements for IADA participation are considerable, and those requirements could potentially affect a system’s innovativeness. To better understand the context of “innovation under IADA” compared to “innovation beyond IADA,” the evaluation will draw a sample of grantees from the Competitive Grants for State Assessments (CGSA) program. This U.S. Department of Education program provides funds to states to “enhance the quality of assessment instruments and assessment systems used by States for measuring the academic achievement of elementary and secondary school students.”5 The CGSA program has fewer requirements than the IADA program; it provides funding but does not allow students in participating schools to take the innovative assessment instead of the state’s current accountability assessment.

The evaluation will draw a sample of five CGSA grantees funded under the 2020 or 2022 CGSA competitions and currently pursuing assessment innovations.6 Both competitions encouraged innovation in assessment among states not currently participating in IADA. The 2020 competition tied funding either to preparing for an IADA application, signaling an intent to pursue innovative assessment, or to developing, evaluating, and implementing new assessment item types for use in summative assessments in reading/language arts, mathematics, or science.7 Five states received a 2020 grant. The 2022 competition tied funding either to measuring achievement in multiple ways or to developing assessment instruments (for example, performance and technology-based assessments) aligned with competency-based education models.8 Ten states received a 2022 grant. Three states (Hawaii, Louisiana, and Nebraska) received a CGSA grant in both 2020 and 2022 – so 12 unique states received an award in 2020 or 2022.9

Using CGSA award information, the evaluation purposively selected five CGSA grantees. The evaluation eliminated the three IADA states that also received a CGSA grant, the two states that were not yet implementing an assessment (Kentucky and New York), a state focused on developing an alternate assessment (Arkansas), and a state that ended its CGSA participation (Illinois).
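To make the selection logic concrete, the following minimal sketch (in Python; illustrative only and not part of the evaluation’s actual tooling) reproduces the eliminations described above, starting from the 12 unique grantee states listed in footnote 9. The three-state IADA overlap is inferred by intersecting the approved IADA states with that list.

    # Minimal sketch (Python) of the purposive selection described above.
    # State lists come from this document; the script is illustrative only.

    # The 12 unique states that received a 2020 or 2022 CGSA award (footnote 9).
    cgsa_states = {
        "Arkansas", "Hawaii", "Illinois", "Kentucky", "Louisiana",
        "Massachusetts", "Missouri", "Montana", "Nebraska", "New York",
        "North Carolina", "Texas",
    }

    # Eliminations described above. The IADA overlap is inferred from
    # footnotes 7 and 9 (approved IADA states that also won CGSA awards).
    iada_overlap = {"Louisiana", "Massachusetts", "North Carolina"}
    not_yet_implementing = {"Kentucky", "New York"}
    alternate_assessment_focus = {"Arkansas"}
    ended_participation = {"Illinois"}

    excluded = (iada_overlap | not_yet_implementing
                | alternate_assessment_focus | ended_participation)
    sample = sorted(cgsa_states - excluded)
    print(sample)  # ['Hawaii', 'Missouri', 'Montana', 'Nebraska', 'Texas']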

Therefore, the CGSA sample includes the remaining five CGSA states (Hawaii, Missouri, Montana, Nebraska, and Texas), each of which was planning to implement widespread assessment innovations with the potential to alter its accountability system. These states are an ideal comparison group for the IADA assessment systems because:

  • They are implementing innovation, but without IADA’s restrictions;

  • They were purposefully funded to pursue assessment innovations; or

  • They received funding to prepare IADA applications in the future, so these systems may eventually serve as part of a state’s accountability program.

The evaluation limits non-IADA states to the sampled CGSA grantees for two reasons. First, the potential universe of assessment innovations is very large, even when limited to the United States. Many of these assessments have innovative features, and some mimic the innovative features used by the IADA states. However, these assessments do not typically share the purpose of IADA, which is to generate student-level proficiency determinations that can be used directly in lieu of the state summative assessment in the state’s accountability system. Many assessment innovations may fall outside what is feasible for an accountability program and would therefore be unsuitable for IADA. For example, a “stealth” game-based assessment that did not alert educators or students that they were being assessed might be considered highly innovative but might not provide reliable accountability measures.10

Second, the sample of CGSA grantees gives the evaluation team an appropriately similar set of innovative assessments to review and limits the range of potential assessments to those most comparable to (and thus most informative for) the assessments created by IADA states. For these reasons, the team will survey and interview only the sample of five CGSA grantees noted above, which have similar purpose statements. This will help the evaluation describe “innovation under IADA” in a broader context.

B.2. Information Collection Procedures

B.2.1. Notification of the Sample and Data Collection

All targeted respondents will be invited to participate in a voluntary survey and interview. The data collection process and activities will be similar for all targeted respondents.

Notification and survey invitations. Upon receiving OMB approval, the evaluation team will send a notification letter (see Appendix E) from the evaluation’s director to the IADA assessment director and the CGSA project director that explains the purpose and scope of the evaluation and the voluntary survey and interview. Also included will be a letter of support for the evaluation signed by the Assessment Team lead for the IADA and CGSA programs on U.S. Department of Education letterhead. This letter of support will help establish the legitimacy and importance of the evaluation.

The notification letter to the IADA assessment director will include a list of the system’s TAC members and assessment vendor representatives gathered from system materials, such as the annual performance report (APR), and will ask the director to confirm the list and to identify the vendor with primary responsibility for each major assessment implementation activity the system has engaged in to at least some extent. This will ensure that the evaluation team contacts the correct TAC members and assessment vendors. The letter will also request that the director send a letter to IADA TAC members and assessment vendor representatives encouraging their participation in evaluation activities, and it will include sample text that the director may wish to use in those communications.

Upon confirmation of contact information from the IADA assessment director, the evaluation team will send the IADA TAC members and IADA assessment vendor representatives notification letters that include a link to the online survey. Three business days after sending the notification letter to the IADA assessment director, the evaluation team will send the director a survey invitation letter. The team will send respondents reminder emails to complete the survey and will follow up by phone if necessary.

All communications will include study contact information (toll-free study number and a study email address) for any questions and technical support. Westat will train and assign several research staff to answer the study hotline and reply to study emails, with content questions referred to the study leadership. An internal FAQ document will be developed and updated as needed throughout the course of data collection to ensure that all research staff have the most current information.

Primary data collection for the survey will occur through an online survey platform. However, upon request, nonrespondents will be able to complete a Word or PDF version of the survey and return it by email, complete a paper-and-pencil version, or complete the survey by phone with a project team member recording their responses. Because the online surveys will include data checks, the team will use the online surveys to enter any responses received in hard copy or by phone. The evaluation team will use a data collection management system to monitor survey response rates.
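As an illustration of the kind of monitoring such a system would support, the following minimal sketch (in Python) computes a response rate from a case list. The case records and status values are hypothetical, since the document does not specify the data collection management system.

    # Minimal sketch (Python) of response-rate monitoring. The case records
    # and status values are hypothetical; the document does not specify the
    # data collection management system.

    from collections import Counter

    def response_rate(cases):
        """Share of cases with a completed survey (0.0 if no cases)."""
        counts = Counter(case["status"] for case in cases)
        return counts["complete"] / len(cases) if cases else 0.0

    cases = [
        {"id": "DIR-01", "status": "complete"},  # assessment director
        {"id": "TAC-01", "status": "complete"},  # TAC member
        {"id": "TAC-02", "status": "pending"},
        {"id": "VEN-01", "status": "complete"},  # vendor representative
    ]
    print(f"Response rate: {response_rate(cases):.0%}")  # Response rate: 75%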

Interviews. Five business days after receiving each survey response, the evaluation team will send an interview invitation letter (see Appendix E) and request that the respondent identify three preferred times and days for the interview within the evaluation’s window for conducting interviews.

Once the interview time and date have been scheduled, the team member leading the interview will send a meeting invitation introducing themselves and providing the toll-free phone number or the link to the secure videoconference line (a Teams meeting) for the interview.

With the respondent’s consent, interviews will be recorded. A team member will also take notes in all interviews as a backup to the recording, or in place of the recording if the respondent does not consent to being recorded.

Prior to asking interview questions, the interviewer will obtain informed consent from the respondent. For interviews with multiple respondents (for example, the TAC member interview), the interviewer will obtain informed consent from each interviewee and will request recording consent.

For group interviews, if there is disagreement within the group on a response during the interview, the interviewer will encourage the group to reach a consensus. Recordings will be transcribed and reviewed. If needed, the interviewer will follow up with clarifying questions.

Interviewers will receive training on the IADA and CGSA programs, the evaluation’s purpose, and the data collection instruments. Interviewers will also receive training on facilitating multiple-participant interviews and encouraging consensus. For each assessment system, interviewers will review survey responses, system profiles, and APRs to avoid asking questions that have already been answered.

B.2.2. Statistical Methodology for Stratification and Sample Selection

There will be no sampling of IADA assessment systems. The evaluation includes the universe of IADA assessment systems as of April 2024. The evaluation includes a purposive sample of CGSA grantees as described in section B.1.

B.2.3. Estimation Procedures

The evaluation will use the universe of IADA assessment systems and select a purposive sample of CGSA grantees. The sample of CGSA grantees will not be used to generalize to the population of CGSA grantees. Therefore, the evaluation will not involve estimation procedures.

B.2.4. Degree of Accuracy Needed

The evaluation will include all IADA systems in the survey and interview for maximum precision. The evaluation will select a purposive sample of CGSA grantees and is not attempting to estimate characteristics of the full population of CGSA grantees with a particular degree of accuracy.

B.2.5. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

B.2.6. Use of Periodic (less than annual) Data Collection to Reduce Burden

This is a one-time data collection; the surveys and interviews will be conducted in summer 2024.

B.3. Methods for Maximizing the Response Rate

The evaluation team will use strategies from prior outreach to IADA assessment directors,11 previous interactions with TACs and vendors for other state assessment systems, and other data collections with surveys and interviews to achieve response rates of at least 85 percent. The follow-up schedule is as follows (and is illustrated in the timing sketch after the list).

  • Seven business days after sending the survey invitation letter, the evaluation team will send a follow-up email to facilitate response. Study team staff will conduct additional follow-ups as needed.

  • Five business days after receiving survey responses, the evaluation team will send an interview invitation letter.

  • Three business days later, a team member will call to encourage participation, schedule the interview, and confirm the most challenging major implementation activity for the system based on survey responses.
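The business-day intervals above can be computed mechanically. The sketch below (in Python, standard library only; the dates and milestone names are hypothetical, and holidays are ignored for simplicity) shows one way to derive each follow-up date.

    # Minimal sketch (Python, standard library) of the business-day
    # intervals listed above. Dates and milestone names are hypothetical,
    # and federal holidays are ignored for simplicity.

    from datetime import date, timedelta

    def add_business_days(start, n):
        """Return the date n business days (Monday-Friday) after start."""
        current = start
        while n > 0:
            current += timedelta(days=1)
            if current.weekday() < 5:  # 0-4 = Monday-Friday
                n -= 1
        return current

    invitation_sent = date(2024, 7, 1)                       # survey invitation
    follow_up_email = add_business_days(invitation_sent, 7)  # reminder email

    response_received = date(2024, 7, 10)                    # survey completed
    interview_invite = add_business_days(response_received, 5)
    scheduling_call = add_business_days(interview_invite, 3)

    print(follow_up_email, interview_invite, scheduling_call)
    # 2024-07-10 2024-07-17 2024-07-22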

Based on previous contact with assessment directors, TAC members, and assessment vendors, the evaluation team expects that targeted respondents will respond to email communications. However, evaluation materials also will be sent by postal mail, in case the email is flagged as spam.

The evaluation team will (1) provide clear instructions and user-friendly materials, (2) personalize materials, (3) offer technical assistance through a toll-free telephone number and study email box, (4) schedule interviews at a time convenient for the respondent, and (5) track all communications and monitor progress regularly using a data collection management system.

The team will minimize burden by not repeating questions. In addition, the instruments have been pretested to ensure that the instructions are clear and that the questions can be completed within the expected amount of time.

B.4. Test of Procedures

The study team pretested the survey instruments and interview questions with two or three participants from each respondent type: two IADA assessment directors, two IADA vendors, three IADA TAC members, and two CGSA project directors.



B.5. Individuals Consulted on Statistical Aspects of Design

The following individuals were consulted on the study design.

Patty Troppe, Westat, Vice President and Project Director

Arthur Thacker, HumRRO, Principal Investigator

Lauren Decker-Woodrow, Westat, Principal Research Associate

Michelle Osowski, Westat, Senior Research Associate



1 As of April 2024, the Department has approved five states for the IADA program: Louisiana and New Hampshire in 2018, North Carolina and Georgia in 2019, and Massachusetts in 2020. For Georgia, the Department approved two assessment systems to test under IADA: the Georgia MAP Partnership (GMAP) through-year assessment system and the Navvy assessment system.

2 The package (Docket No.: ED–2020–SCC–0144) has already completed the 60-day public comment period, which ended on November 2, 2020.

3 For more information on challenges through 2020-21, see: Troppe, P., Osowski, M., Wolfson, M., Ristow, L., Lomax, E., Thacker, A., & Schultz, S. (2023). Evaluating the Federal Innovative Assessment Demonstration Authority: Early Implementation and Progress of State Efforts to Develop New Statewide Academic Assessments (NCEE 2023-004). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. http://ies.ed.gov/ncee

4 New Hampshire withdrew from IADA in 2022, and Georgia withdrew its two assessments in 2023.

5 The program is authorized under Title I, Part B, Section 6363(b)(1) of the Every Student Succeeds Act. This program replaces a similar program, the Enhanced Assessment Grants (EAG) program authorized by the Elementary and Secondary Education Act, as amended by the No Child Left Behind Act (NCLB).

6 There was no 2021 CGSA competition. The study team chose to focus on programs implemented closer in time to IADA. For example, the Enhanced Assessment Grants (EAGs) were the predecessor to the CGSA program; they were implemented before IADA, when states could not innovate beyond the federal accountability requirements, and some EAG grants focused on specific student populations.

7 The 2020 CGSA competition awarded grants to Louisiana and Massachusetts to support the implementation of their IADA assessment systems. For information on the 2020 applications, see: Office of Elementary and Secondary Education. (2020, May 1). Applications for new awards: Competitive Grants for State Assessment program. U.S. Department of Education. https://www.govinfo.gov/content/pkg/FR-2020-05-01/pdf/2020-09336.pdf

8 In addition to these two absolute funding priorities, the 2022 competition included a competitive funding priority related to improving the utility of information about student performance in reports of assessment results and providing better and more timely reports, and an invitational priority focused on supporting effective instruction and building educator capacity through the development of high-quality assessments of student learning and strategies that allow educators to use data from assessments to inform instruction. See: Office of Elementary and Secondary Education. (2022, February 16). Applications for new awards: Competitive Grants for State Assessment program. U.S. Department of Education. https://www.govinfo.gov/content/pkg/FR-2022-02-16/pdf/2022-03290.pdf

9 The 12 unique states were Arkansas, Hawaii, Illinois, Kentucky, Louisiana, Massachusetts, Missouri, Montana, Nebraska, New York, North Carolina, and Texas.

10 Educators often use classroom assessments for different purposes, so if neither the student nor the teacher is informed that the assessment “counts” for accountability, their treatment of that assessment could vary widely (for example, it might be administered to teams of students rather than to individuals), leading to unreliable and invalid scores for use in an accountability system.

11 The evaluation previously conducted interviews with the first five IADA systems for the IADA Progress Report.

