Evaluation of the Innovative Assessment Demonstration Authority Pilot Program—Survey Data Collections
Supporting Statement for Paperwork Reduction Act Submission
PART A: Justification
Revised June 2024
Original 60-day Package Submission: August 2020
Contract # 91990019C0059
Submitted to:
Institute of Education Sciences
U.S. Department of Education
Submitted by:
Westat
An Employee-Owned Research Corporation®
1600 Research Boulevard
Rockville, Maryland 20850-3129
(301) 251-1500
A1. Circumstances Necessitating the Collection of Information
A2.1 Data Collection Activities for Which Clearance is Requested as Part of this Package
A3. Use of Technology to Reduce Burden
A4. Efforts to Avoid Duplication of Effort
A5. Methods of Minimizing Burden on Small Entities
A6. Consequences of not Collecting Data
A8. Federal Register Announcement and Consultation
A10. Assurances of Confidentiality
A11. Justification for Sensitive Questions
A12. Estimates of Hours Burden
A13. Estimate of Cost Burden to Respondents
A14. Annualized Cost to the Federal Government
A15. Reasons for Program Changes or Adjustments
A16. Plans for Tabulation and Publication of Results
A17. Approval not to Display the Expiration Date for OMB Approval
Appendix A. IADA System Director Data Collection Instruments
Appendix B. IADA System Technical Advisory Committee Member Data Collection Instruments
Appendix C. IADA System Assessment Vendor Data Collection Instruments
Appendix D. CGSA Project Director Data Collection Instruments
Appendix E. Notification Materials and Follow-up Emails
Tables
Table A-1. Estimated response time for preliminary activities and survey activities (from the 2020 clearance package)
Table A-2. Estimated response time for data collection activities, by respondent and activity (for the current package)
Part A. Justification
A1. Circumstances Necessitating the Collection of Information
The Institute of Education Sciences (IES) is requesting clearance to administer surveys and follow-up interviews for a Congressionally mandated evaluation of the Innovative Assessment Demonstration Authority (IADA) program. Congress created the program to improve the quality and usefulness of assessments required under the Every Student Succeeds Act of 2015 (ESSA) by allowing the U.S. Department of Education (the Department) to exempt states that agree to pilot new assessments from certain testing requirements.1 IES initially requested clearance in 2020 to survey participating IADA districts, schools, and teachers on the implementation of these new assessments.2 However, the COVID-19 pandemic and other challenges substantially slowed implementation, and two states have since withdrawn from IADA.3,4 As a result, few districts, schools, and teachers actually administered an IADA assessment and could share their experiences in ways that would meaningfully contribute to this evaluation. IES is instead requesting to survey and conduct follow-up interviews with state assessment officials, technical advisors, and assessment vendors. Their perspectives will be more informative at this stage of the program, and collecting them will impose a much smaller overall burden on respondents.
Congress mandated two reports: (1) a Progress Report on pilot states developing and implementing innovative assessment systems and (2) a Best Practices Report to inform future development and use of innovative assessment systems in more states. The Progress Report, published in 2023, used existing documents from pilot states for source material, as required by ESSA. That report informed the expansion of the program and will help guide the Department’s technical assistance to pilot states.5 The Best Practices Report will provide the latest updates on pilot states’ progress, as well as add assessment directors’, technical advisors’, and vendors’ perspectives on the development and implementation of IADA systems. To learn more about innovative assessments in general, the report will also include the perspectives of project directors from a sample of states currently implementing innovative assessments outside of the IADA program, including grantees from the U.S. Department of Education’s Competitive Grants for Student Assessment (CGSA) program. This will fulfill the Congressional mandate and help the Department to identify the most significant facilitators and barriers to progress in IADA pilot states.
The evaluation, conducted by Westat, will describe the development and implementation of innovative assessments in the first five states approved for the pilot.6,7 The research questions addressed by the Best Practices Report will be:
1. How innovative are the IADA assessments, for example, using Congress's and the Department's definitions of innovation?
2. How far along were the IADA systems toward implementing key program activities—overall and since 2020-21?
3. What challenges make it difficult for IADA systems to meet IADA’s goals?
4. What development and implementation practices worked well for IADA systems?
5. Are there development and implementation practices that might benefit IADA systems?
6. Are there innovative practices that might benefit IADA systems?
To address these research questions, the evaluation team will review existing documents from Louisiana, New Hampshire, North Carolina, Georgia, and Massachusetts and will survey and interview these states’ IADA program directors, technical advisory committee (TAC) members, and assessment vendors.8 Although New Hampshire and Georgia withdrew from the IADA program, they will provide valuable perspectives on the challenges of developing and implementing IADA assessment systems.
The evaluation team will also review documents and survey and interview project directors from five states outside of IADA that are developing and implementing innovative assessments through CGSA, which provides grants to “enhance the quality of assessment instruments and assessment systems used by States for measuring the academic achievement of elementary and secondary school students” (OESE 2020, p. 25,423). The five CGSA grantees selected for this evaluation were funded to pursue assessment innovation, but without the requirements of the IADA program.
The evaluation's data collections are listed below in sections A2.1 and A2.2.
A2.1 Data Collection Activities for Which Clearance is Requested as Part of this Package
This package specifically requests clearance for the IADA assessment director, TAC member, and assessment vendor survey and interview instruments, and for the CGSA project director survey and interview instruments. This information collection request does not include the extant document review, which will focus on information already reported by the states.
IADA assessment system director data collection. Each state's IADA assessment system director will be invited to participate in a survey and a follow-up interview:
The evaluation team will email the director a link to an online survey asking about the challenges the state experienced trying to carry out six major implementation activities associated with IADA. The estimated time to complete this survey is 20 minutes.
The assessment director will then be invited to participate in a 60-minute follow-up interview, focusing in more detail on the major implementation activity identified as most challenging in the survey.9 The interview will ask about the major challenges associated with that activity,10 strategies to address the challenges, and the effectiveness of those strategies. The interview will also ask more generally about which practices worked well and why, which did not work well or as planned and why, and which the director thought would work well but was unable to try, and why.
The IADA assessment system director survey instrument and interview protocol can be found in Appendix A.
IADA TAC member data collection. TAC members advise states on their assessment systems and may have expertise in test design, psychometrics, sampling, accessibility/accommodations for students with disabilities, accessibility/accommodations for English learners, law (related to assessment and accountability), state and federal policy, reporting and test use, test security, or other related specialties. A typical TAC includes between six and eight members. Because a single member may not have the expertise or the experience to reflect on the entire assessment system, the evaluation team will interview all TAC members in a group format. However, each TAC member will be invited to fill out an individual survey.
The evaluation team will email each TAC member a link to an online survey that covers the same content as the IADA assessment director’s survey. The estimated time to complete this survey is 20 minutes.
The TAC members will be invited to participate as a group in one 60-minute interview. The interview will focus on the activity most commonly identified across the TAC member surveys as the most challenging.11 The interview will ask about the top challenge reported by each TAC member for that activity, strategies to address the challenge, and the effectiveness of those strategies.12 The interview will also ask TAC members to identify which practices worked well and why, which did not work well or as planned and why, and which they thought would work well but the system was unable to try, and why. Interviewers will seek consensus across the TAC members.
The IADA TAC member survey instrument and interview protocol can be found in Appendix B.
IADA assessment vendor data collection. Assessment vendors can provide insights into the development and implementation of an assessment. A state may rely on more than one vendor to administer its IADA assessment (for example, to develop test items; to render the items on an online platform; or to collect data, generate scores, and report scores). For such states, the evaluation team will survey and interview representatives from the vendor with primary responsibility for each of the six major implementation activities associated with IADA. Each vendor will be invited to participate in a two-part process like the ones described above.
The evaluation team will email each vendor representative a link to a survey that covers the same content as the IADA assessment director’s survey. The estimated time to complete this survey is 20 minutes.
Each vendor representative will be invited to participate in a separate 60-minute individual interview.13 Like the IADA director interview, the vendor representative interview will focus on the activity identified in the survey as most challenging, the major challenges associated with that activity, and the effectiveness of strategies to address the challenges. The interview will also ask the vendor representative to identify which practices worked well and why, which did not work well or as planned and why, and which the vendor thought would work well but was unable to try, and why.
The IADA assessment vendor survey instrument and interview protocol can be found in Appendix C.
CGSA system project director data collection. Each director will be invited to participate in a two-part process like the ones described above:
The evaluation team will email each director a link to an online survey that covers the same content as the IADA assessment director survey, but tailored to the CGSA program as needed. The estimated time to complete this survey is 20 minutes.
Each project director will be invited to participate in a separate 60-minute interview.14 The interview will focus on the major implementation activity that the director identified as the most challenging, the major challenges associated with that activity, and the effectiveness of strategies to address the challenges. The interview will also ask the director to identify more generally which practices worked well and why, which did not work well or as planned and why, and which the director thought would work well but the system was unable to try, and why.
The CGSA project director survey instrument and interview protocol can be found in Appendix D.
A2.2 Data Collection Activities for Which Clearance is Not Requested as Part of this Package
The evaluation team will review existing documents for each IADA system to understand the system's design, objectives, and progress. Examples of these documents include the state's Annual Performance Report (APR); technical, administration, accommodation, and scorer manuals for the innovative assessment; test specification documents; information on the system's theory of action; and the state's own IADA evaluation, if available. The evaluation team will also examine application materials from Massachusetts to understand the system's design and objectives.15
The evaluation team will also review existing documents for the five CGSA grantees to understand their assessment systems’ design and objectives. Examples of these documents, collected in 2023 and early 2024, include the state’s CGSA grant application and APR.
A3. Use of Technology to Reduce Burden
The data collection plan aims to obtain information efficiently while minimizing respondent burden. Prepopulating the challenges in the online 20-minute survey ensures a systematic and efficient collection of responses and allows the follow-up interview to focus on the most challenging activity rather than digging into many separate challenges. The 60-minute interviews will be conducted virtually, giving respondents greater flexibility in scheduling and participation compared to an in-person interview. This will be especially helpful because TAC members are likely to be in different locations. If respondents give consent, the interview will be recorded, which ensures more accurate transcription and reduces the need for clarifying questions after the interview.
A4. Efforts to Avoid Duplication of Effort
To avoid duplication of effort and minimize respondent burden, the study will rely as much as possible on extant documents, such as APRs. There are no other sources that systematically and comprehensively report assessment directors', TAC members', and assessment vendors' perspectives on developing or implementing an innovative assessment system under the IADA program, or project directors' perspectives under the CGSA program. At least some of the IADA and CGSA states publicly disseminate their lessons learned and best practices through conference events.16 But these presentations do not include the systematic review of challenges collected in the IADA evaluation and do not draw from the experiences of all the IADA systems. In addition, lessons learned and best practices shared in these conference presentations may not be as in-depth as those gathered by this evaluation. IADA conference presentations also may not include TAC members' and assessment vendors' perspectives.
A5. Methods of Minimizing Burden on Small Entities
To the best of the study team's knowledge, no small businesses or entities will be involved as respondents. Every effort will be made to minimize burden on all respondents.
A6. Consequences of not Collecting Data
If the surveys and interviews are not conducted, the Best Practices Report will lack critical lessons from states' development and administration of IADA and other innovative assessment systems, lessons meant to inform states deciding whether to pursue innovative assessments and apply to the IADA program. Such a gap would limit the usefulness of the evaluation and prevent it from fulfilling a key objective of the Congressional mandate.
A7. Special Circumstances
There are no special circumstances involved with this data collection. Data collection will be conducted in a manner consistent with the guidelines in 5 CFR 1320.5.
A8. Federal Register Announcement and Consultation
The 60-day Federal Register notice, which requested clearance for surveys of participating districts, schools, and teachers, was published on September 2, 2020. The 30-day Federal Register notice publication date is TBD.
During the 60-day comment period, the Department received comments from three commenters. One commenter's comments were not related to the evaluation. The other two commenters offered survey wording suggestions. While many of these comments no longer apply because IES is revising the data collection from district/school/teacher surveys to state-level surveys and interviews, several comments remain relevant. One commenter suggested that the evaluation include the perspectives of technical advisors; this is now part of the current data collection plan. The same commenter noted that the evaluation should apply an equity lens to evaluate "whether innovations are better measures of what students know and can do, or are merely repackaged versions of current assessment models and practices that will only reinforce and perpetuate equity gaps." The research question on how innovative the IADA assessments are will address this comment. The other commenter suggested including a question about the pandemic. While the follow-up interview's focus is not the pandemic, one question asks whether COVID contributed to an inability to try certain practices. Respondents can also identify the pandemic as a challenge, if desired, through "other/specify" options in the survey questions and by raising it in response to questions about practices that did not work well. Finally, the already published Progress Report provides some perspective on how the pandemic may have affected IADA states.
This study has a Technical Working Group (TWG) that includes members with expertise on the types of innovative assessments being used by the five IADA states in the evaluation (for example, performance assessment, interim and formative assessments, computer adaptive testing); assessment development, including psychometric properties; the implementation of assessments at the state and local levels; and quantitative and qualitative research methods. The TWG members are:
Randy Bennett, The Norman O. Frederiksen Chair in Assessment Innovation, Research and Development Division, ETS
Peter Leonard, Director of Assessment, Chicago Public Schools
Richard Patz, Distinguished Research Advisor, University of California, Berkeley
Andy Porter, Director, The Center on Standards, Alignment, Instruction, and Learning; Professor Emeritus of Education, University of Pennsylvania
Michael Rodriguez, CEHD Associate Dean for Undergraduate Programs, Diversity, and Equity; Campbell Leadership Chair in Education and Human Development; and Professor, University of Minnesota
This TWG has advised and will continue to advise this evaluation on topics including data collection, analysis, and the reporting of best practices related to the development and implementation of innovative assessments.
A10. Assurances of Confidentiality
The Education Sciences Reform Act of 2002, Title I, Part E, Section 183 requires "[a]ll collection, maintenance, use, and wide dissemination of data by the Institute" to "conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection c of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h)." The names and email addresses of potential respondents will be collected for the limited purpose of contacting those selected for the survey and interview and following up with non-respondents. This information may already be available in the public domain as directory information (e.g., on state or assessment vendor websites). The following language will be included on the cover sheet of all information collection forms under the Notice of Confidentiality:
“Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for research purposes. Participating state assessment systems will be identified, but not individual respondents. All of the information you provide may only be used for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).” Specific steps to guarantee confidentiality of the information collected include the following:
Respondent identifying information (respondent name, email address) will not be entered into the analysis data file; it will be kept separate from other data and will be password protected. A random, study-specific identification number for each respondent will be used in the raw data and analysis files (a minimal illustrative sketch of this separation follows this list).
A fax server used to send or receive documents that contain confidential information will be kept in a locked room, accessible only to study team members.
Confidential materials will be printed on a printer located in a limited access room. When printing documents that contain confidential information from shared network printers, authorized study staff will retrieve the documents as soon as printing is complete.
The public report will summarize study findings and may include quotes from IADA system assessment directors, TAC members, or assessment vendors and CGSA project directors without naming individual respondents. The report also may present detailed findings by assessment system.
Access to the sample files will be limited to authorized study staff only.
All members of the study team will be briefed regarding required procedures for handling confidential data.
A control system will monitor the status and whereabouts of any hard copy data collection instruments during data entry.
All data will be stored in secure areas accessible only to authorized staff members. Any computer-generated output containing identifiable information will only be accessible to authorized staff members.
Hard copies containing confidential information that are no longer needed will be shredded.
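To make the first step above concrete, the following is a minimal illustrative sketch, not the study's actual system, of how respondent names and email addresses could be written to a separate, access-restricted crosswalk file while only a random, study-specific ID appears in analysis files. The file names and respondent structure are hypothetical.

```python
# Illustrative sketch only (hypothetical file names and data structure):
# PII and study IDs live in a restricted crosswalk file; analysis files
# carry only the random, study-specific ID.
import csv
import secrets

def assign_study_ids(respondents, crosswalk_path, analysis_path):
    """Write (study_id, name, email) to a restricted crosswalk file and
    study_id-only rows to the analysis file, keeping PII out of analysis data."""
    used_ids = set()
    with open(crosswalk_path, "w", newline="") as xwalk_file, \
         open(analysis_path, "w", newline="") as analysis_file:
        xwalk = csv.writer(xwalk_file)
        analysis = csv.writer(analysis_file)
        xwalk.writerow(["study_id", "name", "email"])
        analysis.writerow(["study_id"])
        for person in respondents:
            study_id = secrets.token_hex(4)  # random 8-character ID
            while study_id in used_ids:      # guarantee uniqueness
                study_id = secrets.token_hex(4)
            used_ids.add(study_id)
            xwalk.writerow([study_id, person["name"], person["email"]])
            analysis.writerow([study_id])

# Example usage with a made-up respondent:
assign_study_ids(
    [{"name": "A. Director", "email": "director@example.org"}],
    "crosswalk_restricted.csv",
    "analysis_ids.csv",
)
```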
A11. Justification for Sensitive Questions
This study will include no questions of a sensitive nature.
A12. Estimates of Hours Burden
As noted earlier, IES initially requested clearance in 2020 to survey participating IADA districts, schools, and teachers twice on the implementation of these new assessments, with a total burden estimate of 1,364 hours. Table A-1 presents the total burden estimate for that request by instrument and respondent. With the current request, IES is instead requesting to survey and conduct follow-up interviews with state assessment officials, technical advisors, and assessment vendors. Table A-2 presents the total burden for the new request by instrument and respondent. Under the new request, the total response burden is 72 hours, approximately 1,300 hours less than the original request.
Table A-1. Estimated response time for preliminary activities and survey activities (from the 2020 clearance package)
| Timing & respondent/activity | Number of targeted respondents | Expected response rate (%) | Expected number of responses | Unit response time (hours) | Annual total response time over 3-year approval for data collection (hours/year) | Total burden (hours) |
|---|---|---|---|---|---|---|
| Preliminary activities (activities that occur before the survey data collection) | | | | | | |
| Coordinator – Teacher lists (winter 2021) | 64 | 85 | 54 | 2 | 36 | 108 |
| Coordinator – Teacher lists | 77 | 85 | 66 | 2 | 44 | 132 |
| Spring 2021 | | | | | | |
| District survey | 64 | 85 | 54 | 0.5 | 9 | 27 |
| District Coordinator follow-up activities | 64 | 85 | 54 | 0.33 | 6 | 18 |
| Principal survey | 128 | 85 | 109 | 0.5 | 18 | 55 |
| School Coordinator activities | 128 | 60 | 77 | 0.5 | 13 | 39 |
| Teacher survey | 640 | 85 | 544 | 0.5 | 91 | 272 |
| Spring 2022 | | | | | | |
| District survey | 77 | 85 | 65 | 0.5 | 11 | 33 |
| District Coordinator follow-up activities | 77 | 85 | 65 | 0.33 | 7 | 21 |
| Principal survey | 231 | 85 | 196 | 0.5 | 33 | 98 |
| School Coordinator activities | 231 | 60 | 139 | 0.5 | 23 | 70 |
| Teacher survey | 1,155 | 85 | 982 | 0.5 | 164 | 491 |
| Total for all activities (rounded) | 2,936 | | 2,405 | | 455 | 1,364 |
Table A-2. Estimated response time for data collection activities, by respondent and activity (for the current package)
| Respondent type | Number of targeted respondents | Expected response rate (%) | Expected number of responses | Unit response time (hours) | Annual total response time over 3-year approval for data collection (hours/year) | Total burden (hours) |
|---|---|---|---|---|---|---|
| IADA Assessment Director | | | | | | |
| Survey | 7 | 85 | 6 | 0.33 | 0.66 | 2 |
| Interview | 7 | 85 | 6 | 1 | 2 | 6 |
| IADA TAC member¹ | | | | | | |
| Survey | 42 | 85 | 36 | 0.33 | 4 | 12 |
| Interview | 42 | 85 | 36 | 1 | 12 | 36 |
| IADA Assessment Vendor Representative² | | | | | | |
| Survey | 9 | 85 | 8 | 0.33 | 0.88 | 2.64 |
| Interview | 9 | 85 | 8 | 1 | 2.67 | 8 |
| CGSA Project Director | | | | | | |
| Survey | 5 | 85 | 4 | 0.33 | 0.44 | 1.32 |
| Interview | 5 | 85 | 4 | 1 | 1.33 | 4 |
| Total for request (rounded) | 126 | | 108 | | 24 | 72 |
¹ Each TAC is assumed to include seven members on average.
² Assumes that half of the six systems have two vendors with primary responsibility for major implementation activities. Assumes that one vendor has primary responsibility for all activities in the other systems.
The total response burden for the current package is 72 hours. This total assumes that the evaluation team will reach out to 7 IADA Assessment Directors, 42 IADA TAC members, and 9 IADA Assessment Vendor representatives across the current IADA systems (Louisiana, North Carolina, and Massachusetts) and former IADA systems (New Hampshire and Georgia), as well as 5 CGSA project directors. The evaluation team expects a response rate of at least 85 percent for each data collection activity, with 6 IADA Assessment Directors, 36 IADA TAC members, 8 IADA assessment vendor representatives, and 4 CGSA project directors submitting a survey and participating in an interview.
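Each row of Table A-2 follows the same arithmetic; as an illustration, the TAC member survey row works out as follows (all figures are taken from the table, with rounding at each step):

```latex
\begin{align*}
\text{Expected responses} &= 42 \text{ targeted} \times 0.85 \approx 36\\
\text{Total burden}       &= 36 \times 0.33 \text{ hours} \approx 12 \text{ hours}\\
\text{Annual burden}      &= 12 \text{ hours} \div 3 \text{ years} = 4 \text{ hours/year}
\end{align*}
```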
A13. Estimate of Cost Burden to Respondents
There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting this information.
A14. Annualized Cost to the Federal Government
The cost of the design, data collection, analysis, and reporting of the data is $3,622,574. Over the five years of the evaluation, the annualized cost will be $724,515.
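As a check, the annualized figure is simply the total cost spread over the five evaluation years:

```latex
\$3{,}622{,}574 \div 5 \text{ years} \approx \$724{,}515 \text{ per year}
```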
A15. Reasons for Program Changes or Adjustments
This is a revised collection, for which approval is being requested for 72 hours to cover all activities.
A16. Plans for Tabulation and Publication of Results
The first report described participating states' progress with their innovative assessment systems and drew primarily on extant documentation provided by states. The second report, a descriptive study not designed to estimate the impact of federal policies on state actions, will present best practices, or lessons learned, from the development and implementation of innovative assessment systems, using findings from the surveys, interviews, and extant documents.
The report will examine extant documents to describe innovative features of IADA assessment systems and their progress in meeting key objectives as well as the innovative features of selected CGSA grantees.
Survey responses will be tabulated. Responses from interviews will be coded and aggregated to provide system-level tallies of particular challenges, solutions, practices that worked well or did not, and practices that systems wanted to try but could not. Common themes will be identified, and system-specific examples of the challenges, solutions, or practices will be considered for the narrative discussion of findings. IADA TAC member and assessment vendor responses will be compared to IADA system director responses (for example, full agreement indicates that all three respondent groups identified the same major assessment implementation activity as most challenging).
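To make one of these tabulations concrete, the sketch below implements the selection rule described in footnote 11: across a system's TAC member surveys, choose the activity most often rated most challenging, breaking ties in favor of the activity earliest in the implementation sequence. The activity labels and function name are hypothetical placeholders, not the instruments' actual wording.

```python
# Minimal sketch of the footnote-11 selection rule: pick the activity most
# often rated "most challenging"; break ties in favor of the activity that
# comes earliest in the implementation sequence.
from collections import Counter

# The six major implementation activities, in sequence order.
# (Labels here are placeholders, not the survey's actual wording.)
ACTIVITY_SEQUENCE = ["A", "B", "C", "D", "E", "F"]

def most_challenging_activity(survey_picks: list[str]) -> str:
    """Return the modal 'most challenging' activity across TAC surveys;
    ties go to the activity earliest in ACTIVITY_SEQUENCE."""
    counts = Counter(survey_picks)
    top = max(counts.values())
    tied = [a for a in ACTIVITY_SEQUENCE if counts.get(a, 0) == top]
    return tied[0]  # earliest-in-sequence tie-break

# Example: seven TAC members' survey responses for one system.
print(most_challenging_activity(["B", "C", "B", "C", "A", "F", "C"]))  # -> "C"
```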
The Best Practices Report, written for an audience of policymakers and practitioners, will be published in 2025. The report, no more than 15 pages plus technical appendices, will be available on the IES website. The report will follow the January 2020 IES Style and Report guidance and meet all Section 508 compliance requirements.
A17. Approval not to Display the Expiration Date for OMB Approval
The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The surveys and notification letters will display the OMB approval expiration date.
A18. Exceptions to the Certification Statement
This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).
1 As of April 2024, the Department has approved five states for the IADA program: Louisiana and New Hampshire in 2018, North Carolina and Georgia in 2019, and Massachusetts in 2020. For Georgia, the Department approved two assessment systems to test under IADA: the Georgia MAP Partnership (GMAP) through-year assessment system and the Navvy assessment system.
2 The package (Docket No.: ED–2020–SCC–0144) has already completed the 60-day public comment period, which ended on November 2, 2020.
3 For more information on challenges through 2020-21, see: Troppe, P., Osowski, M., Wolfson, M., Ristow, L., Lomax, E., Thacker, A., & Schultz, S. (2023). Evaluating the Federal Innovative Assessment Demonstration Authority: Early Implementation and Progress of State Efforts to Develop New Statewide Academic Assessments (NCEE 2023-004). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. http://ies.ed.gov/ncee
4 New Hampshire withdrew from IADA in 2022, and Georgia withdrew its two assessments in 2023.
5 ESSA initially allowed up to seven states (including those participating in a consortium) to establish IADA systems. The Department lifted that cap in November 2023 (see: 23-0431-DCL-IADA-os-approved-11.17.2023.pdf).
6 The Progress Report focused on the early implementation of the IADA systems through the 2020–21 school year and included only the initial four IADA states (Louisiana, New Hampshire, Georgia, and North Carolina). The Best Practices Report will examine implementation through 2022-23 (and will therefore include Massachusetts).
7 The research questions addressed by the Progress Report were: What were the key objectives and features of the IADA systems? How “ready” were the IADA systems at the start of the demonstration to meet early program expectations? How far along were the IADA systems at the end of the 2020–21 school year toward implementing key program activities? What challenges to developing and implementing assessments through 2020-21 did IADA systems report?
8 For Georgia, the evaluation team will interview the former state IADA director as well as the former IADA assessment leads for the Georgia Navvy assessment and the Georgia GMAP through-year assessment.
9 The evaluation team will request an interview with the IADA system director, but, based on prior experience, the system director may then include other IADA system staff in the interview. In this case, the interviewer will request that the group reach consensus, allowing the evaluation team to use the system as the unit of analysis.
10 If the respondent identified no challenges as major, the interview will ask about the most difficult minor challenge.
11 For each system, the evaluation team will examine the survey responses across all TAC members to determine the most common, most challenging major activity. If there is a tie, the interview will focus on the activity that is earliest in the sequence of major activities. Later activities depend on earlier ones, so major challenges early in the sequence can have effects later in the implementation sequence. As a result, the study is most interested in learning about the earlier challenges and how they were addressed.
12 The evaluation team will ask the TAC about each top challenge for the activity as long as at least one TAC member identified that top challenge as a major challenge. If none of the TAC members identified any top challenge as a major challenge, the interview will focus on the most difficult minor challenge.
13 The evaluation team will request an interview with a representative from the vendor who is most knowledgeable about the activities for which the vendor has primary responsibility. The vendor may decide to include more than one person in the interview. In cases where there are multiple individuals in an interview, the interviewer will request that the group reach consensus.
14 The evaluation team will request an interview with the CGSA project director, but the project director may decide to include other staff in the interview. In cases where there are multiple individuals in an interview, the interviewer will request that the group reach consensus.
15 The evaluation team reviewed the application materials for the other four IADA states as part of the Progress Report.
16 For example, some IADA and CGSA states presented at the 2023 Council of Chief State School Officers National Conference on Student Assessments (New Orleans, LA). Current IADA states also presented at the OESE, School Support and Accountability, 2023 State Assessment Conference (Arlington, VA).