CLSD Grantee Interview

Comprehensive Literacy Program Evaluation: Comprehensive Literacy State Development (CLSD) Program Evaluation

OMB: 1850-0945



Contract Number 91990018C0020

Comprehensive Literacy Program Evaluation: Comprehensive Literacy State Development Grant Program

Grantee Interview Appendix

December 11, 2020

Submitted to:

U.S. Department of Education

Institute of Education Sciences

550 12th Street, SW

Washington, DC 20202

Project Officer: Tracy Rimdzius

Contract Number: 91990018C0020

Submitted by:

American Institutes for Research

1000 Thomas Jefferson Street, NW

Washington, DC 20007

Phone: (202) 403-5000

Facsimile: (202) 403-5001

Project Director: Jessica Heppen




Copyright © 2020 American Institutes for Research®. All rights reserved. No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, including photocopying, recording, website display, or other electronic or mechanical methods, without the prior written permission of the American Institutes for Research. For permission requests, please use the Contact Us form on www.air.org.


Introduction

Thank you for your willingness to participate in this interview about the Comprehensive Literacy State Development (CLSD) grant program.

As you know, this interview is being conducted as part of an evaluation of the FY2019 CLSD program commissioned by the U.S. Department of Education. The purpose of this interview is to learn more about your approach to CLSD, to making CLSD subgrants, to using and supporting the use of evidence, and to the two competitive preference priorities included in the U.S. Department of Education’s notice inviting applications: providing families with evidence-based strategies for promoting literacy, and empowering families and individuals to choose a high-quality education that meets their unique needs. We anticipate the conversation will take about 60 minutes.

I want to assure you that all research staff on this study adhere to the confidentiality and data protection requirements of the U.S. Department of Education (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). The data we collect will be used for research purposes only. I want to let you know that the information you share with us today may be included in one or more reports we produce as part of our evaluation; however, we will not identify you by name in any reports.

Also, you should know that your participation is voluntary, and you do not have to respond to any questions you do not want to. Please let us know at any time if you would prefer not to participate.

I’d like to ask you to sign a consent form before we begin. It outlines some of the issues I’ve just mentioned with regard to privacy. Please take a minute to read it and let me know if you have any questions.

If you don’t mind, I would like to record this interview simply for note-taking purposes. No one outside of our research team will hear the recording; it is only for my own reference, to improve the accuracy of my notes. If you would like me to turn off the recorder at any point, just let me know. Would that be OK?

If yes: Thank you.
If no: We will take notes but will not record today’s conversation.

Do you have any questions before we begin?


Section I. Approach to CLSD

  1. I’d like to start by asking you some basic questions about your approach to CLSD. We know that states are focused on supporting literacy instruction through a variety of activities and resources. These may include curricula or supplemental programs (e.g., core reading programs, writing programs, and programs focused on phonics, vocabulary, or comprehension). In addition, or alternatively, these could include instructional practices (e.g., specific literacy strategies that can be implemented in a classroom) supported through coaching or professional development, or embedded in a curriculum or program. They may also include other resources, such as teacher manuals and guidebooks on reading development, classroom libraries, and community and family programs to support children’s literacy.

  2. Such activities and resources can focus on general education students, students needing additional support (e.g., those below grade level, English learners, or students with disabilities), or both.

From reviewing your application and information you already provided, you appear to have focused your approach to CLSD on the following activities and resources: [insert list from review of state application and other policy documents as applicable.]

    a. Is that correct? Are these the key levers that you sought to support through CLSD? Is it accurate to say that your CLSD grant is primarily focused on curricula, practices, or both?

    b. Why did you choose to focus on these particular activities and resources?

    c. Do you have a specific list of curricula, practices, or both that you encouraged or required subgrantees to implement? Why?

    d. Is your CLSD grant primarily focused on providing activities and resources for general education students, students needing additional support, or both? Why?



  3. Did the coronavirus pandemic affect your plan for CLSD? [Probe if yes]

    a. If not covered in response: Did the coronavirus pandemic affect your plans for CLSD subgrants, including the timing of your subgrant competition?

    b. If not covered in response: Did the coronavirus pandemic affect your subgrantees’ implementation by changing the activities and resources they planned to provide (e.g., in-person PD, virtual reading subscriptions) or delaying implementation?

  4. Did the receipt of CARES Act funds affect your plan for CLSD? [Probe if yes]

    a. If not covered in response: For example, did you encourage or approve modifications for subgrantees to allocate funds for particular types of activities (e.g., virtual programs or curricula) or resources (e.g., technology)? Did you encourage or approve modifications for subgrantees to allocate funds for particular types of students (e.g., English learners, students reading below grade level)?

Section II. Approach to Subgrant Awards

Now, I’d like to ask you some basic questions about your approach to awarding subgrants. We know from information you’ve already provided that you awarded X subgrants last year.

  5. How many years were these subgrant awards for?

  6. How many subgrant applications did you receive?

    a. So it seems that you funded about xx% of the applications that you received – does that sound about right?

    b. [If state awarded 100% of applications] Why did you fund all the applications that you received? Were they all high quality? Did you receive fewer than you anticipated?

  7. Did you award any additional subgrants after the first round of awards you made – that is, did you award any additional subgrants this year? [If yes, ask questions 8-10; if no, skip to Q11.]

[If applicable] We have a few questions about your new subgrants.

  8. In 2020-21 (the second year you awarded subgrants), how many subgrants did you award?

  9. Are these subgrants a similar size – that is, a similar funding amount per subgrant – as those you gave out in the first year (2019-20)?

  10. How many years are these subgrant awards for?

Now we have a few more questions about the subgrantees that received awards in 2019-20 and your process for reviewing applications and selecting those subgrantees.

  11. Could you please describe your state’s process for reviewing the CLSD subgrant applications?

    a. If not covered in response, or previously determined from review of state documents: I’m particularly interested in whether your state had a rubric or evaluation form that you used to evaluate applicants, how reviewers were trained on this form, and what reviewers were asked to provide in terms of scores or recommendations. Probe: Did your state use a rubric or written rules and procedures? What were peer reviewers asked to provide in terms of scores, comments, or recommendations? How did the state use peer reviewers’ comments to make decisions on awards?

      i. What types of people reviewed and scored applications?

Now I’m going to ask about specific features of the subgrant applications that you might have considered in deciding which applicants to fund.

  12. Are there specific features of your state literacy plan that you wanted subgrantees to focus on and align their plans to? Was this alignment one of the factors you considered in making awards? [If yes, continue with a & b; if no, skip to Q13.]

    a. Did you give applications a quantitative score on this factor?

    b. How did you use this factor to make award decisions? How, for example, did you measure subgrantees’ alignment with specific features of your state literacy plan?

  13. Was the proportion or number of low-income and high-need students one of the factors you considered in making awards? [If yes, continue with a & b; if no, skip to Q14.]

    a. Did your state set a minimum number of low-income and high-need students for a district or school to be eligible for CLSD? [If yes, ask i; if no, skip to b.]

      i. If yes: What measure(s) of student “need” did you use in determining eligibility? These could be:

        1. Students below grade level in reading

        2. Students receiving free or reduced-price lunch

        3. Students with disabilities

        4. English learners

        5. Students living in foster care

        6. Students living in rural communities

        7. All of these or some other measure(s).

      ii. [For each of the measures that the grantee says had a minimum:] What was the minimum percentage or number of students who were [type of student disadvantage] that subgrantees needed to serve to be eligible for CLSD funds?

    b. [If no minimum proportion in Q13a:] Did your state give more points or more weight to subgrant applications that proposed to serve more low-income and high-need students? [If yes, ask i; if no, skip to Q14.]

      i. If yes: What measure(s) of student “need” did you use in assigning points or weights? These could be:

        1. Students below grade level in reading

        2. Students receiving free or reduced-price lunch

        3. Students with disabilities

        4. English learners

        5. Students living in foster care

        6. Students living in rural communities

        7. All of these or some other measure(s).

      ii. [For each of the measures that the grantee used to assign points or weights:] How did you assign points or weights for [type of student “need”]?

  14. Was the coordination of programming from early childhood centers into elementary schools one of the factors you considered in making awards? [If yes, continue with a & b; if no, skip to Q15.]

    a. Did you give applications a quantitative score on this factor?

    b. How did you use this factor to make award decisions?

  15. Were subgrantees’ plans for professional development for teachers and educators one of the factors you considered in making awards? [If yes, continue with a & b; if no, skip to the next question.]

    a. Did you give applications a quantitative score on this factor?

    b. How did you use this factor to make award decisions? Probe: Did you give points for higher quality professional development plans? If so, how did you determine the quality of the PD plans?

  16. Were subgrantees’ plans for evaluating the success of the resources and activities funded by the subgrant one of the factors you considered in making awards? [If yes, continue with a & b; if no, skip to the next question.]

    a. Did you give applications a quantitative score on this factor?

    b. How did you use this factor to make award decisions? Probe: Did you give points for higher quality evaluation plans?



Section III. Approach to Using and Supporting the Use of Evidence-Based Resources and Activities

Our next set of questions is intended to obtain more details on how grantees approached using and supporting the use of evidence-based resources and activities. The CLSD definition of “evidence-based” includes three levels: strong evidence, from at least one well-designed and well-implemented experimental study; moderate evidence, from at least one well-designed and well-implemented quasi-experimental study; or promising evidence, from at least one well-designed and well-implemented correlational study with statistical controls for selection bias.

  17. [If not already addressed in previous questions] When making subgrant awards or approving literacy resources or activities that subgrantees proposed, did your state focus on curricula, programs, or practices with strong, moderate, or promising evidence as defined by the What Works Clearinghouse (WWC), or did you include other ESSA tiers or information from other evidence systems?

    a. Which WWC resources did you use to identify evidence-based curricula, programs, and practices? Probe re: intervention reports, practice guides, single study reviews, or the WWC database.

    b. Did your state use other clearinghouses to identify programs and practices that would have met the WWC criteria for strong or moderate evidence? (Or, if clear that state did not focus on strong or moderate evidence: Did your state use other clearinghouses that have different definitions of evidence-based programs than the WWC?) If so, which clearinghouses did your state use?

  18. Thinking about your state’s approach to using and supporting the use of evidence, did your state consider the strength of evidence for all proposed literacy resources and activities or only for particular curricula, programs, or practices?

  19. States took different approaches to helping subgrantees select what to fund through their CLSD grant.

    a. [For states that required specific programs/practices]: Our understanding is that your state gave applicants a specific list of programs/practices from which applicants were required/encouraged to choose and that your state developed this list by [insert state-specific process from review of policy documents].

      i. [If not already addressed in previous questions] Was the level of evidence for these programs or practices a criterion for developing this list? Why or why not?

    b. [For states that did NOT require specific programs/practices]: Our understanding is that your state did not give applicants a specific list of programs/practices from which applicants were encouraged to choose. Is that right?

  20. When your state was considering applicants’ or subgrantees’ proposed programs, did your agency or someone else attempt to confirm the level of evidence? Or did you rely on subgrantees’ assessment of whether the level of evidence was promising, moderate, or strong?

    a. If attempted to confirm independently: Who conducted the reviews? What training or experience did they have in reviewing levels of evidence? What resources did they use to conduct the reviews?

  21. How has your state provided technical assistance to subgrantees to support their use of evidence? In your state, would you say that the main focus on selecting programs and practices with promising, moderate, or strong evidence came at the application stage or post-award? Why?

  22. Challenges: In general, what would you say were the biggest challenges your state faced in using or supporting the use of evidence-based literacy programs/practices?

  23. What suggestions do you have for supporting states or districts in using evidence in future literacy grant competitions?

Section IV. Approach to Competitive Preference Priorities

Our next set of questions is intended to obtain more details on how grantees approached CLSD competitive preference priorities related to evidence-based family literacy and personalized pathways to education.

  24. How did your state approach the competitive priority to provide families with evidence-based strategies for promoting literacy?

  25. How did your state approach the competitive priority to empower families and individuals to choose a high-quality education that meets their unique needs?

Section V. Successes and Challenges

Finally, I’d like to ask about examples of successes among your subgrantees and challenges that you and/or your subgrantees have encountered as part of [State’s] CLSD grant.

  26. [For prior SRCL grantees only] Are there any lessons that you learned from SRCL that you applied to CLSD? These might be strategies that worked well or did not work well, or things you wish you had done differently that you were able to change for the CLSD grant.

  27. What challenges, if any, has your state experienced as part of overseeing and monitoring subgrantees as they implement their literacy plans for CLSD? To what extent have these challenges been related to the coronavirus pandemic?

    a. [If not already addressed in previous questions] Did any subgrantees delay implementation of CLSD? If so, was it due to the coronavirus pandemic?

    b. [If not already addressed in previous questions] Did any of the subgrantees drop out of CLSD? If so, why?

  28. Based on your knowledge of subgrantees’ activities, in what areas do you think subgrantees have been most successful in putting their application plans into practice? [If grantee needs examples, could include: hiring the right literacy support staff through grant funding, serving the youngest children (i.e., birth to age 3) who were not previously served, implementing newly selected curricula/programs/practices]

  29. What aspects of the CLSD program have been hardest for subgrantees to implement fully or to understand? Why? [If grantee needs examples, could use the same ones as above]

  30. Have any subgrantees asked to change how they are spending CLSD funding, or any other aspects of their CLSD grant, from what they proposed in their application to you?

If yes:

    a. In what ways have they proposed to change their plan?

    b. Why have they asked to make these changes?

    c. Are these changes related to the coronavirus pandemic?

Section VI. Wrap-Up

Thank you. Those are all the questions we have for you. This was an informative discussion and will be very helpful in answering some of the evaluation’s questions about states’ approaches to CLSD, to making subgrants, to using and supporting the use of evidence, and to the competitive priorities.

It was a pleasure talking with you.


Established in 1946, with headquarters in Washington, D.C., the American Institutes for Research® (AIR®) is a nonpartisan, not-for-profit organization that conducts behavioral and social science research and delivers technical assistance, both domestically and internationally, in the areas of education, health, and the workforce. For more information, visit www.air.org.

MAKING RESEARCH RELEVANT

AMERICAN INSTITUTES FOR RESEARCH

1000 Thomas Jefferson Street NW

Washington, DC 20007-3835 | 202.403.5000

www.air.org





