OMB: 3137-0090

GUIDELINES FOR IMLS GRANTS TO STATES FIVE-YEAR EVALUATION

Purpose of the Evaluation

Section 9134(c) of IMLS’s authorizing legislation directs State Library Administrative Agencies (SLAAs) to “independently evaluate, and report to the Director regarding, the activities assisted under this subchapter, prior to the end of the 5-year plan.” This evaluation provides SLAAs an opportunity to measure progress toward the goals set in their approved Five-Year Plans, using a framework that synthesizes information across all state reports to tell a national story.

This guidance identifies a core set of questions for the SLAAs to use in conducting the 2023-2027 Five-Year Evaluations that:

  • Highlight effective past practices;

  • Assess the efficacy of the activities implemented to advance state goals; and

  • Develop key findings and recommendations from evaluating the past five years for use in devising the next Five-Year Plan.

There are three sets of questions for each SLAA. The guidance also offers possible methodological choices to help each SLAA work effectively with an independent evaluator.

Format and Questions

IMLS analyzes and makes public all SLAA Five-Year Evaluations. To do this effectively, certain information needs to be included in all evaluation reports. This is particularly important to enable IMLS to tell federal policymakers and practitioners about what has happened at a national level. The specified format is intended to make it easier for any party to review findings across multiple reports.

Documents required for the Five-Year Evaluation include a cover page (1 page), evaluation summary (2-5 pages), evaluation report (25 pages, max.), and appendices. Please follow the format specified below:

Cover Page (1 page)

  • State Library Administrative Agency

  • Title of the evaluation

  • Evaluator(s) name and organizational affiliation

  • Date

  • Name of the team, branch, unit, or person commissioning the evaluation

Evaluation Summary (2-5 pages)

  • Summarize key findings for the three retrospective and five process questions described in sections A and B.



  • Briefly describe the evaluation methodology, referencing the three methodology questions described in section C.

Evaluation Report (25 pages, max., excluding appendices)

  • Answer the eight questions under sections “A. Retrospective” and “B. Process” in order, numbered as they appear below.

  • Describe the methodology employed, responding to the three questions under “C. Methodology,” below.

A. Retrospective Questions

A-1. To what extent did your Five-Year Plan activities make progress towards each goal? Describe the factors that contributed to the outcome (e.g., staffing, budget, partners).

Organize findings around each goal of the state’s 2023-2027 Five-Year Plan.

  • Categorize each goal as either 1) achieved, 2) partly achieved, or 3) not achieved.

A-2. To what extent did your Five-Year Plan activities achieve results that address national priorities associated with the Measuring Success focal areas and their corresponding intents?

The focal areas are:

  • Lifelong Learning

  • Information Access

  • Institutional Capacity

  • Economic & Employment Development

  • Human Services

  • Civic Engagement

See Appendix 1 for a list of intents for each focal area.

A-3. Did any of the following groups represent a substantial focus for your Five-Year Plan activities? (Yes/No)

  • Library workforce (current and future)

  • Individuals living below the poverty line

  • Individuals who are unemployed/underemployed

  • Individuals from racial or ethnic minority populations

  • Immigrants/refugees

  • Individuals with disabilities

  • Individuals with limited functional literacy or information skills

  • Families

  • Children (aged 0-5)

  • School-aged youth (aged 6-17)

For this question, a substantial focus would represent at least ten percent of the total amount of resources committed by the overall plan across multiple years.

If you answered Yes for any of the above groups, please describe how each group was reached.

If there are important groups that did not meet the ten percent threshold or do not appear in the list above, please consider describing these as well.
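
As a rough illustration of how the ten percent threshold described above might be checked against project-level spending records, the sketch below totals committed funds by target group and flags groups at or above ten percent of overall plan resources. All project names, dollar amounts, and data structures are hypothetical; an actual calculation would use the SLAA’s own administrative records.

```python
# Hypothetical sketch of the ten percent "substantial focus" check.
# Project names, dollar amounts, and group labels are invented examples.
from collections import defaultdict

# (project name, committed funds, target groups served)
projects = [
    ("Summer reading outreach", 120_000, {"Children (aged 0-5)", "Families"}),
    ("Workforce skills labs", 250_000, {"Individuals who are unemployed/underemployed"}),
    ("Statewide e-resources", 900_000, set()),  # general audience, no targeted group
    ("New Americans services", 80_000, {"Immigrants/refugees"}),
]

total_funds = sum(amount for _, amount, _ in projects)
group_funds = defaultdict(float)
for _, amount, groups in projects:
    for group in groups:
        group_funds[group] += amount  # a project may count toward more than one group

THRESHOLD = 0.10  # ten percent of total plan resources
for group, amount in sorted(group_funds.items()):
    share = amount / total_funds
    status = "substantial focus" if share >= THRESHOLD else "below threshold"
    print(f"{group}: {share:.1%} of plan resources ({status})")
```

How funds are allocated when a single project serves several target groups (counted in full for each group, as above, or split among them) is a judgment for the SLAA and its evaluator to make and document.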

B. Process Questions

B-1. How has the SLAA used any data from the State Program Report (SPR) and elsewhere (e.g., Public Libraries Survey) to guide activities included in the Five-Year Plan?

B-2. Specify any modifications the SLAA made to the 2023-2027 Five-Year Plan, particularly changes to goals. What were the reasons for these changes?

B-3. How and with whom has the SLAA shared data from the SPR and other evaluation resources?

B-4. How has the SLAA used the 2018-2022 Five-Year Evaluation to inform data collected for the 2023-2027 Five-Year Evaluation? How has the SLAA used this evaluation throughout this five-year cycle?

B-5. Discuss how the SLAA will share the key findings and recommendations from this evaluation with others.

C. Methodology Questions

C-1. Describe how the Five-Year Evaluation was implemented independently, using the criteria described in the Selection of an Independent Evaluator section of this guidance document.

C-2. Describe the types of statistical and qualitative methods (including administrative records) used in conducting the Five-Year Evaluation. Describe their validity and reliability.

C-3. Describe the stakeholders involved in the various stages of developing the Five-Year Evaluation and how they were engaged.

Appendices

    • List of acronyms

    • List of people interviewed

    • Bibliography of all documents reviewed

    • Copies of any research instruments used for surveys, interviews, and/or focus groups

    • Optional output of statistical findings

    • Optional summaries of coding used in any qualitative analyses





Evaluation Strategies

Retrospective and Process Questions (A-1 through A-3, and B-1 through B-5, above)

    • Make use of administrative data on program performance. This can include data reported to IMLS in the SPR or other programmatic data collected by the SLAA.

    • The administrative data will likely need to be supplemented with information collected from interviews, surveys, and/or focus groups.

    • Data also may be available from secondary documents, including contracted third-party program evaluations, studies from non-partisan entities, SLAA reports submitted to IMLS, or other reports to state policymakers.

    • Other sources of information, such as the Census Bureau’s American Community Survey, state education data (e.g., the Common Core of Data), and surveys conducted by the SLAA, may be used to describe broad changes in communities or in the state. While these generally cannot be used to attribute outcomes directly to LSTA programming efforts, they can effectively describe the context of the activities undertaken.

    • Descriptive statistics should suffice for any quantitative analysis. A blend of summary tables and/or figures reporting results in the narrative is customary in this type of research. Presentation of extensive statistical output is generally reserved for appendices. (A brief illustrative sketch follows this list.)

    • A content analysis (with potential descriptive statistics for summarizing codes) can be an acceptable method for conducting qualitative analysis. Various sampling and coding strategies will precede selecting a content analysis or other analytical approach; the independent evaluator should make these choices transparent so that you and other readers can assess the credibility of the evidence. (See below for more details on evaluation methodology and using an independent evaluator.)
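
To make the two bullets above more concrete, the sketch below computes simple descriptive statistics for hypothetical SPR-style attendance figures and tallies the frequency of qualitative codes applied during a content analysis. All field names, values, and codes are invented for illustration and are not drawn from any actual SPR data.

```python
# Illustrative sketch only: descriptive statistics and a code-frequency summary.
# Attendance figures and qualitative codes below are hypothetical.
import statistics
from collections import Counter

# Hypothetical project-level attendance figures from administrative records
attendance = [42, 310, 58, 1250, 97, 410, 63, 220]

print("Projects:", len(attendance))
print("Total attendance:", sum(attendance))
print("Mean attendance:", round(statistics.mean(attendance), 1))
print("Median attendance:", statistics.median(attendance))

# Hypothetical codes applied to open-ended survey or interview responses
coded_responses = [
    ["staff_training", "partnerships"],
    ["broadband", "staff_training"],
    ["partnerships"],
    ["broadband", "outreach", "staff_training"],
]

code_counts = Counter(code for response in coded_responses for code in response)
for code, count in code_counts.most_common():
    print(f"{code}: applied to {count} of {len(coded_responses)} responses")
```

In practice, such summaries would be drawn from the SLAA’s own administrative records and coded interview or survey data, with extensive output placed in the appendices as noted above.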

Methodology Questions (C-1 through C-3, above)

    • The independent evaluator should address these methodology questions to your satisfaction before proceeding to collect and analyze data.

    • The independent evaluator will need to carefully document the collection and use of project records used in the study. Professional guidelines for this type of research require that protocols be in place to ensure confidentiality and consent.

    • In working with the independent evaluator, the SLAA should ensure that it has sufficient knowledge of the evaluation methods used for collecting and analyzing data, including the tradeoffs the evaluator is making given limited resources and time.

    • The independent evaluator should include a section that summarizes the methods used in any statistical and qualitative research. For qualitative research, many types of sampling and coding strategies may be appropriate; whichever are selected should be made transparent in this section.

    • The appendices should contain copies of any instruments used for data collection as well as those used in coding.

Selection of an Independent Evaluator

An independent evaluation is rigorous and objective (carried out free from outside influence). If a State decides to carry out the evaluation within the government, the evaluator should be able to demonstrate that it does not have a role in carrying out LSTA-funded activities and is independent of those who are being evaluated or who might be favorably or adversely affected by the evaluation results. If the evaluation is carried out within the State entity that includes the SLAA, the line of hierarchy within the entity should demonstrate that the evaluator is not accountable to those responsible for oversight of the LSTA program within the State. Regardless of whether the evaluation is done in-house or through a third party, the evaluator must be able to demonstrate professional competency to conduct the evaluation objectively, including requisite expertise in statistical, qualitative, and general social science research methods.

Submitting Your Report

Please send an electronic version of your Five-Year Evaluation by March 31, 2027 to [email protected].



Appendix 1: Measuring Success Focal Areas and Intents

  • Lifelong Learning

    • Improve users’ formal education

    • Improve users’ general knowledge and skills

  • Information Access

    • Improve users’ ability to discover information resources

    • Improve users’ ability to obtain and/or use information resources

  • Institutional Capacity

    • Improve the library workforce

    • Improve the library’s physical and technology infrastructure

    • Improve library operations

  • Economic & Employment Development

    • Improve users’ ability to use resources and apply information for employment support

    • Improve users’ ability to use and apply business resources

  • Human Services

    • Improve users’ ability to apply information that furthers their personal, family, or household finances

    • Improve users’ ability to apply information that furthers their personal or family health & wellness

    • Improve users’ ability to apply information that furthers their parenting and family skills

  • Civic Engagement

    • Improve users’ ability to participate in their community

    • Improve users’ ability to participate in community conversations around topics of concern


