GUIDELINES FOR IMLS GRANTS TO STATES FIVE-YEAR EVALUATION
Section 9134(c) of IMLS’ authorizing legislation directs State Library Administrative Agencies (SLAAs) to “independently evaluate, and report to the Director regarding, the activities assisted under this subchapter, prior to the end of the 5-year plan.” This evaluation gives SLAAs an opportunity to measure progress toward the goals set in their approved Five-Year Plans, and it gives IMLS a framework for synthesizing information across all state reports to tell a national story.
This guidance identifies a core set of questions for the SLAAs to use in conducting the 2018-2022 Five-Year Evaluations that:
Highlight effective past practices;
Assess the efficacy of the activities implemented to advance state goals; and
Develop key findings and recommendations from evaluating the past five years for use in organizing the next Five-Year Plan.
There are three sets of questions for each SLAA. The guidance contains possible methodological choices to help each SLAA best work with an independent evaluator.
IMLS analyzes and makes public all SLAA Five-Year Evaluations. To do this effectively, certain information needs to be included in all evaluation reports. This is particularly important in enabling IMLS to tell federal policy makers and practitioners what has happened at a national level. The specified format is intended to ease the burden on any party reviewing multiple reports.
Documents required for the Five-Year Evaluation include a cover page (1 page), an evaluation summary (2-5 pages), an evaluation report (25 pages, max.), and appendices. Please follow the format specified below:
Cover Page (1 page):
State Library Administrative Agency
Title of the evaluation
Evaluator(s) name and organizational affiliation
Date
Name of the team, branch, unit, or person commissioning the evaluation
Evaluation Summary (2-5 pages):
Summarize key findings for the three retrospective and three process questions below
Briefly describe the evaluation methodology, referencing the four methodology questions below
Evaluation Report (25 pages, max.):
Answer the first six questions under A. Retrospective and B. Process in order, numbered as they are below.
Describe the methodology employed, responding to the four questions under C. Methodology, below.
A. Retrospective Questions
A-1. To what extent did your Five-Year Plan activities make progress toward each goal? Where progress was not achieved as anticipated, discuss the factors (e.g., staffing, budget, over-ambitious goals, partners) that contributed.
Organize findings around each goal of the state’s 2018-2022 Five-Year Plan
Categorize each goal as either 1) achieved, 2) partly achieved, or 3) not achieved
A-2. To what extent did your Five-Year Plan activities achieve results that address national priorities associated with the Measuring Success focal areas and their corresponding intents?
See Appendix 1 for a list of focal areas and their intents
A-3. Did any of the following groups represent a substantial focus for your Five-Year Plan activities? (Yes/No)
Library workforce (current and future)
Individuals living below the poverty line
Individuals who are unemployed/underemployed
Ethnic or minority populations
Immigrants/refugees
Individuals with disabilities
Individuals with limited functional literacy or information skills
Families
Children (aged 0-5)
School-aged youth (aged 6-17)
For the purposes of this question, a substantial focus would represent at least ten percent of the total amount of resources committed by the overall plan across multiple years.
If you answer Yes for any of the above groups, please discuss the extent to which each group was reached.
If there are important groups that did not meet the ten percent threshold or do not appear in the list above, please consider discussing these as well.
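As a purely illustrative sketch (the dollar figures and variable names below are invented for the example, not drawn from this guidance), the ten-percent "substantial focus" threshold amounts to a simple share calculation:

```python
# Illustrative sketch only: applying the ten-percent "substantial focus"
# threshold. All dollar figures here are hypothetical.
total_plan_resources = 2_500_000   # hypothetical five-year plan total ($)
resources_for_group = 310_000      # hypothetical resources committed to one group

share = resources_for_group / total_plan_resources
substantial_focus = share >= 0.10  # at least ten percent of the plan total
print(f"Share of plan resources: {share:.1%}; substantial focus: {substantial_focus}")
```

In this hypothetical, the group received 12.4% of total plan resources and would therefore count as a substantial focus.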
B. Process Questions
B-1. How have you used any data from the State Program Report (SPR) and elsewhere (e.g., Public Libraries Survey) to guide activities included in the Five-Year Plan?
B-2. Specify any modifications you made to the Five-Year Plan. What was the reason for each change?
B-3. How and with whom have you shared data from the SPR and from other evaluation resources? How have you used the last Five-Year Evaluation to inform data collected for the new Five-Year Evaluation? How have you used this information throughout this five-year cycle?
C. Methodology Questions
C-1. Identify how you implemented an independent Five-Year Evaluation using the criteria described in the section of this guidance document called Selection of an Independent Evaluator.
C-2. Describe the types of statistical and qualitative methods (including administrative records) used in conducting the Five-Year Evaluation. Assess their validity and reliability.
C-3. Describe the stakeholders involved in the various stages of the Five-Year Evaluation. How did you engage them?
C-4. Discuss how you will share the key findings and recommendations with others.
Appendices:
List of acronyms
List of people interviewed
Bibliography of all documents reviewed
Copies of any research instruments used for surveying, interviewing, and/or use of focus groups
Optional output of statistical findings
Optional summaries of coding used in any qualitative analyses
Evaluation Methodology
Make use of administrative data on program performance. This information can be data that is reported to IMLS in the SPR or other programmatic data collected by the SLAA.
The administrative data will likely need to be supplemented with information collected from interviews, surveys, and/or focus groups.
Data also may be available from secondary documents, including contracted third-party program evaluations, studies from non-partisan entities, SLAA reports submitted to IMLS, or other reports to state policy makers.
Other sources of information, such as the Census Bureau’s American Community Survey, state education data (e.g., Common Core data), and surveys conducted by the SLAA, may be used to describe broad changes in communities or in the state. While these, for the most part, cannot be used to attribute outcomes directly to LSTA programming efforts, they can effectively describe the context of activities undertaken.
Descriptive statistics should suffice for any quantitative analysis. A blend of summary tables and/or figures reporting results in the narrative is customary in this type of research. Extensive statistical output is generally reserved for the appendices.
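As an illustration only (the program name and attendance figures below are invented, not from any SPR), summarizing administrative data with descriptive statistics can be as simple as:

```python
# Illustrative sketch only: descriptive statistics for hypothetical
# program data. All attendance figures below are invented examples.
from statistics import mean, median

# Hypothetical annual workshop attendance for one LSTA-funded program
attendance = {2018: 410, 2019: 455, 2020: 190, 2021: 260, 2022: 380}

counts = list(attendance.values())
print(f"Total attendance, 2018-2022: {sum(counts)}")   # 1695
print(f"Mean per year:   {mean(counts):.1f}")          # 339.0
print(f"Median per year: {median(counts)}")            # 380
```

Figures like these belong in summary tables in the narrative; the full statistical output, if any, goes in an appendix.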
A content analysis (with descriptive statistics for summarizing codes, where useful) can be an acceptable method for qualitative analysis. Various sampling and coding strategies will precede the selection of a content analysis or other analytical approach; the independent evaluator should make these choices transparent so that you and other readers can assess the credibility of the evidence. (See below for more details on evaluation methodology and using an independent evaluator.)
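As a purely illustrative sketch (the codes and excerpts below are invented, not from any actual evaluation), summarizing qualitative codes with simple frequency counts might look like:

```python
# Illustrative sketch only: frequency counts for qualitative codes
# assigned during a content analysis. All codes here are invented.
from collections import Counter

# Hypothetical codes assigned to interview excerpts by an evaluator
coded_excerpts = [
    "digital literacy", "staff training", "digital literacy",
    "broadband access", "staff training", "digital literacy",
]

code_counts = Counter(coded_excerpts)
for code, n in code_counts.most_common():
    print(f"{code}: {n}")
```

A table of such counts, alongside the codebook itself (see the appendices requirement below), helps readers judge how themes were identified.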
The independent evaluator should clearly address these methodology questions to your satisfaction before proceeding to collect and analyze data.
The independent evaluator will need to carefully document the project records used in the study. Professional guidelines for this type of research require that protocols be in place to ensure confidentiality and consent.
When working with the independent evaluator, you and any other stakeholders reviewing the document should set aside appropriate time to ensure that you understand the scientific techniques the evaluator will use to collect and analyze data, including the tradeoffs being made given limited resources and time.
You should include a section that summarizes the methods used in any statistical and qualitative research. For qualitative research, many types of sampling and coding strategies may be appropriate; whichever are selected should be made transparent in this section.
The appendices should contain copies of any instruments used for data collection as well as those used in coding.
Selection of an Independent Evaluator
An independent evaluation is rigorous and objective (carried out free from outside influence). If a State decides to carry out the evaluation within the government, the evaluator should be able to demonstrate that it has no role in carrying out LSTA-funded activities and is independent of those who are being evaluated or who might be favorably or adversely affected by the evaluation results. If the evaluation is carried out within the State entity that includes the SLAA, the line of hierarchy within the entity should demonstrate that the evaluator is not accountable to those responsible for oversight of the LSTA program within the State. Regardless of whether the evaluation is done in-house or through a third party, the evaluator must be able to demonstrate professional competency to rigorously conduct the evaluation, including requisite expertise in statistical, qualitative, and general social science research methods.
Please send an electronic version of your Five-Year Evaluation by March 30, 2022 to: [email protected]
Appendix 1: Measuring Success Focal Areas and Intents
Lifelong Learning
Improve users’ formal education
Improve users’ general knowledge and skills
Information Access
Improve users’ ability to discover information resources
Improve users’ ability to obtain and/or use information resources
Institutional Capacity
Improve the library workforce
Improve the library’s physical and technological infrastructure
Improve library operations
Economic & Employment Development
Improve users’ ability to use resources and apply information for employment support
Improve users’ ability to use and apply business resources
Human Services
Improve users’ ability to apply information that furthers their personal, family, or household finances
Improve users’ ability to apply information that furthers their personal or family health & wellness
Improve users’ ability to apply information that furthers their parenting and family skills
Civic Engagement
Improve users’ ability to participate in their community
Improve users’ ability to participate in community conversations around topics of concern
OMB Control Number: 3137-0090, Expiration Date: 05/31/2019
IMLS-CLR-D-0019