G2S SPR with Site Visit Checklist Part B 20220804


Grants to States Program: State Reporting System

OMB: 3137-0071


B. Collections of Information Employing Statistical Methods


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole.


The previously approved performance measures in the State Program Report (SPR) for the Grants to States (G2S) Program apply to all projects supported by each of the 59 State Library Administrative Agencies (SLAAs) through their annual formula allotments. No sampling is proposed.


Using FY 2018 data (the most recent available for approved projects in the G2S Program), SLAA partners supported a total of 1,339 projects through the annual allotments. Of the associated 2,936 activities, as shown in Table 1, only 580 (20 percent) required SLAA partners to collect survey data from project beneficiaries (participants) and to report aggregated outcome data from those surveys in the SPR. Surveying is required only for activities for which attributing an outcome to the activity is defensible, based on the beneficiary and activity mode.


Table 1. Activities Reporting Outcomes Data, Based on FY 2018 Reports

Beneficiary | Activity | Mode | Activities Requiring Questionnaire
Public | Instruction | Program | 258
Library Workforce | Instruction | Program | 222
Library Workforce | Content | Acquisition or Creation | 83
Library Workforce | Planning and Evaluation | Retrospective or Prospective | 17
TOTAL (all activities where attribution is defensible; surveying of participants) | 580 of 2,936 total activities (20%)
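
For reference, the 20 percent figure is simply the ratio of questionnaire-requiring activities to all reported activities. The short Python sketch below reproduces that tabulation from the counts in Table 1; the script is illustrative only and is not part of the SPR system.

    # Activities requiring a participant questionnaire, by category (Table 1).
    questionnaire_activities = {
        "Public / Instruction / Program": 258,
        "Library Workforce / Instruction / Program": 222,
        "Library Workforce / Content / Acquisition or Creation": 83,
        "Library Workforce / Planning and Evaluation / Retrospective or Prospective": 17,
    }
    total_activities = 2936  # all FY 2018 activities across the 1,339 projects

    requiring = sum(questionnaire_activities.values())
    print(f"{requiring} of {total_activities} activities "
          f"({requiring / total_activities:.0%}) required a questionnaire")
    # -> 580 of 2936 activities (20%) required a questionnaire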


Although surveying is a project reporting requirement, SLAAs have reported to IMLS that obtaining a 100% response rate from project participants is challenging. Project surveying depends on local project directors building surveys into their project designs and collecting data during the period of performance. Although SLAAs convey the requirement, it may not be feasible to reach program participants later if project directors submit final reporting materials without survey data. Using FY 2018 data, as shown in Table 2, actual response rates varied by question and beneficiary type from a low of 66% to a high of 93%, with an overall rate of 79%. This result compares favorably with the 68% rate observed for a nearly identical voluntary survey questionnaire already implemented in public libraries through the Public Library Association’s “Project Outcome.”


Table 2. Response Rates for Activities Reporting Outcomes Data, Based on FY 2018 Reports

Beneficiary | Activity | Mode | Actual Response Rate (Activities Using Questionnaire)
Public | Instruction | Program | 240/258 activities (93%)
Library Workforce | Instruction | Program | 146/222 activities (66%)
Library Workforce | Content | Acquisition or Creation | 62/83 activities (75%)
Library Workforce | Planning and Evaluation | Retrospective or Prospective | 12/17 activities (71%)
TOTAL (all activities where attribution is defensible; surveying of participants) | 460 of 580 total activities (79%)
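
The 79% total is the aggregate ratio of responding activities to activities required to survey (460/580), not an unweighted mean of the four category rates. A minimal Python sketch using the counts from Table 2 (illustrative only):

    # (responded, required) activity counts by category (Table 2).
    response_counts = {
        "Public / Instruction / Program": (240, 258),
        "Library Workforce / Instruction / Program": (146, 222),
        "Library Workforce / Content / Acquisition or Creation": (62, 83),
        "Library Workforce / Planning and Evaluation / Retro. or Prosp.": (12, 17),
    }

    # Per-category response rates.
    for category, (responded, required) in response_counts.items():
        print(f"{category}: {responded}/{required} ({responded / required:.0%})")

    # Overall rate, aggregated across categories.
    responded_total = sum(r for r, _ in response_counts.values())
    required_total = sum(n for _, n in response_counts.values())
    print(f"Overall: {responded_total}/{required_total} "
          f"({responded_total / required_total:.0%})")
    # -> Overall: 460/580 (79%)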


The SPR system uses no statistical sampling or generalization; therefore, the following do not apply: statistical methodology for stratification and sample selection, estimation procedure, degree of accuracy needed for the purpose described in the justification, unusual problems requiring specialized sampling procedures, and any use of periodic data collection cycles to reduce burden (less frequently than annually).


2. Describe the procedures for the collection of information.


The procedures for SLAA reporting of project information in the SPR have not changed from those previously approved by OMB (3137-0071). SLAA grantees gather participant responses to Likert-scale questions and then enter the aggregated results into the SPR. The system’s auto-generation of tabulated percentages ensures reliability and reduces respondent reporting burden.
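
To illustrate the kind of aggregation involved, the Python sketch below tabulates percentages from aggregate counts for a single Likert-scale question. The question wording, scale labels, and counts are hypothetical, invented for illustration; the SPR system performs the equivalent tabulation automatically.

    # Hypothetical aggregate counts for one Likert-scale survey question,
    # e.g., "I learned something new that is useful to me."
    likert_counts = {
        "Strongly Disagree": 3,
        "Disagree": 5,
        "Neither Agree nor Disagree": 12,
        "Agree": 48,
        "Strongly Agree": 32,
    }

    total = sum(likert_counts.values())

    # Tabulated percentages of the kind the SPR auto-generates; grantees
    # enter aggregate counts, not individual participant responses.
    for label, count in likert_counts.items():
        print(f"{label}: {count}/{total} ({count / total:.0%})")

    agree = likert_counts["Agree"] + likert_counts["Strongly Agree"]
    print(f"Agree or Strongly Agree: {agree}/{total} ({agree / total:.0%})")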


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield reliable data that can be generalized to the universe studied.


This administrative collection is not intended for statistical generalization and involves no sampling. The information collected through overall project reporting, which has a 100% response rate, improves grantee accountability and fosters information sharing among library service practitioners and policy makers about the details of the SLAAs’ federal taxpayer-supported projects.


As noted above, projects with surveying of participants for outcome measures had an overall response rate of 79% based on the most recent available data. IMLS will continue to encourage higher response rates through annual report feedback to individual states, through educational approaches at the annual all-states conference, and through deeper monitoring site visit conversations every five years. SLAAs also share strategies for increasing survey response rates through regular meetings and listserv conversations.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents.


The SPR is organized around a logic framework centered on 13 national objectives, each with associated descriptive and outcome metrics for characterizing projects according to the purposes specified in IMLS’s federal statute (20 U.S.C. § 9121). IMLS and SLAA participants constructed the logic framework, and IMLS subsequently verified key elements through review of the social-scientific literature and feedback from peer evaluators.1 OMB originally approved the main elements of the SPR system, including the descriptive metrics, in 2012; all states and territories began reporting into the SPR in Winter 2015 for the FY 2014 G2S reporting period. OMB then approved IMLS’s request in 2017 to introduce surveying of project participants for those projects where attribution is defensible under the SPR program logic.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



Name | Contact Number | Title | Organization | Role in this Study
Matthew Birnbaum, Ph.D. | (202) 653-4760 | Director | IMLS, Office of Research and Evaluation | Former COR for development of the SPR; principal lead for performance measurement



1 Bryson, J. M. 2004. Strategic Planning for Public and Nonprofit Organizations: A Guide to Creating and Sustaining Organizations. 3rd ed. San Francisco, CA: Jossey-Bass. Farrior, M. 2005. Breakthrough Strategies for Engaging the Public: Emerging Trends in Communications and Social Science. Retrieved February 1, 2012, from http://www.biodiverse.org/docs/publicationsandtipsheets/breakthroughstrategiesforengagingthepublic.pdf. Dillman, D. A. 2007. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. Hoboken, NJ: John Wiley & Sons. Fowler, F. J., Jr. 2002. Survey Research Methods. 3rd ed. Thousand Oaks, CA: Sage Publications. Hatry, H., Morley, E., and Marshall, M. 2010. Performance Management Plan Information for the Institute of Museum and Library Services. Washington, DC: Urban Institute. Wholey, J. S., Hatry, H., and Newcomer, K. 2010. Handbook of Practical Program Evaluation. 3rd ed. San Francisco, CA: Jossey-Bass. Birnbaum, M., Okahara, K., and Warner, M. 2012. “Changes in Library Evaluation: Responding to External Pressures in the Institute of Museum and Library Services’ Measuring Success Initiative for the Grants to States Program.” Advances in Librarianship, Vol. 30, pp. 3-27.


In addition, IMLS staff initially consulted with peer evaluators in six federal agencies and at the Urban Institute. IMLS then completed a second round of peer review at Rutgers University’s annual conference on performance measurement and reporting (September 19, 2014) and a third round at the annual meetings of the American Evaluation Association in 2014 and 2015.
