SPR PRA Part B Final 2018-04-27


Grants to States Program “State Reporting System”

OMB: 3137-0071

B. Collections of Information Employing Statistical Methods


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole.


The proposed changes to the State Program Report (SPR) in the Grants to States (G2S) Program apply to all projects funded by each of the 56 State Library Administrative Agencies (SLAAs) through their annual formula allotments. No sampling is proposed; the changes apply to all eligible projects covered in IMLS’s largest grant program.


Using FY 2015 data (the most recent available for approved projects in the G2S Program), SLAA partners funded a total of 1,539 projects with their annual allotments. Of these, as shown in Table 1, only 56 percent (867 projects) would be required to collect survey data from project beneficiaries (participants) and then to report aggregated outcome data from these surveys in the SPR. These are projects for which attribution of an outcome is defensible based on the beneficiary group and activity mode.


Table 1. Projects Expected to Report Outcomes Data Based on FY 2015 Reports

Beneficiary       | Activity Mode                                                          | Estimated Number of Projects Using Questionnaire
Public            | Activity: Instruction; Mode: Program                                   | 423 projects
Library Workforce | Activity: Instruction; Mode: Program                                   | 215 projects
Library Workforce | Activity: Content; Mode: Acquisition or Creation                       | 135 projects
Library Workforce | Activity: Planning and Evaluation; Mode: Retrospective or Prospective | 94 projects
TOTALS            | All projects where attribution is defensible                           | 867 of 1,539 total projects (56%)


Since surveying is a project funding requirement, IMLS expects a nearly 100 percent project response rate. IMLS also expects item response rates in excess of 80 percent. This expectation is based on response rates greater than 90 percent on an annual voluntary statistical survey IMLS administers to public libraries and greater than 95 percent on a biennial voluntary statistical survey IMLS administers to state library administrative agencies, as well as the 68 percent rate observed for a nearly identical voluntary survey questionnaire already implemented in public libraries through the Public Library Association’s Project Outcome effort, administered in collaboration with the Chief Officers of State Library Agencies.


The SPR system uses no statistical sampling or generalization; therefore, the following do not apply: statistical methodology for stratification and sample selection; estimation procedures; the degree of accuracy needed for the purpose described in the justification; unusual problems requiring specialized sampling procedures; and any use of periodic data collection cycles to reduce burden (i.e., collecting less frequently than annually).



2. Describe the procedures for the collection of information.


The procedures for SLAA reporting of project information into the SPR have not changed from those previously approved by OMB (3137-0071). To supplement SLAA reporting of descriptive project metrics in the SPR, IMLS proposes to add two to five new data elements to measure beneficiary outcomes in projects where attribution is defensible (as summarized in Table 1 above). SLAA grantees will gather participant responses to Likert-scale questions and then enter the aggregated results into the SPR. System auto-generation of tabulated percentages will ensure reliability and reduce respondent reporting burden.
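
For illustration, the following minimal Python sketch shows the kind of tabulation this involves. It is a sketch only: the scale labels, field names, and function are hypothetical and do not reflect the SPR’s actual schema or code.

    from collections import Counter

    # Hypothetical five-point Likert scale; the SPR's actual wording may differ.
    SCALE = ["Strongly Disagree", "Disagree", "Neither Agree nor Disagree",
             "Agree", "Strongly Agree"]

    def tabulate(responses):
        """Aggregate raw Likert responses into counts and percentages."""
        counts = Counter(responses)
        total = sum(counts[label] for label in SCALE)
        return {label: {"count": counts[label],
                        "percent": round(100 * counts[label] / total, 1) if total else 0.0}
                for label in SCALE}

    # Example: one grantee's responses to a single survey question.
    print(tabulate(["Agree", "Agree", "Strongly Agree",
                    "Neither Agree nor Disagree", "Agree"]))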


To ensure that the approach used by the SPR in gathering data on beneficiary outcomes provides high quality data while also minimizing grantees’ reporting burden, the process includes four features.

  1. The survey instruments have been developed, tested, and validated based on nearly identical surveys already implemented voluntarily in hundreds of public libraries (which represent the largest sub-recipients in G2S) through “Project Outcome,” hosted by the Public Library Association in collaboration with the Chief Officers of State Library Agencies. (Appendix 1 provides a copy of the questionnaire; Appendix 2 provides a copy of SPR screen shots.)

  2. SLAA participants have been clearly informed of how their information will be collected and used, through annual trainings held in 2016 and 2017 and a third training, focused on implementation, prepared for an annual conference in 2018.

  3. The proposed data collection processes are designed to maximize response rates and minimize respondent burden, with only four required survey questions.

  4. Personally identifiable information will not be collected from the beneficiaries participating in the projects’ survey questionnaires. Additionally, the SPR’s public portal will not share survey results for any project with fewer than five survey participants, a small-cell suppression rule illustrated in the sketch following this list.
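
The following minimal sketch illustrates how such a small-cell suppression rule could be enforced before results reach a public portal; the record structure and function are hypothetical, not the SPR’s actual implementation.

    MIN_PARTICIPANTS = 5  # threshold stated in item 4 above

    def publishable_results(project):
        """Suppress survey results for projects below the participant threshold."""
        if project["n_participants"] < MIN_PARTICIPANTS:
            return {"project_id": project["project_id"],
                    "results": "suppressed (fewer than 5 survey participants)"}
        return project

    # Example: a project with four participants is withheld from the portal.
    print(publishable_results({"project_id": "G2S-001", "n_participants": 4,
                               "results": {"Agree": 3, "Strongly Agree": 1}}))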



In addition, IMLS staff with expertise in evaluation and survey methods, housed in its Office of Impact Assessment and Learning (OIAL), will continue to offer technical support and consultation on an as-needed basis, including by updating resources available on the IMLS website. Appendix 3 contains copies of sample technical assistance materials.



3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield reliable data that can be generalized to the universe studied.


This administrative collection involves no sampling and is not intended to support statistical generalization. The collected information is intended to improve grantee accountability and to foster sharing of information among library service practitioners and policy makers about the SLAAs’ federal taxpayer-supported projects. Since the system was developed with the active engagement of its SLAA partners, who have already used the SPR successfully to report other information about their projects (i.e., inputs, activities, and outputs), IMLS expects high response rates.


As noted above, the survey instruments were developed in alignment with federal and state needs for outcome performance measures, informed by a review of the relevant social science literature and by feedback from evaluation experts. The instruments were tested and validated. In addition to maximizing participation, and thereby minimizing non-response, through trainings and technical assistance, the SPR is designed to clearly communicate beneficiary response rates through tables and graphs that aggregate total response rates and valid response rates. Based on response rates of approximately 68 percent for the hundreds of public libraries already participating in Project Outcome in their states, coupled with response rates greater than 90 percent for public libraries and greater than 95 percent for state library administrative agencies in IMLS’s two voluntary statistical collections, item response rates are expected to be in excess of 80 percent.
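
To make the two rates concrete, the following sketch computes them under one plausible set of definitions, which are assumptions for illustration rather than the SPR’s specified formulas: the total response rate as questionnaires returned over participants invited, and the valid response rate as usable questionnaires over participants invited.

    def response_rates(n_invited, n_returned, n_valid):
        """Compute total and valid response rates as percentages (assumed definitions)."""
        return {"total_response_rate": round(100 * n_returned / n_invited, 1),
                "valid_response_rate": round(100 * n_valid / n_invited, 1)}

    # Example: 200 participants invited, 150 questionnaires returned, 136 usable.
    print(response_rates(200, 150, 136))
    # {'total_response_rate': 75.0, 'valid_response_rate': 68.0}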



4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents.


The SPR is organized on a logic framework centered around 13 national objectives, each with associated descriptive and outcome metrics for characterizing projects per the purposes specified in IMLS’s federal statute (20 U.S.C. § 9121). IMLS and SLAA participants constructed the logic framework, and IMLS subsequently verified key elements through a review of the social science literature and feedback from peer evaluators.1 IMLS then hired a third-party contractor in 2013 to build a portal (the SPR system) to implement the logic framework. OMB approved the main elements of the SPR system, including the descriptive metrics, and all states and territories began reporting into the SPR in winter 2015 for the FY 2014 G2S reporting period.


IMLS performed a series of tests of the proposed outcome metrics in 2016 and 2017. Agency staff trialed the new reporting fields in the SPR on two test servers, using dummy data for numerous hypothetical projects. States were also given an opportunity to trial the system with their own dummy projects during this same period. These tests validated the accuracy of computations in the SPR algorithms; they also led to minor programming revisions in the look and feel of the SPR data fields and their associated tables and charts for displaying output.
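
As an illustration of the kind of dummy-data computation check such testing involves, the following sketch verifies auto-tabulated percentages for a hypothetical project; all values are invented for this example.

    # Dummy data: 100 responses for one hypothetical project.
    counts = {"Strongly Agree": 40, "Agree": 35, "Neither Agree nor Disagree": 15,
              "Disagree": 7, "Strongly Disagree": 3}
    total = sum(counts.values())
    percents = {label: round(100 * n / total, 1) for label, n in counts.items()}

    assert total == 100
    assert percents["Strongly Agree"] == 40.0
    assert round(sum(percents.values()), 1) == 100.0  # tabulated percentages sum to 100
    print("computation check passed:", percents)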


IMLS’s third-party developer has already programmed these final changes into the SPR, and they are ready for launch upon OMB approval. Appendix 2 contains copies of screen shots.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



Name                    | Contact Number | Title                                 | Organization                                   | Role in this Study
Matthew Birnbaum, Ph.D. | (202) 653-4760 | Supervising Social Science Researcher | IMLS, Office of Impact Assessment and Learning | Former COR for development of the SPR; principal lead for performance measurement.
Lisa M. Frehill, Ph.D.  | (202) 653-4649 | Senior Statistician                   | IMLS, Office of Impact Assessment and Learning | With the Supervising Social Science Researcher, co-leads development of technical and training materials; leads methodology review, response-rate monitoring, and other statistical analyses with SPR data.


1 Bryson, J.M. 2004. Strategic Planning for Public and Nonprofit Organizations: A Guide to Creating and Sustaining Organizations. 3rd Edition. San Francisco, CA: Jossey-Bass; Farrior, M. 2005. Breakthrough Strategies for Engaging the Public: Emerging Trends in Communications and Social Science. Retrieved February 1, 2012 from http://www.biodiverse.org/docs/publicationsandtipsheets/breakthroughstrategiesforengagingthepublic.pdf; Dillman, D.A. 2007. Mail and Internet Surveys: The Tailored Design Method. 2nd Edition. Hoboken, NJ: John Wiley & Sons, Inc.; Fowler, F.J., Jr. 2002. Survey Research Methods. 3rd Edition. Thousand Oaks, CA: Sage Publications; Hatry, H., Morley, E., and Marshall, M. 2010. Performance Management Plan Information for the Institute of Museum and Library Services. Washington, DC: Urban Institute; Wholey, J.S., Hatry, H., and Newcomer, K. 2010. Handbook of Practical Program Evaluation. 3rd Edition. San Francisco, CA: Jossey-Bass; Birnbaum, M., Okahara, K., and Warner, M. 2012. “Changes in Library Evaluation: Responding to External Pressures in the Institute of Museum and Library Services’ Measuring Success Initiative for the Grants to States Program.” Advances in Librarianship, Vol. 30, pp. 3-27.


In addition, IMLS staff initially consulted with peer evaluators in six federal agencies and the Urban Institute. IMLS then completed a second round of peer review at Rutgers University’s annual conference on performance measurement and reporting (September 19, 2014) and a third round at the annual meetings of the American Evaluation Association in 2014 and 2015.
