Museum Assessment Program Evaluation

OMB: 3137-0106



Approval is requested to conduct an information collection for

Museum Assessment Program Evaluation




Section A. Justification


A.1. Necessity of the Information Collection


The Institute of Museum and Library Services (IMLS) funds the Museum Assessment Program (MAP) through a Cooperative Agreement with the American Alliance of Museums (AAM). The current Cooperative Agreement, in the amount of $1,414,160 (with cost share of $1,473,300), covers a three-year period from FY2014-2016. Each year’s federal funding averages $471,500, which supports the participation of approximately 100 museums annually. To participate in MAP, museums submit an application. Upon acceptance into the program, museums undertake a self-study phase, followed by an onsite visit from a peer reviewer, who provides a written report with recommendations for improvements. Three types of assessments are offered: Organizational, Community Engagement, and Collections Stewardship. The full parameters of MAP and the obligations of each party are outlined in the Cooperative Agreement.


The proposed Museum Assessment Program Evaluation is budgeted at $26,000. Per the FY2014-2016 Cooperative Agreement, AAM will develop a summative evaluation instrument … employing a mix of qualitative and quantitative methods. The summative evaluation will ascertain the extent to which the MAP objectives have been met and re-measure museums’ perceptions about how the program has informed their practice and influenced their operations. AAM has also conducted evaluations for each step of the program: application, self-study, post-report, and implementation. Peer reviewers also fill out an evaluation. These evaluations are used to gather testimonials about the program and to make adjustments that help the program run more smoothly.


IMLS is responsible for identifying national needs for and trends in museum and library services. As noted in the legislative authority section below, IMLS must also report on the impact and effectiveness of programs conducted with federal funds and disseminate information on the best practices of these programs.


About IMLS:

The Institute of Museum and Library Services (IMLS) is the primary source of federal support for the nation's 123,000 libraries and 35,000 museums. IMLS' mission is to create strong libraries and museums that connect people to information and ideas. IMLS works at the national level and in coordination with state and local organizations to sustain heritage, culture, and knowledge; enhance learning and innovation; and support professional development.


As detailed in the IMLS Strategic Plan 2012-2016, IMLS is committed to promoting inclusive and accessible learning services for the American people and is uniquely positioned to facilitate and highlight the work that libraries and museums do in addressing a wide variety of learning needs and providing services to increasingly diverse communities. IMLS plays a leadership role in promoting inclusive services that address the needs of the increasingly diverse populations of our country as well as the accessibility requirements of all users. Further, IMLS promotes museums and libraries as strong community anchors that enhance civic engagement, cultural opportunities, and economic vitality.


About AAM:

The American Alliance of Museums has been bringing museums together since 1906, helping to develop standards and best practices, gathering and sharing knowledge, and advocating on issues of concern to the entire museum community. With nearly 25,000 individual, more than 4,000 institutional and 300 corporate members, the Alliance is dedicated to ensuring that museums remain a vital part of our communities, connecting people with the greatest achievements of the human experience, past, present and future.


About MAP:

Since its inception in 1981, the Museum Assessment Program (MAP) has helped over 4,600 small and mid-sized museums of all types strengthen operations, plan for the future, and meet standards. Through a one-year process of self-assessment and peer review, MAP helps museums become more sustainable and professional, build capacity, and better serve their communities. Museums can choose from one of three assessment types: Organizational, Collections Stewardship, or Community Engagement. Each assessment requires an intensive, guided self-study, completion of a workbook, and a one- to two-day site visit by a peer reviewer. A written report with recommendations for improvement and implementation is provided at the end of the process, and some museums also take part in a follow-up site visit a year later. As part of this program, AAM provides museums with self-study materials, resources and tools, live webinars, and access to a peer museum professional. After participating in MAP, museums have reported positive organizational changes, including stronger alignment with strategic plans, improved museum policies and procedures, and more effective fundraising. Approximately 100 museums participate in a MAP assessment each year.



Legislative authority



20 U.S.C. Subchapter III – Museum Services

Section 9173. Museum services activities

(a) In general

(7) Supporting museums in providing services to people of diverse geographic, cultural, and socioeconomic backgrounds and to individuals with disabilities;

(8) Supporting museums in developing and carrying out specialized programs for specific segments of the public, such as programs for urban neighborhoods, rural areas, Indian reservations, and State institutions.


20 U.S.C. Section 9108. Policy research, analysis, data collection, and dissemination


(a) In general

The Director shall annually conduct policy research, analysis, and data collection to extend and improve the Nation’s museum, library, and information services.


(b) Requirements

The policy research, analysis, and data collection shall be conducted in ongoing collaboration (as determined appropriate by the Director), and in consultation, with –

(1) State library administrative agencies;

(2) National, State, and regional library and museum organizations;

(3) Other relevant agencies and organizations.


(c) Objectives

The policy research, analysis, and data collection shall be used to –

(1) Identify national needs for and trends in museum, library, and information services;

(2) Measure and report on the impact and effectiveness of museum, library, and information services throughout the United States, including the impact of Federal programs authorized under this chapter;

(3) Identify best practices; and

(4) Develop plans to improve museum, library, and information services of the United States and to strengthen national, State, local, regional, and international communications and cooperative networks.  


(d) Dissemination

Each year, the Director shall widely disseminate, as appropriate to accomplish the objectives under subsection (c), the results of the policy research, analysis, and data collection carried out under this section.



A.2. Purposes and Uses of the Data


IMLS is working with the American Alliance of Museums (AAM), per the FY2014-2016 Cooperative Agreement, to conduct a program evaluation study to understand the extent to which MAP has contributed to the professionalization and capacity development of museums. Results will be used for administrative/managerial and benchmarking purposes.


The following questions were designed to frame this proposed evaluation study:

  1. Did MAP participation help build a museum’s institutional capacity, increase its professionalism, and strengthen its organizational performance in the areas of overall operations, leadership, collections stewardship, and community engagement? How and why, or why not?

  • What were the fundamental capacities built, and what other institutional changes happened due to participation in MAP?

  • What factors (either associated with the museum or with the program structure) most contributed to or impeded this capacity building and the development of a more professional organization?

  2. How soon did positive contributions from MAP come to fruition?

  3. Does assessment type have any relationship to the results or their timing?

  4. What changes could be made to the organization and structure of the Museum Assessment Program itself to further increase user satisfaction and to achieve the overall program goals?

  5. What examples exist to illustrate the longitudinal contributions of MAP participation? Are there examples of institutional success and best practices from museums that have participated in MAP?

  6. Are there differences in the longitudinal contributions to participating museums depending on the type of assessment?


Data for this study will be collected in two phases. First, past MAP museum participants (from approximately 2006 to 2014) will be invited to complete an online survey that will explore the attitudes and values they ascribed to their MAP experience during and after participation. The nine-year sample is intended to provide a long enough period of time to capture trends and longitudinal data (see more in A.4 below). To add context to this information, participants will be required to provide basic information about their institutions (e.g., institution type, operating budget, and size of staff—paid, unpaid, and volunteer). This information will be used primarily during analysis to segment the data and help understand the degree to which institution type affects results. Personal data (i.e., institution name, contact name, telephone number, and email) will be collected only if the museum voluntarily agrees to participate in a follow-up telephone interview. Personal information will be treated confidentially by the third-party evaluators. Follow-up interviews will explore the MAP experiences of individual institutions and gather further details on the circumstances of the participating museum, its operation, and its successes and challenges, capturing nuances beyond what was shared in the online survey. This information will contribute to the development of individual case studies that will address Question 5 of the evaluation study (above) and showcase the contribution of MAP to each institution’s capacity, performance, and professionalism.


Information collected will be used by four different audiences:


Internal Audiences:

  1. AAM and IMLS: To assess the efficacy of the program to support its continual evolution and improvement and to help communicate the value of MAP to the professional museum community.

External Audiences:

  1. Policy makers: To show the results of federal dollars spent on the development, implementation, and management of MAP

  2. Current MAP participants: To promote ongoing engagement with the program and provide inspiration and examples for how to use and maximize the MAP experience

  3. Museum field: To illustrate the adoption of promising practices in the application of the MAP experience and to encourage future applications to the program


The final evaluation report and case studies will also be posted on the IMLS and AAM websites. We anticipate the final report will include the following sections: executive summary; introduction to and brief history of the MAP program; evaluation study goals; methodology; summary and analysis of findings for each question (overall and differentiated by assessment and museum type/size); holistic analysis of the fundamental capacities MAP has built and the degree of MAP’s contribution; program recommendations; and appendices.


The case studies will be 3-5 pages long and follow a standardized format that will include, but not be limited to: a basic museum profile (i.e., budget, staff size, location, governance type, and assessment type(s) completed and date); organizational landscape (i.e., strengths, weaknesses, opportunities); the role of assessment in institutional change and MAP-related implementation strategies; organizational challenges; and lessons learned. Case studies will highlight a variety of museum types and represent a range of programs/successes that can be attributed to MAP participation.


Case studies are intended to be individual examinations of scenarios that will help better understand the experiences of selected institutions and their engagement with MAP. (There is no intention to conduct comparative qualitative analysis across case studies.)



A.3. Use of Information Technology


IMLS and AAM will create an online survey to simplify the data collection process. AAM will vet email addresses of past MAP participants to ensure that the final contact list is valid. A simple printed form of the online survey will be provided for the few museums unable to use the electronic process.



A.4. Efforts to Identify Duplication


This proposed evaluation study builds on the last summative program evaluation, conducted in 2009, which included data from 194 museums that had completed a MAP assessment between 2003 and 2009. Approximately 575 more museums completed a MAP assessment between 2009 and 2014. This evaluation study will survey approximately 850 museums that completed a MAP assessment between 2006 and 2014. There are several reasons for using this timeframe and including some of the same museums as previously surveyed:


  • The year 2006 has been selected as the starting point because the structure of the MAP program was significantly changed at that time—from individual grants to museums to a single grant award to AAM to administer a participant program.


  • To collect longitudinal data. Institutional change and capacity building, especially at smaller museums (MAP’s core audience), take time; therefore, longitudinal data are needed. This evaluation will help differentiate between short- and long-term institutional changes influenced by MAP, and whether short- vs. long-term results were affected by significant program changes made since 2006. Also, 14% of museums repeat MAP (with a different or the same assessment), and it will be important to understand whether multiple assessments are a factor and whether institutional changes are tied to a specific assessment or are a result of cumulative experiences.


  • Difference in focus: The proposed evaluation study will focus on the contributions of MAP participation on organizational improvements for individual museums in overall operations, community engagement, and collections stewardship. The previous 2009 study emphasized the effect of the MAP program on the professional museum community in terms of an understanding and application of standards and best practices by participating museums.


Some previously used survey questions that have yielded useful data will be incorporated into this study. However, it is not expected that participants will feel overly burdened by this duplication, given the time that has passed since these questions were last asked. (While the majority of questions for this 2016 study are unique and specifically designed to address the project goal of assessing the contribution of MAP participation to institutional change and professional capacity building, 15 questions (four of which are demographic questions) from earlier studies were incorporated into this survey in order to adhere to existing AAM member segmentation and provide data continuity.) Also, the repeat group represents a small number (130) within the universe of survey recipients planned for the current survey. Further, the repeated questions provide a quality-control check to help ensure the reliability of the collected data.


A.5. Methods Used to Minimize Burden on Small Businesses


For all institutions, both small and large, participation in the Museum Assessment Program (MAP) Evaluation is entirely voluntary.

A.6. Consequences of Less Frequent Data Collection


The Museum Assessment Program aims to innovate continuously in an effort to provide participating institutions with the highest-quality materials, guidance, and support that enable them to be successful in organizational improvement. Without this data collection, the program would stagnate: not only would it become less relevant to participants, but AAM would also lack the information needed to expand the program and meet the needs of evolving museums. IMLS also seeks current data to help justify its federal funding requests.


A.7. Special Circumstances


No special circumstances require the collection to be conducted in a manner inconsistent with the guidelines in 5 CFR 1320.6.


A.8. Consultations Outside the Agency


Public comments solicited through Federal Register


IMLS published a notice in the Federal Register with a 60-day public comment period to announce this proposed information collection on July 10, 2015 (Volume 80, Number 132, pages 39805-39806). A copy of the Federal Register Notice is provided. One comment was submitted.


IMLS published a notice in the Federal Register on March 16, 2016 (Volume 81, Number 51, pages 14133-14134), with a 30-day public comment period to announce forwarding of the information collection request to OMB for approval.


Consultants outside the agency

As part of the cooperative agreement referenced above, IMLS has closely consulted with the American Alliance of Museums and the external evaluation firm Spotlight Impact, LLC, in the development of the evaluation plan and data collection instruments/forms.


A.9. Payments or Gifts to Respondents


N/A


A.10. Assurance of Confidentiality


Any personally identifiable data collected (e.g., the name of the person who responded on behalf of the museum) will be kept confidential. Any personal data associated with published work (i.e., final report, case studies) will be used only if approved by the participant. Assurances of confidentiality will be conveyed in a “consent” section at the beginning of the survey instrument.


A.11. Justification for Sensitive Questions


There are no sensitive questions on the Museum Assessment Program (MAP) Evaluation forms.


A.12. Estimates of Hour Burden to Respondents


The total number of respondents anticipated during the program is 309. The burden per respondent is estimated to average 30 minutes (maximum) for the online survey (300 respondents) and 60 minutes (maximum) for the telephone interview (9 respondents). The estimated total annual burden is 159 hours.
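For reference, the total annual burden follows from simple arithmetic on the respondent counts and per-response times stated above. The short Python sketch below is illustrative only (it is not part of the collection instruments) and simply reproduces the 159-hour figure:

```python
# Illustrative calculation of the A.12 total annual burden (values taken from the text above).
survey_respondents = 300      # online survey respondents
survey_hours = 0.5            # 30 minutes per survey response
interview_respondents = 9     # telephone interview respondents
interview_hours = 1.0         # 60 minutes per interview

total_burden_hours = (survey_respondents * survey_hours
                      + interview_respondents * interview_hours)
print(total_burden_hours)     # 159.0 hours
```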


A.13. Estimates of Annualized Cost Burden to Respondents


According to the Department of Labor, the mean hourly wage for a museum technician is $21.31, based on full-time work of 40 hours/week, 52 weeks/year (May 2014):


 

                       No. of Respondents   Annual Frequency   Est. Hours per Response   Total Hours   Hourly Rate   Total Cost
MAP Online Survey      300                  1                  0.5                       150           $21.31        $3,196.50
Telephone Interview    9                    1                  1.0                       9             $21.31        $191.79
TOTALS                 309                  1                  1.5                       159           $21.31        $3,388.29


The Estimated Total Cost Burden is $3,388.29 (159 burden hours times the $21.31 average hourly wage).
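As a cross-check, the dollar amounts in the table above follow directly from the burden hours and the hourly wage rate. A minimal, illustrative sketch using the stated figures:

```python
# Illustrative cost check for A.13 (values taken from the table above).
hourly_wage = 21.31                         # mean hourly wage, museum technician (May 2014)
survey_cost = 150 * hourly_wage             # 150 survey burden hours    -> $3,196.50
interview_cost = 9 * hourly_wage            # 9 interview burden hours   -> $191.79
total_cost = survey_cost + interview_cost   # 159 total burden hours     -> $3,388.29
print(f"${survey_cost:,.2f}  ${interview_cost:,.2f}  ${total_cost:,.2f}")
```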



A.14. Estimates of Annualized Cost to Federal Government


The cost of the cooperative agreement with Spotlight Impact, LLC is $26,000. Most of this cost is for program development, implementation, and management for the evaluation study. Approximately $7,000 will be spent on data collection efforts.


A.15. Reason for Program Changes or Cost Adjustments


There are no changes from the OMB Form 83-I. This is a new submission.


A.16. Project Schedule


The following provides an overview of the project’s key milestones and timeline:


Project Phase

Timeframe

PROJECT DESIGN:

  • Review of MAP program and previous evaluation studies.

  • Work plan development

  • Instrument development (includes protocols, supporting materials)

Mid-July through August 2015

PROJECT REVIEW:

  • Submit ICR package to AAM/IMLS

  • PRA clearance process

September 2015 – March 2016

PHASE ONE:

  • Data Collection – online survey (formatted, hosted)

    • ~800 invitations to yield 260-360 responses

  • Preliminary analysis to identify telephone interview subjects

March – April 2016*

PHASE TWO:

  • Data Collection – telephone interviews

April 2016

DATA ANALYSIS:

  • Complete analysis of online survey data

  • Complete analysis of qualitative interview data

  • Design case study template. Begin drafting case studies.

April – May 2016

FINAL DELIVERABLES:

  • Report Summary – study highlights and key data points

  • Full Evaluation Report – including Executive Summary, Recommendations, and Case Studies

  • Raw data files

May 2016





*The online survey will be open for approximately 4-6 weeks, depending on response rate.





A.17. Request to Not Display Expiration Date


No exemption from the requirement to display the expiration date for OMB approval of the information collection is being requested for the Museum Assessment Program (MAP) Evaluation. The OMB approval number and expiration date will be displayed on all data collection materials and documentation.


A.18. Exceptions to the Certification


No exceptions to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I apply to the Museum Assessment Program (MAP) Evaluation.


