Museums for America Grant Program Evaluation

OMB: 3137-0079

IMLS Museums for America Evaluation Study


Part B. Collections of Information Employing Statistical Methods


B1 Potential Respondent Sampling and Selection Methods


B1.1 Online Survey

The survey is designed so that respondents complete only the portions most relevant to them: the recent application process and/or the activities and effects of a single grantee project. A total of 1,011 institutions will be surveyed, completing the sections of the survey related to either or both the Application Study and the Project Activities and Effects Study.


The project will survey the entire universe of institutions that applied for Museums for America grants in the 2008 to 2010 funding cycles (approximately 644 institutions) about the application process. Depending on whether IMLS funded all of an institution's applications during this period, the museum will be asked to complete either one or two sections in this study.


Institutions with funded projects in progress for at least a year or with completed MFA projects (approximately 761 institutions) will be asked to complete between one and three sections on project implementation and effects; 394 of these institutions overlap with the Application Study. The first section asks museums to report on the activities and status of the project. Grantees with completed projects (589 institutions) will complete a second section on short-term effects of the MFA project. The third section is only for museums that completed an MFA project three or more years ago (304 institutions) and addresses possible long-term effects on the institution or community.
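
As an illustrative check (ours, not part of the study instruments), the counts above are consistent with the 1,011-institution survey universe reported in B1.1: because the two groups overlap, the total is the sum of the Application Study and project-study groups minus the overlap.

    # Illustrative check of the survey universe; figures are those cited above.
    applicants = 644             # Application Study universe, 2008-2010 cycles
    project_institutions = 761   # funded projects in progress >= 1 year or completed
    overlap = 394                # institutions appearing in both groups

    total_surveyed = applicants + project_institutions - overlap
    assert total_surveyed == 1011    # matches the total reported in B1.1

    completed_projects = 589         # also complete the short-term effects section
    completed_3plus_years = 304      # also complete the long-term effects section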


B1.2 Case Studies

Museums for case studies will be selected from the approximately 304 grantee institutions with completed projects. Case study sites will be selected to reflect the variety of museums' grant experiences. For example, we will look for museums that illustrate different disciplines and sizes, that operate in urban or rural settings, and that serve special populations such as children and youth. A further factor in case study selection will be the continued involvement of project directors and staff with the institution. It is anticipated that at least one case study will be conducted in each of the three grant categories (Serving as Centers of Community Engagement, Sustaining Cultural Heritage, and Supporting Lifelong Learning).
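
A minimal sketch of this purposive selection is shown below; the candidate records and attribute names are hypothetical illustrations of ours (the actual selection will be made by study staff, not by code), but it conveys the intent that each grant category be represented before remaining slots are filled for variety.

    # Hypothetical purposive-selection sketch; records and attributes are illustrative only.
    candidates = [
        # (name, grant_category, discipline, size, setting)
        ("Museum A", "Community Engagement", "history",  "small",  "rural"),
        ("Museum B", "Cultural Heritage",    "art",      "large",  "urban"),
        ("Museum C", "Lifelong Learning",    "science",  "medium", "urban"),
        ("Museum D", "Lifelong Learning",    "children", "small",  "rural"),
    ]

    selected, categories_covered = [], set()
    for museum in candidates:                    # first pass: cover each grant category
        category = museum[1]
        if category not in categories_covered:
            selected.append(museum)
            categories_covered.add(category)
    # Remaining slots (up to six sites in total) would then be filled to vary
    # discipline, size, setting, and populations served, per the criteria above.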

The case studies will be used to explore the ways in which IMLS and MFA funding, and perceptions about such funding, affected museums’ fulfillment of their mission and their ability to build internal capacity, to extend audience reach and engagement, and to form effective partnerships.


B2 Procedures for Collecting Information


B2.1 Online Survey

A mailed letter to each museum will encourage survey participation and ask the recipient to direct the request to the most appropriate and knowledgeable staff member(s) to complete the survey. Through its online format, the survey will accommodate repeat visits by respondents to allow flexible use of their time and will allow respondents to change entered data up until final submission of the survey. Pre-populated data on unfunded applications, the number of funded grants, and the estimated completion date will direct each respondent to the appropriate sections of the survey. Respondents will see their application and funded grant descriptions displayed online, which will aid recall. All respondents will be asked about only one funded project. Grantees with a project completed three or more years ago will be asked to complete three sections of the survey, covering the project activities, short-term effects, and long-term effects of that project. Those with a project completed two or fewer years ago will answer questions on the project activities and short-term effects only. Museums with funded projects in progress for at least one year will be asked questions regarding only the project activities. Informational indicators will be embedded in the survey where additional definitions of terms are needed. Data validation checks, such as allowing only one response per item and enforcing question-skip patterns, will be coded into the survey. Upon completion of survey data collection, the web-based data will be imported into PASW (formerly SPSS) files for analysis.
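
To make the routing described above concrete, the following minimal sketch (ours, assuming a simple representation of the pre-populated completion date; not the actual survey software) shows how a grantee's project status could determine which project-related sections are presented.

    # Minimal routing sketch; the function name and date handling are our assumptions.
    from datetime import date

    def project_sections(completion_date, today=date(2010, 7, 1)):
        """Return the project-related survey sections for one funded project.

        completion_date is the pre-populated (estimated or actual) completion
        date, or None if the project is still in progress.
        """
        sections = ["project activities"]         # asked of all eligible grantees
        if completion_date is None:
            return sections                       # in progress >= 1 year: activities only
        sections.append("short-term effects")     # any completed project
        years_since = (today - completion_date).days / 365.25
        if years_since >= 3:
            sections.append("long-term effects")  # completed three or more years ago
        return sections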


B2.2 Case Studies

In preparation for the visits, selected case study organizations will receive a letter informing them of the study and requesting their participation. Follow-up telephone calls to arrange the local site visit will provide background on the project, seek additional information on organizations and partners in order to identify key respondents, and determine the best timing for the site visit to accommodate local respondents’ schedules.

A series of interviews will be conducted at each site with key people who were involved in carrying out the project or who were served by it: for example, institution leadership, project directors and key staff, partners, and community members. A maximum of five staff members at each institution, as well as one to six partners or community members, will be engaged in data collection activities. Individual interviews lasting approximately one hour are planned for these respondents, except where multiple partners or community members are interviewed; those interviews will be conducted in a focus group format, also lasting approximately one hour.

The case study site visits will be conducted by two-person teams of RMC staff members experienced in field-based qualitative research and semi-structured interviewing of the type that will be used in this study. All researchers involved in the fieldwork will be trained with respect to the objectives of the study, study procedures and protocols, and the types of information sought for the study.


B3 Response Rates and Non-Responses


B3.1 Online Survey

Because RMC will obtain current contact data from IMLS, non-respondents will be easily identified. To minimize non-response, the Institute will send an announcement informing all applicants (including grantees) of the survey and the importance of participating in it.

Email reminders will be sent to non-responding applicants and telephone contact will be made with non-responding grantees. Because RMC will be available to answer any questions respondents have about the survey, the burden of responding should be minimal.


B3.2 Case Studies

RMC will identify approximately 20 museums that would be suitable for participation in the case studies and contact six museums within this pool of 20. Participation will be voluntary and based on the selection criteria and on museums' willingness to participate and ability to schedule and accommodate site visits. Because six organizations represent less than two percent of all grantee institutions with completed projects (approximately 304 institutions), we do not anticipate difficulty securing agreement for site visits from six museums.


B4 Tests of Procedures or Methods


B4.1 Online Survey

The survey will be pilot-tested with four applicants and four grantees once its online development is complete. The pilot testing will ensure that the survey is clear and easy to complete and that the electronic administration process runs smoothly. RMC will adjust the survey to resolve any technological issues the pilot test reveals. RMC will also conduct brief interviews with the pilot-test respondents to ensure that all comments, suggestions, and issues associated with the online survey are understood and resolved. Since all pilot-test respondents would complete the survey in any event, no additional burden will be placed on them by agreeing to help. The survey itself was developed iteratively in consultation with IMLS.



B5 Contact Information for Statistical or Design Consultants

Three professional staff members from the Portsmouth, New Hampshire office of RMC Research Corporation will lead the IMLS Museums for America Evaluation Study:

  • Alice Apley, Senior Research Associate, will serve as Study Director.

  • Kim Streitburger, Senior Research Associate, will be responsible for administrative management and data analysis.

  • John Parsons, Information Services Specialist, will work with the team to design the online survey and database and will serve as technical assistant.


