Museums for All Evaluation
Section B. Collections of Information Employing Statistical Methods
B.1. Respondent Universe and Response Rate Estimation
B.1.1. Universe
The target universe for the Museums for All Evaluation is the approximately 180 physical institutions in the United States that currently participate in the Museums for All (M4A) program, together with the Electronic Benefit Transfer (EBT) cardholding visitors to a select set of those museums who take advantage of the M4A ticketing program. These institutions span all museum types, including history, art, natural history, science centers, children’s museums, and botanical gardens. For the rationale, refer to Part A.2.
B.1.2. Estimated Response Rate
Our goal is to obtain an overall response rate of approximately 80% for the participating museum survey and 100% for the follow-up telephone interviews.
B.1.3. Respondent Selection
Participating Museum Survey: ACM has an exhaustive list of current M4A participants. This list serves as the basis for the M4A museum participant universe.
Follow-up Interview: Participating museum survey respondents will opt into participating in the telephone follow-up interview. Participants will share their contact information, and the full set of museums who opt in will form the basis of the follow-up interview data collection pool.
B.1.4. Prior Data Collection
No prior evaluation-related data has been collected. Administrative data is collected quarterly by ACM to track attendance figures and stories from M4A visitors. Additionally, administrative institutional data was collected at the time of registration about each participating museum regarding their target audience, location, annual budget, etc. This information will help contextualize the data collected through this study.
B.2. Procedures for the Collection of Information
B.2.1. Design
This study employs a mixed-methods design that allows researchers to combine quantitative data on participant attitudes, values, and actions with qualitative data exploring the variety and nuances of each museum’s M4A experience.
In the first phase of the study, all M4A participating museums will be invited to complete a participating museum survey [Appendix A: Participating Museum Survey]. The participating museum survey can be accessed at: https://www.surveygizmo.com/s3/3380713/Museums-For-All-Program-Key-Contact-Online-Survey
Aurora Consulting will host the online survey using SurveyGizmo® (www.surveygizmo.com), using its design engine to develop an instrument that includes open-ended, multiple-choice, scale, and Likert-type (rating) questions. The survey incorporates headings, sections, and conditional logic branching to optimize the user experience. SurveyGizmo is compatible with Macs and PCs and is accessible through JavaScript-enabled browsers (Internet Explorer 8.0 or later, Firefox 13.0 or later, Safari 5.0 or later, and Google Chrome 16 or later). SurveyGizmo products are also accessible from mobile devices such as smartphones and tablets. At the completion of this phase of the study, SurveyGizmo data will be downloaded in Microsoft Excel (.xls) format.
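As a working illustration of how the downloaded data might be handled, the sketch below loads a SurveyGizmo Excel export for analysis. The file name and the presence of a response “Status” column are assumptions for illustration, not features confirmed in this submission.

```python
# Minimal sketch, assuming a hypothetical export file name and a "Status" column
# in the SurveyGizmo .xls download; requires pandas (and xlrd for .xls files).
import pandas as pd

responses = pd.read_excel("m4a_participating_museum_survey.xls")

# Count complete versus partial responses as a first data-quality check.
print(responses["Status"].value_counts())
```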
In the second phase of the study, approximately 15-18 museum survey respondents, each representing a distinct M4A institution, will be identified and invited to participate in a follow-up telephone interview [Appendix B: Museum Telephone Interview]. The goal of these interviews is to further explore the nuances of their survey responses and achieve a greater understanding of their M4A experience, the characteristics of their M4A program and partnerships, the implications for their institution and community, and any relationships among these factors.
To select respondents for the interviews, Aurora Consulting and ACM will identify a subset of museums responding to the participating museum survey that best represents a diverse cross-section of survey respondents and of the museums offering the M4A program. Participating museum survey responses will be filtered using the following protocol, based in part on previously collected administrative data, to select museums for the follow-up interview:
Because the interviews seek to dig more deeply into the realities of implementing Museums for All programming within various museums, viable candidates will represent 2-3 distinct M4A implementation models.
Because the interviews seek to reflect practices from the variety of organization types represented in the professional museum community, variation in museum types is important. Institutional profile data, collected through prior reporting, will be reviewed to ensure that interview candidates are not all drawn from the same organizational segment or from the same region of the country.
Since the results of implementing the M4A program may be influenced by the length of time the program has been offered, candidates will represent museums that have offered Museums for All for varying lengths of time.
An interview guide designed to encourage open-ended dialogue will be used to ensure data is collected and captured consistently. Interviewers will use a Microsoft Word template to record participant responses in real time during the telephone interview. Interviews will be conducted by the two individuals at Aurora Consulting who have been closely involved in the survey design and response analysis, and who are thus familiar with the M4A program and the analysis objectives. Interviews will be conducted approximately 3-4 weeks after the close of the online survey.
The purpose of these interviews is to explore a museum’s internal/external conditions and identify potential factors that support or impede an organization’s adoption and integration of an M4A program. This phase of the proposed study is intended to understand the varied nuances a single organization may experience with M4A. Data collected from interviews will be analyzed holistically to understand potential relationships between M4A participation, changes in institutional capacity, and relationships between the museum and its community. An emergent coding scheme will be used to analyze the open-ended questions across the museum survey and interview data. This emergent coding scheme and analysis will be grounded in the framework of the guiding evaluation questions.
To ensure the findings align with the Association of Children’s Museums’ and the Institute of Museum and Library Services’ understanding of the M4A program and the nature of this exploration, the emergent coding schemes across the various forms of data will be defined and discussed with the ACM and IMLS team. In this way, the emergent codes will reflect the museum staff’s perceptions of, thoughts about, and experiences with the various articulations of the M4A program while also accurately portraying the M4A framework, boundaries, and intentions for the program. In particular, the emergent coding schemes will illuminate alignment, or misalignment, between the intentions of the M4A program and the lived experience of the program for museum staff.
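To illustrate what the emergent coding analysis could look like once codes have been assigned, the sketch below tallies hypothetical codes across open-ended responses. The code labels and data structure are placeholders only; the actual scheme will emerge from the data and be defined with ACM and IMLS.

```python
# Illustrative sketch only: tallying placeholder emergent codes across open-ended
# responses. The codes shown are hypothetical, not the actual coding scheme.
from collections import Counter

# Each respondent ID maps to the list of codes applied to its open-ended answers.
coded_responses = {
    "museum_01": ["community_partnership", "staff_capacity"],
    "museum_02": ["community_partnership", "visitor_access"],
    "museum_03": ["visitor_access"],
}

code_counts = Counter(code for codes in coded_responses.values() for code in codes)
for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```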
B.2.2. Communications and Access
Upon OMB approval, Aurora Consulting will begin field operations.
ACM will contact all M4A program participants via email, explaining the importance of the study, asking for participation, and alerting them to watch for a survey invitation from Aurora Consulting with access to the online survey. A maximum of two follow-up emails will be sent as reminders to non-respondents and to respondents who have only partially completed the survey. As part of this survey, participants will be invited to take part in a follow-up telephone interview approximately 3-4 weeks after the close of the online survey. Those who agree will enter their contact information directly into the survey template, and their data will be added to a Microsoft® Excel database. Their survey responses will be reviewed against the selection criteria previously described to determine whether they will be included in the follow-up interviews. The processes for validating contact information, emailing, and tracking responses are described below.
B.3. Methods to Secure Cooperation, Maximize Response Rates, and Deal with Non-Response
B.3.1. Sample and Contact Validation, Emailing, and Tracking
Prior to the implementation of this study, ACM’s list of all M4A participants will be exported from the M4A administrative database. As a secondary measure, ACM will email all individuals on that list to inform them of the upcoming M4A study and encourage participation, thereby verifying that the survey will be sent to the appropriate contact at each M4A institution.
A final, ACM-vetted contact list (of unique email addresses) will be imported into SurveyGizmo’s survey campaign engine, allowing Aurora Consulting to confidentially track individual response rates and schedule reminders for non-respondents and incomplete responders. A maximum of two reminders will be sent prior to the closure of the survey.
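As an illustration of the contact-list preparation step, the sketch below reduces an exported list to unique email addresses before import into the survey campaign. The file and column names are assumptions made for illustration.

```python
# Minimal sketch, assuming hypothetical file and column names, of reducing the
# exported ACM contact list to unique email addresses prior to import.
import pandas as pd

contacts = pd.read_excel("m4a_contact_list.xls")          # exported administrative list
contacts["Email"] = contacts["Email"].str.strip().str.lower()
unique_contacts = contacts.drop_duplicates(subset="Email")
unique_contacts.to_csv("m4a_contact_list_unique.csv", index=False)
```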
We expect the overall data collection period to be approximately 2-4 weeks over the course of December 2017 and January 2018.
B.3.2. Gaining Cooperation
Participating Museum Survey: As described in the previous section, ACM will email the M4A participant universe in advance of the study. This email will not only serve to validate contact information but will also allow ACM to build anticipation for the study and encourage M4A institutions to participate. This email will be sent from ACM. [Appendix C: Museum Validation Email]
Aurora Consulting will draft an email inviting M4A museum staff to complete the online survey. This email will include consent language, survey instructions, and appropriate contact information. [Appendix D: Participating Museum Survey Invitation] This email will be co-signed by ACM Executive Director, Laura Huerta Migus, and Aurora Consulting, and it will be sent to individuals using SurveyGizmo’s survey campaign.
Museum Telephone Interviews: Aurora Consulting will employ an opt-in approach for recruiting participants for telephone interviews. The online survey includes the following language: “In an effort to learn more about M4A participating institutions and their experiences, we are conducting brief telephone interviews in the coming months. This conversation will expand upon the responses you have provided here and will allow ACM to gain a better understanding of the implications of participating in the M4A program, and how the process can be improved. If you would be willing to be contacted, please provide your information below.” This process allows respondents to self-select, if they so choose, to participate in a follow-up conversation. Aurora Consulting and ACM will invite selected candidates, via an email sent from Aurora Consulting, to schedule an interview time. [Appendix E: Museum Telephone Interview Invitation]
No monetary incentives or personal benefits are offered in exchange for participation in either the museum online survey or telephone interview. As such, we expect respondents will participate due to a sense of community and goodwill.
B.3.3. Technical Methods to Maximize Response Rates
For the M4A Evaluation study, we anticipate an overall response rate of 80% for the museum online survey and 100% for the follow-up telephone interviews. In addition to the methods described above, we will employ a number of techniques to enhance the ease of use of the online surveys, thus maximizing response rates for completed surveys:
Field-level Data Validation. SurveyGizmo provides field-level validation for data collected through its survey engine. This includes data type validation for numeric, date, four-digit year, currency, percentage, email, and URL fields. User selections are also validated where limitations are applied (e.g., check no more than 3 options, select top 2 items). Individual fields can be declared required, and the required flag can respond to customized skip logic based on answers provided in other fields. The survey mechanism provides immediate feedback to the respondent when a response fails an edit, and performs server-side validation checks to ensure data integrity. Customized pop-up help can be provided for any individual field or section.
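The validation itself is performed by SurveyGizmo within the survey; the sketch below simply illustrates the kinds of field-level rules described above (data types, email format, selection limits), using hypothetical field names.

```python
# Illustrative sketch of validation rules analogous to SurveyGizmo's field-level
# checks; field names are hypothetical and the checks run on a single response row.
import re

def validate_response(row: dict) -> list:
    errors = []
    # Numeric field: attendance figures must parse as a non-negative whole number.
    if not str(row.get("annual_m4a_attendance", "")).isdigit():
        errors.append("annual_m4a_attendance must be a whole number")
    # Email field: simple format check.
    if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", row.get("contact_email", "")):
        errors.append("contact_email is not a valid email address")
    # Selection limit: check no more than 3 options.
    if len(row.get("promotion_channels", [])) > 3:
        errors.append("select no more than 3 promotion channels")
    return errors
```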
Survey Support. In addition to the field-level online help provided by the SurveyGizmo website, content and technical support will be provided by email or phone by Aurora Consulting.
Skip Routines. Data quality will be enhanced through a set of user feedback mechanisms such as logical relationship validation checks and skip logic. Aurora Consulting has determined the pattern of skip logic required during instrument development. By employing skip routines, participants will only be asked to respond to questions that are relevant to their situation.
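As a simple illustration of a skip routine, the sketch below branches past a detail block when a screening question is answered “No.” The question names and branching rule are hypothetical; the actual logic is built into the SurveyGizmo instrument.

```python
# Illustrative sketch of skip logic; question names and the branching rule are
# hypothetical, but mirror the conditional routing built into the online survey.
def next_question(current_question: str, answer: str) -> str:
    # Museums reporting no community partnerships skip the partnership detail block.
    if current_question == "has_community_partners" and answer == "No":
        return "visitor_outcomes_section"
    default_order = {
        "has_community_partners": "partner_types",
        "partner_types": "partnership_benefits",
        "partnership_benefits": "visitor_outcomes_section",
    }
    return default_order[current_question]
```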
Progress Bar. The surveys will be presented in section-level and/or sub-section level pages, with data validation and response storage occurring on each individual page. A visual feedback mechanism will indicate the progress a user has made through the survey. This visual cue will be accompanied by a “% Complete” notification. These mechanisms set expectations throughout the survey and help participants anticipate completion of their task.
Confidentiality and Data Security. All data transferred to and from the SurveyGizmo website is protected through secure links using Secure Sockets Layer (SSL) during survey completion, and stored data is protected by encryption protocols at the disk level on database servers and at the row level, as validated by Amazon Web Services. Data collected will be stored on Amazon Web Services servers located in the US East (VA) Region. The service complies with data encryption protocols and fully discloses its security policy at: https://www.surveygizmo.com/survey-software-features/security-reliability/
All data downloaded from SurveyGizmo will be in Microsoft Excel (.xls) format. These files will be secured on Aurora Consulting computers and backed up using Dropbox for Business. Dropbox is designed with multiple layers of protection, including secure data transfer, encryption, network configuration, and application- and user-level controls that are distributed across a scalable, secure infrastructure. Dropbox files at rest are encrypted using 256-bit Advanced Encryption Standard (AES). Dropbox uses Secure Sockets Layer (SSL)/Transport Layer Security (TLS) to protect data in transit between Dropbox applications and Dropbox servers; this creates a secure tunnel protected by 128-bit or higher AES encryption. Dropbox applications and infrastructure are regularly tested for security vulnerabilities and hardened to enhance security and protect against attacks. Dropbox files are viewable only by people who have a password-protected link to the file(s). Only members of the evaluation team will have access to these data. In addition to raw data, evaluators will make available copies of data collection and analysis protocols and instruments. No personally identifiable information will be included in raw data that is disseminated. For sites that have actively agreed to be contacted for follow-up telephone interviews, that contact information will be made available to ACM.
Response Rate Monitoring and Reminders. Aurora Consulting will monitor response rates on a weekly basis and provide status reports to ACM. Over the collection period, reminder emails generated through SurveyGizmo will be sent to non-responders and partial responders of the participating museum survey. In the event response rates are lower than expected, ACM may decide to extend each data collection period by two weeks.
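As an illustration of the weekly monitoring step, the sketch below computes the response rate against the universe of approximately 180 participating museums and flags partial and non-respondents for reminders. The tracking file and its column names are assumptions made for illustration.

```python
# Minimal sketch, assuming a hypothetical tracking export with a "Status" column,
# of the weekly response-rate check against the ~180-museum universe.
import pandas as pd

UNIVERSE_SIZE = 180          # approximate number of M4A participating museums
TARGET_RATE = 0.80           # target overall response rate for the museum survey

tracking = pd.read_excel("m4a_response_tracking.xls")
completed = int((tracking["Status"] == "Complete").sum())
rate = completed / UNIVERSE_SIZE
print(f"Completed: {completed} of {UNIVERSE_SIZE} ({rate:.0%}); target {TARGET_RATE:.0%}")

# Partial and non-respondents receive up to two reminder emails via SurveyGizmo.
needs_reminder = tracking[tracking["Status"].isin(["Partial", "Not Started"])]
print(f"Museums flagged for a reminder: {len(needs_reminder)}")
```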
B.4. Tests to Minimize Burden
To avoid duplication of questions posed to M4A participants, Aurora Consulting reviewed the administrative data collected from participating M4A institutions and removed all duplicative survey and interview questions.
Additionally, the online survey mechanism (on SurveyGizmo) has been tested to assess the survey validation routines, skip patterns, and the overall time it takes to enter survey responses. Aurora Consulting and ACM staff conducted this test, essentially test-driving the survey mechanism with dummy data. Aurora Consulting made adjustments to the web-based survey mechanism prior to submission. Reviewers may also view and test the survey at the following practice links:
Participating Museum Survey – https://www.surveygizmo.com/s3/3380713/Museums-For-All-Program-Key-Contact-Online-Survey
Museum Visitor Survey – https://www.surveygizmo.com/s3/3380655/Museums-For-All-Visitor-Survey
B.5. Individuals Responsible for Study Design and Performance
The following individuals are responsible for the study design and the collection and analysis of the data for Museums for All Evaluation.
Personnel Involved with the M4A Evaluation
Person | Address | Email / Phone
Christopher J. Reich, Chief Administrator, Office of Museum Services, Institute of Museum and Library Services | 955 L’Enfant Plaza North, SW | 202-653-4685
Matthew Birnbaum, Ph.D., Senior Evaluation Officer, Institute of Museum and Library Services | 955 L’Enfant Plaza North, SW | 202-653-4760
Brendan Cartwright, Association of Children’s Museums | 2711 Jefferson Davis Highway, Suite 600, Arlington, VA 22202 | 202-218-7712
Sarah Cohn, Aurora Consulting | 1229 Tyler Street NE, Suite 285, Minneapolis, MN 55413 | 612-315-4350