Museum Capacity-Building Programs Assessment Project

OMB: 3137-0121


Market Analysis and Opportunity Assessment of

Museum Capacity-Building Programs

Supporting Statement for PRA Submission


Part B: Description of Research/Statistical Methodology


B.1. Survey Respondent Universe and Response Rate Estimation


B.1.1. Universe


The respondent universe for the opinion survey portion of the Market Analysis and Opportunity Assessment of Museum Capacity-Building Programs will be 28,557 small- and medium-sized museums derived from the IMLS Museum Data Files. The target sample of 3,000 museums will be representative of all disciplines (art, history, natural history, botanic gardens, aquariums, zoos, children’s museums, science centers, historical societies, etc.), institutional sizes, regions of the country, and place types (e.g., rural, suburban, city).


Table 3 shows the museum universe stratified by the key dimensions of region, income (as a proxy for institutional size), type of museum, and place type (e.g., rural, suburban, city). This stratification is discussed in more detail in section B.1.3.


Table 3. Museum Universe

Region abbreviations: NE/MA = New England and Mid-Atlantic; SE = Southeast; MW = Midwest; MP/W = Mountain Plains and West. Each region is divided into City, Suburb, and Rural* place types.

Art, History, and Natural History Museums

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 169 | 231 | 211 | 238 | 134 | 359 | 179 | 111 | 297 | 411 | 196 | 555 | 3091 |
| Medium | 51 | 44 | 55 | 83 | 30 | 38 | 63 | 17 | 34 | 126 | 34 | 101 | 676 |

Botanic Gardens, Aquariums, and Zoos, and Nature Centers

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 64 | 79 | 87 | 72 | 57 | 114 | 62 | 46 | 127 | 136 | 61 | 172 | 1077 |
| Medium | 15 | 24 | 16 | 20 | 12 | 12 | 20 | 6 | 29 | 42 | 12 | 43 | 251 |

Children's Museums, Science Centers, Science & Technology Museums, and Planetariums

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 64 | 68 | 65 | 81 | 49 | 86 | 57 | 28 | 63 | 148 | 50 | 164 | 923 |
| Medium | 19 | 23 | 7 | 58 | 9 | 7 | 35 | 8 | 7 | 51 | 10 | 10 | 244 |

Other Museums

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 479 | 526 | 468 | 494 | 294 | 728 | 341 | 285 | 712 | 826 | 361 | 1327 | 6841 |
| Medium | 91 | 61 | 65 | 68 | 33 | 56 | 71 | 24 | 64 | 150 | 36 | 87 | 806 |

Historical Societies and Historic Preservation Organizations

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 494 | 1488 | 1860 | 578 | 498 | 1368 | 457 | 809 | 2210 | 863 | 639 | 2154 | 13418 |
| Medium | 136 | 167 | 127 | 113 | 39 | 71 | 104 | 49 | 99 | 143 | 38 | 144 | 1230 |
| Total (all museum types) | 1582 | 2711 | 2961 | 1805 | 1155 | 2839 | 1389 | 1383 | 3642 | 2896 | 1437 | 4757 | 28557 |

*Rural includes towns, both of which are defined relative to distance from urban areas.


For the interviews and virtual focus groups, IMLS and PPG will implement purposive (non-probability) sampling from among museum stakeholders (such as those in the Official Museum Directory, published in partnership with the American Alliance of Museums) and foundations and funders of non-profit capacity-building activities. Because the literature recommends that focus group synthesis and analysis consider both individual contributions and the interaction among group members (Kitzinger 1994; Smithson 2000; Halkier 2010; Grønkjær et al. 2011), IMLS and PPG will design the focus groups to include participants from various types of museums, regions of the country, and location types, consistent with industry best practices.


B.1.2 Estimated Response Rate


IMLS estimates the survey will have a 35% response rate based on the current literature and PPG’s experience with similar surveys. While much of the recent literature[1] reflects social scientists’ long-standing concern about declining response rates (National Research Council 2013),[2] we are confident museums will see the value of participating in the study. Though many of the targeted museums are small and may have limited staff capacity, their feedback is extremely important for this study. Targeted communications, assistance from discipline-specific, regional, and state museum service organizations, and clear articulation of how this study will benefit small- and medium-sized museums will enhance the response rate. IMLS and PPG will collaborate specifically with museum service organizations that work with smaller museums of different disciplines to encourage their constituents to respond to this survey. (See B.3.1 for an outline of the plan to enhance the response rate.)


Because of IMLS’s and PPG’s reputations in the museum and capacity-building sectors, and because PPG consultants may adjust their schedules to meet the needs of interviewees, we anticipate that only 2 of the 22 people we plan to contact will decline or be unavailable. Because focus groups are not one-on-one but require a group of individuals to be available at the same time, we anticipate that 6 of the 20 people we plan to contact will decline or be unavailable for one of the two virtual focus groups, based on past experience with this methodology.
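For reference, these assumptions imply the following approximate yields (an illustrative tally derived from the figures above, not separately stated targets):

  • Survey: 3,000 invitations × 0.35 ≈ 1,050 completed surveys;

  • Interviews: 22 contacted - 2 expected to decline or be unavailable ≈ 20 interviews;

  • Virtual focus groups: 20 contacted - 6 expected to decline or be unavailable = 14 participants across the two groups.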


B.1.3 Respondent Selection


Survey

IMLS and PPG will employ a simple stratified random sample for the opinion survey based on four variables (outlined further in Table 4), as follows:

  • Two (2) income groups (as a proxy for museum size): small, museums with less than $250,000 in revenues, and medium, museums with $250,000 to $4,999,999 in revenues;

  • Five (5) types of museums derived from IMLS’ Museum Data Files: 1) Art, History, and Natural History; 2) Botanic Gardens, Aquariums, and Zoos; 3) Children’s Museums and Science Centers; 4) Other; 5) Historical Societies;

  • Four (4) regions of the country: 1) New England and Mid-Atlantic (CT, DC, DE, MD, MA, ME, NH, NJ, NY, PA, RI, VT); 2) Southeast (AL, AR, FL, GA, KY, LA, MS, NC, SC, TN, VA, WV); 3) Midwest (IA, IL, IN, MI, MN, MO, OH, WI); 4) Mountain Plains and West (AK, AZ, CA, CO, HI, ID, KS, MT, ND, NE, NM, NV, OK, OR, SD, TX, UT, WA, WY);

  • Three (3) place types based on National Center for Education Statistics (NCES) Urban-Centric Locale Codes classifications (city, suburb, and rural).[3]

The purpose of the stratification is to ensure the survey is administered to a representative sample of museums in the universe specified in section B.1.1. A simple random sample will be drawn within each stratum, with a target sample of 3,000 (P(selection) = 0.105). Table 4 shows the number of units to be drawn within each stratum. Note that this stratification is used for sample selection only; for reporting purposes, cells will be aggregated to obtain comparison groups of valid sizes.
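For illustration only, the sketch below shows one way such a stratified draw could be carried out. The frame file name, column names, rounding rule, and use of Python/pandas are assumptions for the example, not a description of PPG's production procedure.

```python
# Illustrative sketch of the stratified draw described above.
# Assumptions: a sampling frame "museum_universe.csv" with one row per museum
# and columns for the four stratification variables; pandas is available.
import pandas as pd

TARGET = 3000
frame = pd.read_csv("museum_universe.csv")   # the 28,557 in-scope museums

# Overall selection probability implied by the target sample size (~0.105).
p_select = TARGET / len(frame)

# Simple random sample within each stratum defined by the four variables;
# Table 4 shows the allocation this yields for the actual frame.
strata = ["museum_type", "income_group", "region", "place_type"]
sample = (
    frame.groupby(strata, group_keys=False)
         .apply(lambda g: g.sample(n=round(len(g) * p_select), random_state=2024))
)
sample.to_csv("survey_sample.csv", index=False)
```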


Interviewees and Virtual Focus Group Participants

A purposive sample of interviewees and virtual focus group participants will be selected, with respondents drawn from across the museum field, including individuals associated with institutions in the IMLS Museum Data Files and the Official Museum Directory. PPG and IMLS will partner to recruit interview and focus group participants. IMLS will make primary contact, as invitations from such a well-respected institution may encourage participation. PPG will follow up with non-respondents, handle scheduling, and answer any questions participants may have.


Interviewees will be individuals at foundations and funders of capacity building for non-profit organizations in general, a sector that includes museums. Some of these interviewees may be museum practitioners familiar with capacity-building offerings. The virtual focus groups will engage a different segment of stakeholders, museum directors and staff, who will be able to assess the survey findings and discuss the challenges associated with capacity building.


Table 4. Simple Stratified Random Sample

Region abbreviations: NE/MA = New England and Mid-Atlantic; SE = Southeast; MW = Midwest; MP/W = Mountain Plains and West. Each region is divided into City, Suburb, and Rural* place types.

Art, History, and Natural History

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 18 | 24 | 22 | 25 | 14 | 38 | 19 | 12 | 31 | 43 | 21 | 58 | 325 |
| Medium | 5 | 5 | 6 | 9 | 3 | 4 | 7 | 2 | 4 | 13 | 4 | 11 | 71 |

Botanic Gardens, Aquariums, and Zoos

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 7 | 8 | 9 | 8 | 6 | 12 | 7 | 5 | 13 | 14 | 6 | 18 | 113 |
| Medium | 2 | 3 | 2 | 2 | 1 | 1 | 2 | 1 | 3 | 4 | 1 | 5 | 26 |

Children's Museums and Science Centers

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 7 | 7 | 7 | 9 | 5 | 9 | 6 | 3 | 7 | 16 | 5 | 17 | 97 |
| Medium | 2 | 2 | 1 | 6 | 1 | 1 | 4 | 1 | 1 | 5 | 1 | 1 | 26 |

Other Museums

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 50 | 55 | 49 | 52 | 31 | 76 | 36 | 30 | 75 | 87 | 38 | 139 | 719 |
| Medium | 10 | 6 | 7 | 7 | 3 | 6 | 7 | 3 | 7 | 16 | 4 | 9 | 85 |

Historical Societies

| Size | NE/MA City | NE/MA Suburb | NE/MA Rural* | SE City | SE Suburb | SE Rural* | MW City | MW Suburb | MW Rural* | MP/W City | MP/W Suburb | MP/W Rural* | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Small | 52 | 156 | 195 | 61 | 52 | 144 | 48 | 85 | 232 | 91 | 67 | 226 | 1410 |
| Medium | 14 | 18 | 13 | 12 | 4 | 7 | 11 | 5 | 10 | 15 | 4 | 15 | 129 |
| Total (all museum types) | 166 | 285 | 311 | 190 | 121 | 298 | 146 | 145 | 383 | 304 | 151 | 500 | 3000 |

*Rural includes towns, both of which are defined relative to distance from urban areas.


B.1.4 Prior Data Collection


Except for the 2014 Heritage Health Information Survey (HHIS), which included a sample of museums, IMLS has not collected data from this universe of respondents.


B.2. Procedures for the Collection of Information


There are three components to the data collection:

  • A survey of a stratified random sample of professionals at 3,000 museums;

  • Interviews with up to 22 individuals; and

  • Two virtual focus groups with an expected 14 participants total.


B.2.1. Data Collection


Survey

The survey instrument in Appendix B was reviewed by the Steering Committee, which made suggestions about clarifying skip logic. The Steering Committee was very positive about the survey as presented, with members citing its structure (i.e., three main sections: institutional characteristics, followed by a self-assessment of capacity building, and concluding with experience with capacity building) as a strength.


After receiving Office of Management and Budget (OMB) approval, PPG will prepare to administer the survey. Preparation for administration includes:

  • Finalizing the survey in the SoGoSurvey platform;

  • Drawing a simple stratified random sample according to Table 4. For each selected institution, PPG will finalize contact information from the Official Museum Directory,[4] including up to two individuals within each museum who can be contacted to complete the survey;

  • Conducting final preparations for storing, cleaning, and analyzing data, including any necessary coding protocol for qualitative analysis;

  • Following a communication schedule with the sample of museums (see Part A.16), including an initial e-mail, up to four reminders, and a thank-you;

  • Requesting assistance from museum associations in collaboration with IMLS, such as the American Alliance of Museums (AAM), the American Association for State and Local History (AASLH), the Association of Science-Technology Centers (ASTC), the Association of Art Museum Directors (AAMD), the Association of Children’s Museums (ACM), the Association of Science Museum Directors (ASMD), the Association of Zoos & Aquariums (AZA), the Coalition of State Museum Associations (COSMA), and the six regional museum associations, so they can encourage their memberships to participate in the survey;

  • Administering the survey and tracking participation;

  • Compiling survey data and performing preliminary analysis;

  • Conferring with the Steering Committee and the Subject Matter Expert Committee formed for this study, as outlined in Part A.2, to guide further data analysis and the subsequent findings report; and

  • Developing final recommendations and a report incorporating survey analysis and findings.


Survey Design/Data Collection Tools

The survey data will be collected via SoGoSurvey, which minimizes user technology requirements for both hardware and software, so respondents can participate via personal computers, laptops, tablets, or smartphones. SoGoSurvey prioritizes security to ensure participant data remains confidential and protected. Additionally, SoGoSurvey helps to minimize response burden, restricts multiple survey responses, ensures timely submissions, and provides the paradata needed for tracking and for subsequent analyses of response propensity. To further minimize response burden, PPG will instruct survey takers that they may select the “Don’t know,” “Not applicable,” or “Prefer not to answer” response options where appropriate. PPG will administer data collection and will be responsible for data cleaning and editing, data analysis, and report writing.


Interviews

The up to 22 interviewees will be individuals at foundations and funders of capacity building for non-profit organizations in general, a sector that includes museums. Interviews of approximately 60 minutes will be conducted via phone or Zoom video conference by highly trained and skilled interviewers. Notes will be taken during the interviews, and a digital recording will be made to ensure all interviewee contributions are captured verbatim for further analysis. Appendix A shows the questions that will guide the interviews, with interviewers striving for a conversational experience. PPG’s formally trained interviewers will ask probing questions to clarify interpretation of interviewee responses and to verify that the interviewer properly understood the meaning of the interviewee’s contributions. Names and identifying information will be redacted from the post-interview short narrative summaries. However, PPG will seek permission from respondents to use identifying information in case they wish to be acknowledged or to have their ideas properly attributed.


Virtual Focus Groups

Up to 20 directors and staff at museums will be contacted to participate in one of two virtual focus groups. A time will be set for each group that is convenient for the most people. PPG expects that, of the 20 people contacted,[5] 14 will participate across the two groups, given the usual scheduling challenges. In PPG’s experience, this strategy will yield enough participants for the synergy of thought and ideas that is the hallmark of the focus group method. Consistent with research by Ochieng et al. (2018),[6] the virtual focus group sessions will be 90 minutes in length.


During the focus group, a skilled facilitator who understands the findings from the survey and the interviews will guide the group to add more details and garner more ideas about how to build capacity, especially for small- and medium-sized museums. Participants will be asked at the end of the virtual focus group whether they wish to have their names and other identifying information redacted or preserved in report materials, including an acknowledgement statement for the report.


The documented advantages of the focus group methodology include the opportunity to build on group dynamics to explore issues in context, depth, and detail, without imposing the conceptual framework of a structured individual interview.[7] In this particular case, engaging directors and staff from different types of museums in a facilitated conversation about the survey findings will shed light on the motivations and barriers that drive participation in capacity building, and will provide a more nuanced understanding of capacity-building participants’ ability to adopt, sustain, and evaluate their capacity-building initiatives.

PPG will conduct the virtual focus groups using Zoom video conference or via phone. A skilled PPG facilitator with extensive focus group experience will run the groups and manage the conversation, while a separate PPG consultant with training in qualitative data collection will take notes. Focus groups will not be recorded; in PPG’s experience, participants tend to speak more openly when they know they are not being recorded. PPG consultants have been trained in third-party facilitation and will not guide or lead the conversation in a particular direction, but will adhere to the thoughtfully constructed interview guide, asking probing questions as needed.


Contacting Respondents, Interviewees and Virtual Focus Group Participants

See Appendix C for sample communications.


  • Survey Communications

Because of the large sample, and to relieve IMLS of administrative responsibility, PPG will issue all communications to survey participants. PPG will be the primary point of contact for questions about the questionnaire, and its contact information is provided in this document (see Part B.5). Survey respondents will receive an introductory email, a second email with a unique link to the SoGoSurvey system, and up to four reminder emails. Only participants who have not completed the survey will receive reminders.

  • Association/Network Reminder Emails: IMLS and PPG will reach out to regional, state, and local association/network organizations as needed to encourage participation. See Part B.3.1 for a list of potential association/network organizations.

  • Reminder Phone Calls: PPG will reach out to non-responders via telephone to encourage participation as needed. No museum will be contacted more than once via telephone unless they specifically request additional communication.

  • Thank You Email: All respondents will receive a thank you note from PPG and will be sent a link to the final report when it is available on the IMLS website.


  • Interview and Virtual Focus Group Communications

PPG and IMLS will partner to recruit interview and focus group participants. IMLS will make the primary contacts, as its position as a federal agency may encourage participation. PPG may follow up with non-respondents, handle scheduling, and answer any questions participants may have. All interviewees and virtual focus group participants will receive a copy of key findings from the survey in advance of their session, to which they can refer.


B.2.2. Statistical Methodology


PPG will summarize survey data by calculating descriptive statistics and coding qualitative responses, which will be organized and presented in the report. IMLS and PPG will collaborate on data presentations for results reporting. A series of analyses will be performed using appropriate statistical techniques such as regression analysis, analysis of variance, and, depending on data quality, parametric and non-parametric tests. Each case will have a sampling weight based on its likelihood of selection and response propensity, and analyses will be weighted appropriately to represent the universe.
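As a simple illustration of the weighting described above, the sketch below computes a design weight from the within-stratum selection probability and adjusts it for response propensity within weighting classes. The file and column names, and the use of the design strata as weighting classes, are assumptions for the example rather than PPG's specified procedure.

```python
# Illustrative weighting sketch (file and column names are assumed).
import pandas as pd

sample = pd.read_csv("survey_sample.csv")
strata = ["museum_type", "income_group", "region", "place_type"]

# Design (base) weight: inverse of the within-stratum selection probability,
# i.e., stratum population count over stratum sample count.
sample["base_weight"] = sample["stratum_pop_n"] / sample["stratum_sample_n"]

# Nonresponse adjustment: inflate respondents' weights by the inverse of the
# observed response rate within each weighting class (here, the design strata).
class_response_rate = sample.groupby(strata)["responded"].transform("mean")
sample["final_weight"] = sample["base_weight"] / class_response_rate

# Weighted respondent totals should then approximate the universe (~28,557).
respondents = sample[sample["responded"] == 1]
print(round(respondents["final_weight"].sum()))
```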


Most questions in the survey are fixed-choice items, but there are three open-ended items and ten items allowing respondents to supply details about an “other” category associated with the question stem. There are also four questions asking the respondent to enter a number of staff. All “other” category and open-response questions will be thematically analyzed. Any quotes from open-ended responses used to illustrate points in the report will be redacted to remove identifying information.


PPG will also thematically analyze all qualitative data from the interviews and focus groups, including the digital recordings and interviewer notes from the individual interviews[8] as well as the note-taker’s notes from the virtual focus groups. All qualitative data will be analyzed using NVivo qualitative analysis software to identify themes, which will then be displayed in tables and graphs with special consideration to protecting respondent confidentiality. Any identifying information will be redacted from the draft and final reports.


B.3. Methods to Maximize Response Rates and to Deal with Non-Response


B.3.1. Maximizing Response Rates


IMLS and PPG will seek to maximize response rates using the tactics below, all of which are expected to facilitate higher levels of response in the face of the general decline in response rates over the past 20 years (National Research Council 2013).[9] See Part A.16 for the complete communication schedule. Sample communications are provided in Appendix C.


Clear Communications to Survey Respondents, Interviewees and Virtual Focus Group Participants

PPG will carefully plan its communications, including the initial email, email with the survey link, and reminder emails. IMLS and PPG will ensure communications are clear and concise and that adequate information is given to each respondent for their full participation.


Reminders from PPG

As noted, PPG will track who has and has not completed the survey via the SoGoSurvey platform and will send up to four reminders, at one-week intervals from the date the link is first shared, to those who have not yet completed the survey.[10] If necessary, PPG will also call those who have not completed the survey. Sample communications for both reminder emails and phone calls can be found in Appendix C.


Email Reminders from Associations/Networks

The survey response rate will likely be enhanced by collaboration with museum associations/networks, such as the American Alliance of Museums (AAM), the American Association for State and Local History (AASLH), the Association of Science-Technology Centers (ASTC), the Association of Art Museum Directors (AAMD), the Association of Children’s Museums (ACM), the Association of Science Museum Directors (ASMD), the Association of Zoos & Aquariums (AZA), the Coalition of State Museum Associations (COSMA), and the six regional museum associations. IMLS will reach out to these associations, as needed, to inform and encourage museums to participate in the survey. To maintain respondent confidentiality, museum associations/networks will not be informed of which respondents complete the survey.


B.3.2. Approaches to Non-responders


A unique identifier will be used to track respondents so that only non-respondents will receive subsequent reminders. Starting one week after the initial survey link is sent, there will be a maximum of four e-mailed reminders, each sent about one week apart, until the survey is closed (the survey will be open for about a month). Additionally, should it be necessary to do so, PPG will make follow-up phone calls to non-responders during the third week of the survey period.
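A minimal sketch of this tracking logic follows, assuming a simple case-management table keyed by the unique identifier with a completion flag and a running count of reminders sent; all file, column, and function names here are illustrative placeholders, not SoGoSurvey's actual interface.

```python
# Illustrative reminder-tracking sketch; names are placeholders.
import pandas as pd

MAX_REMINDERS = 4   # up to four e-mailed reminders, about one week apart

tracking = pd.read_csv("tracking.csv")   # columns: case_id, completed, reminders_sent

# Only non-respondents below the reminder cap are queued for another e-mail.
due = tracking[(tracking["completed"] == 0) &
               (tracking["reminders_sent"] < MAX_REMINDERS)]

for _, row in due.iterrows():
    # Placeholder for the actual e-mail step handled through SoGoSurvey.
    print(f"Queue reminder {row['reminders_sent'] + 1} for case {row['case_id']}")

# Record that another reminder has been sent to these cases.
tracking.loc[due.index, "reminders_sent"] += 1
tracking.to_csv("tracking.csv", index=False)
```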


B.3.3. Assessment of Non-Response Bias


Survey nonresponse has been the subject of much robust research (for a comprehensive review, see National Research Council 2013). PPG will run a series of tests for time-related response bias, including comparing the scores of early responders with those of late responders to determine whether the two groups answer similarly. This will be done through t-tests comparing the two groups on the key variables outlined in B.1.3 (region, income [as a proxy for institutional size], type of museum, and place type [e.g., rural, suburban, city]) and on response propensity hypotheses such as:

  • Respondents are more likely to be prior participants in IMLS’s MAP or CAP programs (please see questions 27 through 30 in the survey in Appendix B).

  • Response propensity will follow a U-shaped distribution as a function of museum size (please see the size variable in B.1.3).

PPG will also analyze the likelihood of response based on the four stratification variables outlined in B.1.3 (region, income, type of museum, and place type) to ensure that the sample is representative of the universe of U.S. museums. For factors with smaller-than-desired cell counts, PPG will conduct bootstrap analyses, resampling the data approximately 1,000 times and examining, for each replicate, the likelihood that a significant effect for the factor would occur based on the data collected.
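The sketch below illustrates the early-versus-late comparison and the resampling check described above. The data file, the two-week cutoff for "early" responders, and the outcome variable are assumptions for the example; only the t-test approach and the roughly 1,000 replicates follow the text.

```python
# Illustrative nonresponse-bias checks; file, cutoff, and variable names are assumed.
import numpy as np
import pandas as pd
from scipy import stats

resp = pd.read_csv("responses.csv")   # assumed columns: response_week, capacity_score

# Early vs. late responders on a key variable (late responders serve as a
# rough proxy for nonrespondents).
early = resp.loc[resp["response_week"] <= 2, "capacity_score"]
late = resp.loc[resp["response_week"] > 2, "capacity_score"]
t_stat, p_val = stats.ttest_ind(early, late, equal_var=False)
print(f"Early vs. late responders: t = {t_stat:.2f}, p = {p_val:.3f}")

# Resampling check for a sparsely populated factor: redraw the data ~1,000
# times and record how often a significant early/late difference appears.
rng = np.random.default_rng(2024)
n_reps = 1000
significant = 0
for _ in range(n_reps):
    early_bs = rng.choice(early.to_numpy(), size=len(early), replace=True)
    late_bs = rng.choice(late.to_numpy(), size=len(late), replace=True)
    significant += stats.ttest_ind(early_bs, late_bs, equal_var=False).pvalue < 0.05
print(f"Share of replicates with p < 0.05: {significant / n_reps:.2f}")
```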


B.4. Steps to Minimize Burden and to Improve Utility


IMLS and PPG considered how to minimize the burden on study participants while improving the utility of the information.


Survey (See Appendix B)

First, IMLS and PPG took care in designing the survey questions, grouping similar items into thematic categories and pre-testing the survey with Steering Committee members (Part A.8.2) to identify potentially confusing questions or other issues that could affect the reliability and validity of the data. The survey has been streamlined to include only necessary items and to minimize the number of open-ended questions, relying instead on closed-ended questions. The use of “don’t know” and “not applicable” response options reduces the cognitive burden of the survey (for a review, see Krosnick and Presser 2009).[11] Several questionnaire items were taken directly from other IMLS surveys; therefore, there are known response profiles, including those for different types and sizes of cultural institutions. PPG has included skip logic where appropriate to ensure survey takers are exposed only to necessary and relevant questions. Based on the SoGoSurvey calculation, the survey should take 16 minutes to complete; because of the thought and reflection required for some items, PPG estimates an average response time of 20 minutes.[12]


Second, the survey instrument was reviewed by the project Steering Committee to ensure the utility of the survey questions and the clarity of their framing. Additionally, other surveys on similar topics with which IMLS is familiar were used to determine how best to pose establishment-level questions in a way that would minimize respondent cognitive burden and increase response reliability and validity. IMLS administers the Public Libraries Survey annually and implemented the 2014 Heritage Health Information Survey in collaboration with Heritage Preservation and RMC Research Corporation; these instruments contributed several items to the survey. Finally, PPG used a statistically validated survey tool, the Core Capacity Assessment Tool (https://www.tccgrp.com/resource/ccat/), taken by more than 6,000 nonprofits, to inform the design of many questions in the Organizational Capacity Assessment section of the survey.


Third, the survey contains no irrelevant questions; all questions, including Questions 9 through 13, will be used in the survey analysis and are germane to the study. As there have been no previous studies explicitly surveying museums’ perceptions of capacity, this research will be shared with the wider museum and nonprofit field, building on previous studies of capacity building.[13] In addition, these data can be leveraged alongside TCC Group’s CCAT[14] database of 6,000 nonprofits to better understand the state of nonprofit capacity building. This ensures that no questions are unnecessary or irrelevant to the analysis.


Interviews and Virtual Focus Groups (See Appendix A for Guiding Questions)

IMLS and PPG will take a variety of steps to minimize burden on interview and focus group participants. First, we will leverage secondary data from IMLS, the American Alliance of Museums, the Official Museum Directory, and others to minimize the need for redundant questions. Second, PPG will vet interview protocols with IMLS staff and confer with Steering Committee members within the context of the survey findings to ensure the relevance of the questions. All interviews and virtual focus groups will take place after the initial analysis of the survey data in order to build on that analysis and clarify findings. Finally, all interview and virtual focus group protocols are semi-structured discussions, allowing for flexibility and ensuring the relevance of the conversation to the participants.


B.5. Individuals Responsible for Study Design and Performance


The following individuals are responsible for the study design, data collection, and analysis.


| Organization | Person | Address | Email / Phone |
|---|---|---|---|
| Institute of Museum and Library Services | Christopher J. Reich, Chief Administrator, Office of Museum Services | 955 L’Enfant Plaza North, SW, Suite 4000, Washington, DC 20024-2135 | [email protected] / 202-653-4685 |
| Institute of Museum and Library Services | Lisa Frehill, Senior Statistician | 955 L’Enfant Plaza North, SW, Suite 4000, Washington, DC 20024-2135 | [email protected] / 202-653-4649 |
| Partners for Public Good | Tim Hausmann, Associate Consultant | 333 7th Avenue, 9th Floor, New York, NY 10001 | [email protected] / 646-214-0516 |
| Partners for Public Good | Samantha Hackney, Consultant | 333 7th Avenue, 9th Floor, New York, NY 10001 | [email protected] / 212-949-0941 |



[1] According to recent research, even a response rate below 10% is not uncommon for web surveys (Conrad, Couper, Tourangeau, & Peytchev, 2010; Fricker, 2008; Heerwegh, Vanhove, Loosveldt, & Matthijs, 2004; Muñoz-Leiva, Sánchez-Fernández, Montoro-Ríos, & Ibáñez-Zapata, 2010; Porter, 2004; Smyth & Pearson, 2011). Several meta-analyses reveal, for example, that web surveys generally get a 6 to 15% lower response rate compared to other survey modes (Fan & Yan, 2010; Smyth & Pearson, 2011; Vehovar & Lozar Manfreda, 2008).

[2] National Research Council. (2013). Nonresponse in Social Science Surveys: A Research Agenda. Washington, DC: National Academies Press.

[3] The NCES Rural locale code includes Towns; both are defined relative to distance from urban areas. https://nces.ed.gov/surveys/ruraled/definitions.asp.

[4] Official Museum Directory (OMD), available from: http://www.officialmuseumdirectory.com/OMD/home.

[5] According to Krueger (1994), a group of ten participants is considered large enough to gain a variety of perspectives and small enough not to become disorderly or fragmented; with more than 12 members, the group becomes difficult to manage. (Krueger, R. A. (1994). Focus groups: A practical guide for applied research. Thousand Oaks, CA: Sage Publications Inc.)

[6] Ochieng, N. T., Wilson, K., Derrick, C. J., & Mukherjee, N. (2018). "The use of focus group discussion methodology: Insights from two decades of application in conservation." Methods in Ecology and Evolution, 9, 20–32. https://doi.org/10.1111/2041-210X.12860

[7] For more information on focus group methodology, see: Krueger, R. A. (1994). Focus groups: A practical guide for applied research. Thousand Oaks, CA: Sage Publications Inc.; Ochieng, N. T., Wilson, K., Derrick, C. J., & Mukherjee, N. (2018). "The use of focus group discussion methodology: Insights from two decades of application in conservation." Methods in Ecology and Evolution, 9, 20–32. https://doi.org/10.1111/2041-210X.12860; and Morgan, D. L., Krueger, R. A., & King, J. A. (1998). The focus group kit (Vols. 1–6). Thousand Oaks, CA: Sage Publications Inc.

[8] See, for example: Young, J. C., Rose, D. C., Mumby, H. S., et al. (2018). A methodological guide to using and reporting on interviews in conservation science research. Methods in Ecology and Evolution, 9, 10–19. https://doi.org/10.1111/2041-210X.12828

[9] National Research Council. (2013). Nonresponse in Social Science Surveys: A Research Agenda. Washington, DC: National Academies Press.

[10] Recent literature demonstrates an increase in response rates alongside the number of reminders (e.g., Fan & Yan, 2010; Sheehan, 2001), with, however, a maximum of three or four messages (Muñoz-Leiva et al., 2010).

[11] Krosnick, J. A., & Presser, S. (2009). "Question and Questionnaire Design," draft chapter (15 February 2009) for J. D. Wright & P. V. Marsden (Eds.), Handbook of Survey Research (2nd edition). San Diego, CA: Elsevier.

[12] Recent research (Peytchev, 2009) demonstrates the importance of providing respondents with an adequate estimate of the time they will need to complete the survey.

[13] See: Connolly, Paul, and Carol Lukas. (2004). Strengthening Nonprofit Performance: A Funder's Guide to Capacity Building (2nd edition). Saint Paul, MN: Wilder Publishing Center; Brothers, John, and Anne Sherman. (2012). Building Nonprofit Capacity: A Guide to Managing Change through Organizational Lifecycles. San Francisco, CA: Jossey-Bass, a Wiley Imprint; and Foundation Center. (2015). Supporting Grantee Capacity: Strengthening Effectiveness Together. GrantCraft. Available from: https://grantcraft.org/content/guides/supporting-grantee-capacity/#highlights.

[14] The Core Capacity Assessment Tool (CCAT) is an online, survey-based tool designed to collect information from key decision-makers in an organization and create prioritized recommendations for building organizational capacity. Having been used by nonprofits more than 6,000 times, the CCAT is a leading assessment tool for measuring a nonprofit's effectiveness.
