Building a National Network of Museums and Libraries
for School Readiness Project (SRP)
OMB: 3137-0122
Section B. Description of Statistical Methodology
Overview

The Education Development Center (EDC) will conduct an evaluation of the Building a National Network
of Museums and Libraries for School Readiness Project (SRP) in order to document project progress and to
identify factors and processes that are key to establishing and sustaining these networks in six states, as
well as to inform the scale-up of networks to all 50 states. The following goals will guide the evaluation:
• Goal 1. Identify institutional capacities and cross-organizational relationships that support
model outreach, implementation, and sustainability in order to understand elements and
processes that are central to forming, sustaining, and scaling up the network model in all states.
• Goal 2. Identify the ways in which the network model prepares and supports hub leaders, key
partners, collaborating organizations, and families in promoting academic readiness among
young children.
• Goal 3. Document project activities and implementation of the network model to ensure that
the project is on schedule and that activities are being implemented as intended by IMLS and
BCM.
The following evaluation questions will guide this work:
• EQ1: What resources, institutional structures, and cross-organizational relationships support the
successful implementation of the existing network model? (Goal 1)
• EQ2: How do hub leaders, key partners, and collaborating organizations implement the network
model? In what ways do they adapt the model to fit their individual contexts and needs, and
what successes and challenges do they experience? (Goal 1)
• EQ3: How do hub leaders, key partners, and collaborating organizations reach families with
informal learning opportunities, especially those not currently using museums and libraries?
What are the barriers to accessing museums and libraries? (Goal 1)
• EQ4: What strategies and activities do hub leaders, key partners, and collaborating organizations
view as optimal for sustaining existing networks and for rapidly growing and adapting the
network model to all 50 states? What key challenges, including internal and external factors, will
make it difficult for the current model to sustain and grow? (Goal 1)
• EQ5: What do hub leaders, key partners, and collaborating organizations view as key factors for
school readiness, and what aspects of the network model do they see as strengthening their
institutions' capacity to support school readiness? (Goal 2)
• EQ6: In what ways, if any, do families view organizations within state networks as supporting
their young children's school readiness? (Goal 2)
• EQ7: To what extent is the project on schedule, and are activities being implemented as
intended? (Goal 3)
To address these questions, EDC will use a mixed-methods design, pairing quantitative survey data with
qualitative interview data.

B.1 Respondent Universe
The program model for the Building a National Network of Museums and Libraries for School Readiness
Project (SRP) will comprise six state networks. Each network will include: (1) hub leaders (the children's
museum or library that serves as the leader of the hub network); (2) key partners (organizations that
hub leaders currently partner with); and (3) collaborating organizations (new partner organizations that
join as a result of this project). Part of the project work for SRP is recruiting new organizations to
participate in both existing and new state networks.
As shown in Table 1, each state network falls into one of three cohorts. Cohort 1 (Massachusetts) and
Cohort 2 (Virginia and South Carolina) were established prior to this grant. Cohort 3 consists of the three
state networks (Iowa, Mississippi, and New Mexico) that will be established through this grant. During
Year 1 of this three-year grant, Boston Children's Museum (BCM) will recruit and onboard hub leader
and partner organizations for state networks in Cohort 3, as well as new organizations for state networks
in Cohort 1 and Cohort 2. Thus, in Year 1, the evaluation team will collect data only from organizations
that are currently part of the state networks in Cohort 1 and Cohort 2. In Year 2 and Year 3, the
evaluation team will collect data from organizations in state networks from all cohorts.
Table 1. Timeline of State Network Rollout and Evaluation Activities

| Cohort   | State          | Year Established | Data Collection Timeline |
|----------|----------------|------------------|--------------------------|
| Cohort 1 | Massachusetts  | 2016             | Year 1 – Year 3          |
| Cohort 2 | Virginia       | 2018             | Year 1 – Year 3          |
| Cohort 2 | South Carolina | 2018             | Year 1 – Year 3          |
| Cohort 3 | Iowa           | 2020-2021        | Year 2 – Year 3          |
| Cohort 3 | Mississippi    | 2020-2021        | Year 2 – Year 3          |
| Cohort 3 | New Mexico     | 2020-2021        | Year 2 – Year 3          |

The sample sizes we report here are based on an estimate of 40 total organizations spread across the six
state networks. We estimate that each state network will include at least one hub leader organization,
one partner organization, and one collaborating organization. Finally, we anticipate a respondent
universe of families that visit and/or participate in programs at the organizations; however, because we
do not yet know all of the organizations that will be participating, it is impossible to estimate the total
possible universe of families. Across this population, EDC will complete the data collection activities
below. Table 2 summarizes each data collection activity.
• Year 1
o Document review of reports and documentation from the previous grants that supported the SRP network model
o Interview the staff lead at each of the three hub leader sites
o Survey the staff lead at each of the three hub leader sites and one staff lead at each of three key partner sites
• Year 2
o Interview a subset of staff leads (n=8) from hub leader sites, a subset of staff leads (n=6) from key partner sites, and a subset of staff leads (n=6) from collaborating sites
o Survey staff leads at all hub leader sites, all key partner sites, and all collaborating sites. We do not yet know the final number of sites, but we estimate it will be about 40.
o Conduct focus groups (n=2). During the Year 2 national meeting, EDC will provide a focus group training to hub and partner organizations, who will conduct their own focus groups. Across six states, we anticipate there will be a total of 30 focus groups per year (60 total), with approximately 8 participants per focus group.
• Year 3
o Interview a subset of staff leads (n=8) from hub leader sites, a subset of staff leads (n=6) from key partner sites, and a subset of staff leads (n=6) from collaborating sites
o Survey staff leads at all hub leader sites, all key partner sites, and all collaborating sites. We do not yet know the final number of sites, but we estimate it will be about 40.
o Conduct focus groups (n=2 focus groups; 16 participants in total). During the Year 2 national meeting, EDC will provide a focus group training to hub and partner organizations, who will conduct their own focus groups. Across six states, we anticipate there will be a total of 30 focus groups per year (n=60 focus groups total), with approximately 8 participants per focus group (n=480 participants in total).
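For reference, the planned counts described above can be tallied directly from the figures in this section. The short Python sketch below is illustrative only (it is not part of the evaluation procedures) and simply recomputes the expected interview, focus group, and participant totals for Years 2 and 3.

```python
# Sketch: recompute the planned data-collection counts stated above.
# These figures are planning estimates, not final sample sizes.

INTERVIEWS_PER_YEAR = {"hub leader": 8, "key partner": 6, "collaborating": 6}  # Years 2-3
FOCUS_GROUPS_PER_YEAR = 30           # across six states
PARTICIPANTS_PER_FOCUS_GROUP = 8     # approximate
YEARS_WITH_FULL_COLLECTION = 2       # Year 2 and Year 3

interviews_per_year = sum(INTERVIEWS_PER_YEAR.values())              # 20
total_interviews = interviews_per_year * YEARS_WITH_FULL_COLLECTION  # 40

focus_groups_total = FOCUS_GROUPS_PER_YEAR * YEARS_WITH_FULL_COLLECTION       # 60
participants_per_year = FOCUS_GROUPS_PER_YEAR * PARTICIPANTS_PER_FOCUS_GROUP  # 240
participants_total = participants_per_year * YEARS_WITH_FULL_COLLECTION       # 480

print(interviews_per_year, total_interviews, focus_groups_total, participants_total)
```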

Table 2. Summary of Data Collection Activities

| Eval Question (Goal) | Method | Mode of Collection | Participant Group(s)* | Date of Data Collection | Corresponding Question(s) from Instruments** |
|---|---|---|---|---|---|
| EQ1 (Goal 1) | Document review | Review of reports and documentation from previous grants | n/a | Year 1 | n/a |
| EQ1 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q4 - Q6; Q21 |
| EQ1 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q4 - Q6; Q20 |
| EQ1 (Goal 1) | Survey I | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner organizations | Year 1 | Q11 |
| EQ2 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q2 - Q5; Q12; Q14; Q18 |
| EQ2 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q2; Q4 - Q5; Q12; Q14; Q17 |
| EQ2 (Goal 1) | Survey I | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner organizations | Year 1 | Q10 - Q14; Q19 |
| EQ2 (Goal 1) | Survey II** | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner/collaborating organizations | Year 2 – Year 3 | Q10 - Q14; Q19 |
| EQ3 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q9; Q11; Q15 |
| EQ3 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q9; Q11; Q15 |
| EQ4 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q4 - Q5; Q19 |
| EQ4 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q4 - Q5; Q19 |
| EQ5 (Goal 2) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q6; Q8 - Q9; Q11; Q13; Q16 |
| EQ5 (Goal 2) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q6 - Q7; Q11; Q13; Q16 |
| EQ5 (Goal 2) | Survey I | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner organizations | Year 1 | Q15 - Q18 |
| EQ5 (Goal 2) | Survey II** | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner/collaborating organizations | Year 2 – Year 3 | Q15 - Q18 |
| EQ6 (Goal 2) | Focus group*** | In-person | Adult from family participating in SRP through hub and partner organizations | Year 2 – Year 3 | All |
| EQ7 (Goal 3) | n/a**** | n/a | n/a | n/a | n/a |

*Hub refers to the statewide partnerships between and across museums, libraries, community organizations, and early care and education provider
networks. Hub leaders are the children's museum or library that serves as leader of the hub. Key partners are organizations that hub leaders are
currently partnering with. Collaborating organizations are new key partner organizations that join the hub as a result of this project.
**Survey II will include items from Survey I, along with additional items we develop as a result of findings from Year 1 data collection. For the purposes of
this table, Survey II question #'s in the last column refer to question #'s from Survey I.
***The EDC evaluation team will conduct four focus groups (two in Year 2; two in Year 3). During the Year 2 national meeting, EDC will provide a focus
group training to hub and partner organizations, who will conduct their own focus groups. Across six states, we anticipate there will be a total of 30 focus
groups per year (60 total), with approximately 8 participants per focus group.
****EDC will address EQ7 by documenting BCM's progress in carrying out the project activities. Note that there are no data collection activities
associated with EQ7; rather, EDC will address this evaluation question through updates from BCM via email correspondence.

B.2. Potential Respondent Sampling and Selection Methods
In order to identify institutional capacities and cross-organizational relationships that support successful
model implementation, and to identify the ways in which the network model prepares and supports
organizations and families in promoting academic readiness, we will conduct annual surveys with a staff
lead from all participating organizations (i.e., the entire universe of respondents). The universe of
Year 1 respondents will include organizations in the state networks that are part of Cohort 1 and Cohort
2 (see Table 1). Organizations that are part of the Cohort 3 state networks will be onboarded by Boston
Children's Museum at the end of Year 1, and therefore will not be part of the Year 1 respondent universe.
The universe of respondents in Year 2 and Year 3 will include organizations from all state networks
across all cohorts. We will survey the same staff lead each year (assuming the staff lead has not left the
organization or changed roles). Since we will be surveying all staff leads in all participating organizations,
sampling is unnecessary.
To capture variation in model implementation and experiences across state networks, local contexts,
and program levels, each year we also will conduct semi-structured interviews with staff leads from a
subset of hub leader, key partner, and collaborating organizations across the state networks. The sample
frame for the Year 1 interviews (n=3) will consist of the staff leads at hub leader organizations in the
existing state networks (see Cohort 1 and Cohort 2 in Table 1). In Year 2 and Year 3, the sampling frame
for the interviews will consist of the staff leads from hub leader organizations (n=8 per year; 16 total),
key partner organizations (n=6 per year; 12 total), and collaborating organizations (n=6 per year; 12
total) across all state networks and cohorts. In Year 2 and Year 3, focus groups will be conducted (n=30
focus groups per year; 60 focus groups total) with a subset of families (n=240 families per year; n=480
total). The sample frame will include families who engage with hub leader, key partner, and
collaborating organizations across all states and cohorts. Note that the EDC evaluation team will conduct
four of the focus groups (two in Year 2; two in Year 3). During the Year 2 national meeting, EDC will
provide a focus group training to hub, partner, and collaborating organizations, who will conduct their
own focus groups. Across six states, we anticipate there will be a total of 30 focus groups per year (60
total), with approximately 8 participants per focus group. To select the subsets for the interviews and
focus groups, we will employ purposive sampling, specifically maximum variation sampling.1 This
sampling approach allows us to maximize the diversity of responses and to learn about implementation
across a heterogeneous group of settings. Boston Children's Museum (BCM) worked with IMLS to identify
a new cohort of states to implement the network model. BCM and IMLS sought states with diversity
related to geography, community type (urban, rural, tribal), and populations served (e.g., dual language
households). EDC will select the sub-sample for interviews and focus groups based on these three
characteristics, making sure the final sub-sample is representative of this diversity. We will make every
effort to ensure that our interview sample is representative of these characteristics; however, to
account for the possibility of selection bias, we will compare the characteristics of any organization that
opts not to participate in interviews with the characteristics of the organizations that do. All analysis
and reporting will document the extent and nature of these differences.

1 Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of Mixed Methods Research, 1(1), 77-100.
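For illustration, the sketch below shows one way maximum variation sampling could be operationalized in Python. The organizations, attribute values, and scoring function are hypothetical assumptions for this example only; they do not represent EDC's actual selection procedure, which will be based on the geography, community type, and populations-served characteristics described above.

```python
# Illustrative sketch of maximum variation (purposive) sampling.
# Organizations and attribute values below are hypothetical examples only.
from itertools import combinations

organizations = [
    {"name": "Org A", "geography": "Northeast", "community": "urban",  "population": "dual language"},
    {"name": "Org B", "geography": "Southeast", "community": "rural",  "population": "general"},
    {"name": "Org C", "geography": "Midwest",   "community": "tribal", "population": "dual language"},
    {"name": "Org D", "geography": "Southwest", "community": "urban",  "population": "general"},
    {"name": "Org E", "geography": "Southeast", "community": "rural",  "population": "dual language"},
]

ATTRIBUTES = ("geography", "community", "population")

def diversity_score(subsample):
    """Count the distinct attribute values covered by a candidate subsample."""
    return sum(len({org[attr] for org in subsample}) for attr in ATTRIBUTES)

def maximum_variation_sample(orgs, k):
    """Return the k organizations whose combination covers the most distinct attribute values."""
    best = max(combinations(orgs, k), key=diversity_score)
    return list(best)

if __name__ == "__main__":
    for org in maximum_variation_sample(organizations, k=3):
        print(org["name"], org["geography"], org["community"], org["population"])
```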

B.3. Response Rates and Non-Responses

We anticipate high response rates across all data collection activities (between 85% and 100%), given the
close working relationship between Boston Children's Museum and the participating organizations and
the small number of respondents. To reach these response rates, we will follow recommendations from
the literature and lessons from previous work.2,3 For example, to foster participation, Boston
Children's Museum and EDC will provide participating organizations with a detailed overview of
evaluation activities at the yearly meetings, establish strong channels of communication, and provide
adequate notification and time to complete each data collection activity. We recognize that missing data
can undermine the findings of an evaluation. If response rates to the survey fall below 80% (the
response rate threshold recommended by OMB), we will conduct a missing data analysis to examine
whether the data are missing at random or whether there are differences in the characteristics of
organizations that responded and those that did not. If we find that there are differences and that the
data are not missing at random, we will select appropriate procedures for handling missing data (e.g.,
weighting).
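As an illustration of the kind of diagnostic that could accompany such a missing data analysis, the Python sketch below computes a survey response rate and compares responding and non-responding organizations on one observed characteristic. The roster, column names, and use of a chi-square test are assumptions made for this example; they are not prescribed by the evaluation plan.

```python
# Illustrative non-response check: compare characteristics of responding vs.
# non-responding organizations. Data and column names below are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical roster of participating organizations.
orgs = pd.DataFrame({
    "org_type": ["hub leader", "key partner", "collaborating", "key partner",
                 "collaborating", "hub leader", "key partner", "collaborating"],
    "responded": [True, True, False, True, True, True, False, True],
})

response_rate = orgs["responded"].mean()
print(f"Survey response rate: {response_rate:.0%}")  # OMB guidance flags rates below 80%

if response_rate < 0.80:
    # Cross-tabulate organization type by response status and test for association.
    table = pd.crosstab(orgs["org_type"], orgs["responded"])
    chi2, p_value, dof, _ = chi2_contingency(table)
    print(table)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
    # A small p-value would suggest the data are not missing completely at random,
    # in which case adjustments such as non-response weighting could be considered.
```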

2 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method. John Wiley & Sons.
3 Monroe, M. C., & Adams, D. C. (2012). Increasing response rates to web-based surveys. Journal of Extension, 50(6), 6-7.

B.4. Tests of Procedures and Methods

In developing the instruments for the semi-structured interviews, focus groups, and Year 1 survey, we
drew on and adapted items from existing instruments from current and previous work, creating new
items as necessary. In developing the survey, we included the required performance measure items from
IMLS.4 Furthermore, we drew from literature related to emergence,5 social innovation,6,7 social network
analysis,8,9 and social-emotional learning.10,11 EDC will use findings that emerge from the Year 1
interviews and survey to refine and revise the survey for Years 2 and 3. For example, we will likely
analyze open-ended items from the Year 1 survey to develop closed-ended items for the revised survey.
Data analysis
Analysis of quantitative data. We will use statistical software (such as STATA) to conduct descriptive
analyses of closed-ended survey items. After the data have been cleaned, researchers will calculate
means and standard deviations for continuous measures and frequency tables for discrete measures.
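The same descriptive summaries could be produced in any statistical package; the pandas sketch below is a minimal illustration, with hypothetical variable names and values, of the intended output: means and standard deviations for continuous measures and frequency tables for discrete measures.

```python
# Illustrative descriptive analysis of closed-ended survey items.
# Variable names and values are hypothetical placeholders.
import pandas as pd

survey = pd.DataFrame({
    "partnership_count": [3, 5, 2, 4, 6],                     # continuous measure
    "org_role": ["hub leader", "key partner", "key partner",  # discrete measure
                 "collaborating", "hub leader"],
})

# Means and standard deviations for continuous measures.
continuous_summary = survey[["partnership_count"]].agg(["mean", "std"])
print(continuous_summary)

# Frequency table (counts and percentages) for discrete measures.
frequencies = survey["org_role"].value_counts()
percentages = survey["org_role"].value_counts(normalize=True).mul(100).round(1)
print(pd.DataFrame({"n": frequencies, "percent": percentages}))
```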
Analysis of qualitative data. Data from the document review, interviews, and focus groups will be
transcribed and analyzed using qualitative coding software (such as Dedoose). We will conduct a content
analysis, a systematic analytical technique that is particularly useful for analyzing text data.12
Given that research on the processes and principles for establishing and sustaining networks across
libraries and museums is limited, we will follow the conventional approach to content analysis. Using
this inductive approach, two researchers will engage in multiple reviews of the data. Through these
initial reviews we will identify overarching themes related to our research questions and generate a
coding scheme that we will apply to the data during a second round of review. To ensure consistency
across coders, we will double-code a subset of data, discussing and resolving differences as necessary.
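As a complement to resolving coding differences through discussion, a quantitative agreement check is sometimes reported for a double-coded subset. The sketch below is a hypothetical example using Cohen's kappa; the codes shown are invented for illustration, and this statistic is not a stated requirement of the evaluation plan.

```python
# Illustrative inter-rater agreement check for a double-coded subset.
# Codes below are hypothetical; the evaluation plan itself specifies
# resolving discrepancies through discussion.
from sklearn.metrics import cohen_kappa_score

# Codes applied by two researchers to the same ten excerpts (hypothetical).
coder_1 = ["sustainability", "outreach", "outreach", "capacity", "sustainability",
           "capacity", "outreach", "sustainability", "capacity", "outreach"]
coder_2 = ["sustainability", "outreach", "capacity", "capacity", "sustainability",
           "capacity", "outreach", "outreach", "capacity", "outreach"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa: {kappa:.2f}")  # values near 1 indicate strong agreement

# Excerpts where the coders disagree can then be flagged for discussion.
disagreements = [i for i, (a, b) in enumerate(zip(coder_1, coder_2)) if a != b]
print("Excerpts to reconcile:", disagreements)
```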

4 Institute of Museum and Library Services (2019). National Leadership Grants for Museums: FY 2019 Notice of Funding Opportunity (IMLS-CLR-D-0024). Retrieved from https://reginfo.gov/public/do/DownloadDocument?objectID=84159201
5 Wheatley, M., & Frieze, D. (2006). Lifecycle of emergence: Using emergence to take social innovation to scale. Retrieved from http://www.margaretwheatley.com/articles/emergence.html
6 Ayob, N., Teasdale, S., & Fagan, K. (2016). How social innovation 'came to be': Tracing the evolution of a contested concept. Journal of Social Policy, 45(4), 635-653.
7 Mulgan, G., Tucker, S., Ali, R., & Sanders, B. (2007). Social innovation: What it is, why it matters and how it can be accelerated.
8 Carrington, P. J., Scott, J., & Wasserman, S. (Eds.). (2005). Models and methods in social network analysis (Vol. 28). Cambridge University Press.
9 Freeman, L. (2004). The development of social network analysis: A study in the sociology of science. Empirical Press.
10 Catalano, R. F., Berglund, M. L., Ryan, J. A., Lonczak, H. S., & Hawkins, J. D. (2004). Positive youth development in the United States: Research findings on evaluations of positive youth development programs. The Annals of the American Academy of Political and Social Science, 591(1), 98-124.
11 Collaborative for Academic, Social, and Emotional Learning [CASEL]. (2005). Safe and sound: An educational leader's guide to evidence-based social and emotional learning programs – Illinois edition. Chicago, IL.
12 Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288.

B.5. Contact Information for Statistical or Design Consultants

EDC
Project Director: Wendy Martin, Research Scientist, [email protected]
Project Lead: Michelle Cerrone, Senior Research Associate, [email protected]

IMLS
Reagan Moore, Program Officer, [email protected]
Marvin Carr, Evaluation Officer, [email protected]

