SRP Justification A_9-24-2020

Building a National Network of Museums and Libraries for School Readiness Project (SRP)

OMB: 3137-0122



Section A. Justification


A.1. Necessity of the Information Collection

Project background and overview


The Building a National Network of Museums and Libraries for School Readiness Project (hereafter designated the SRP) is an expression of the mission of the Institute of Museum and Library Services (IMLS) to “advance, support, and empower America's museums, libraries, and related organizations through grantmaking, research, and policy development.” IMLS is funding the Boston Children’s Museum (BCM) a second time in order to scale the SRP into three new states, for a total of six. The SRP’s goal is to amplify the strength of organizations serving early learners and their families by forming networks between and across these organizations. In doing so, this project will prepare museum, library, and other informal early childhood education practitioners to ensure children in all regions, regardless of socio-economic or linguistic background, have the skills needed to enter school prepared for success.


IMLS has leveraged the ability of museums and libraries to promote early learning at the community, state, and national levels and as part of inter-agency initiatives for many years. This project strongly aligns IMLS’s legacy of supporting early learning with one of the goals in its current strategic plan, Transforming Communities: building the capacity of museums and libraries to improve the well-being of their communities.


Promoting lifelong learning has been and continues to be one of IMLS’s strategic plan goals, and this project supports that goal. Our objective is to enable and empower museums and libraries to facilitate the development of various literacies, including early childhood literacies, and to provide resources and tools that help families and caregivers nurture these literacies in early learners. IMLS has supported early childhood initiatives, research studies, and publications involving within-sector and cross-sector partnerships for more than a decade. As museums and libraries become more active in these types of community efforts, IMLS will continue to support them as they drive positive change using informal learning experiences and professional development techniques, such as the “train-the-trainer” approach used in the SRP.


Under the leadership of children's museums and libraries, these statewide partnerships help to forge connections and strengthen existing networks of museums, libraries, community organizations, and early care and education provider networks; build professional capacity for implementing high-quality informal learning experiences for children across the state; and foster family engagement and learning, especially among hard-to-reach and underserved families.


As displayed in Table 1, networks existed in three states prior to this cooperative agreement: Massachusetts (established in 2016), Virginia (2018), and South Carolina (2018). Over the three years of the SRP, Boston Children’s Museum will complete the following activities:

  • maintain and continuously improve the existing network in Massachusetts;

  • scale existing efforts in South Carolina and Virginia;

  • pilot new networks in Iowa, Mississippi, and New Mexico in collaboration with the BUILD Initiative, a national effort supporting state leaders as they develop comprehensive early childhood systems tailored to the needs of their states’ young children; and

  • develop sustainability mechanisms for the network within and among these six states.

Table 1. State Network Rollout Timeline


Cohort   | State          | Year Established
Cohort 1 | Massachusetts  | 2016
Cohort 2 | Virginia       | 2018
Cohort 2 | South Carolina | 2018
Cohort 3 | Iowa           | 2020
Cohort 3 | Mississippi    | 2020
Cohort 3 | New Mexico     | 2020

This project’s vision is that, as more states become involved in this SRP, national meetings and regular communication will shape a nationwide coalition of skilled informal educators from organizations such as museums, libraries, and community-based after-school programs whose goal it is to prepare children and families for school. By the end of three years, there will be six statewide networks in sustaining mode nationwide.


Note that throughout this document, we refer to four key stakeholder groups that participate in the School Readiness through Partnerships model:

  • Hub leader organizations. The children’s museums or libraries that serve as the leaders of networks within states

  • Key partner organizations. Organizations that are currently partnering with hub leaders as a result of previous grants

  • Collaborating organizations. Organizations that will join partnerships with hub leaders and key partners as a result of this current grant

  • Families. Families with young children who participate in activities through hub leader, key partner, and/or collaborating organizations

External Evaluation

The Education Development Center (EDC) will serve as the third-party evaluator for the project, documenting project progress, examining how the School Readiness through Partnerships model builds institutions’ readiness to serve all families in their regions, and ultimately gaining insight into what is necessary to support scale-up of these networks across the country. The proposed evaluation is budgeted at $68,699 and will accomplish the three key goals below.

  • Goal 1. Identify institutional capacities and cross-organizational relationships that support model outreach, implementation, and sustainability in order to understand elements and processes that are central to forming, sustaining, and scaling-up the network model to all states.

  • Goal 2. Identify the ways in which the network model prepares and supports hub leaders, key partners, collaborating organizations, and families in promoting academic readiness among young children.

  • Goal 3. Document project activities and implementation of the network model to ensure that the project is on schedule and that activities are being implemented as intended by IMLS and BCM.

This evaluation does NOT seek to explore or come to any conclusions about the extent to which the SRP impacts children.


About IMLS

IMLS is the primary source of federal support for the nation's libraries and museums. It advances, supports, and empowers America’s museums, libraries, and related organizations through grant making, research, and policy development. IMLS envisions a nation where museums and libraries work together to transform the lives of individuals and communities.


IMLS conducts data collection, analysis, and evaluation of its grant programs and field engagement efforts with the overall goal of continuous improvement. This data collection is authorized by 20 U.S.C. § 9108 (Policy research, data collection, analysis and modeling, evaluation, and dissemination). 


About Boston Children’s Museum

Founded in 1913, Boston Children’s Museum is one of the oldest and most influential children’s museums in the world. The Museum’s exhibits and programs emphasize hands-on engagement and learning through experience, employing play as a tool to spark the inherent creativity, curiosity, and imagination of children. Designed for children and families, Museum exhibits focus on science, culture, environmental awareness, health and fitness, and the arts. In addition to extensive child-centered exhibits, Museum educators develop numerous programs and activities that address literacy, performing arts, science and math, visual arts, cultures, and health and wellness. As one of the largest children’s museums in the world, Boston Children’s Museum also provides museum consulting services and creates award-winning traveling exhibits, staff training curricula, and exhibit kits for museum professionals.

About Education Development Center, Inc.

Education Development Center, Inc. (EDC) is a nonprofit international research and development organization dedicated to improving the quality, effectiveness, and equity of education throughout the United States and in more than 50 other countries. Founded in 1958, the company is acknowledged as a leader in efforts to solve a wide range of educational, health, and social problems and is recognized for the high quality of its training, technical assistance, program and product development, evaluation research, and organizational development.


Prior related studies

The evaluation of BCM’s previous IMLS-funded work focused primarily on assessing the individual products and social networks that resulted from previous grants. Findings from these evaluations informed the refinement of the products and increased understanding of what networks and network connections in the SRP model look like. These evaluations also offered more informal feedback on the experiences of hub leader organizations, which BCM has used to better support participants. However, because the focus of these evaluations was on specific products, there has been no systematic collection or analysis of data that addresses the goals of this particular project. Through the current evaluation, we aim to explore and identify a set of processes and principles for establishing and sustaining networks across libraries and museums.


Why is it being scaled?

Under the current IMLS National Leadership Grants for Museums award (MG-20-15-0057-15), BCM has successfully pilot-tested its Massachusetts network model in South Carolina and Virginia. Working with the BUILD Initiative, BCM has conducted webinars to introduce the state teams to the SRP capacity-building process and is designing, prototyping, and evaluating new program ideas, activities, and materials for replication and dissemination. Through the Building a National Network of Museums and Libraries for School Readiness Project, BCM and BUILD will continue to refine the partnership model for expansion to the additional state teams in Iowa, Mississippi, and New Mexico as part of this cooperative agreement. Going forward, the BCM materials are intended to inspire partners to develop their own programs and practices.


A.2. Purposes and Uses of the Data

The Education Development Center will conduct an evaluation of the Building a National Network of Museums and Libraries for School Readiness Project (SRP) in order to document project progress and to identify factors and processes that are key to establishing and sustaining these networks in six states, as well as to inform the scale-up of networks to all 50 states. The evaluation will also inform whether and how IMLS supports further expansion into more states in the future. The evaluation is primarily formative: it will allow the grantees to improve their practices and tools so that future iterations of this network model are stronger and informed by lessons learned from the evaluation.


Moreover, the information gathered through this project will be used to improve IMLS’s own practices as they relate to addressing the opportunities and challenges facing children’s museums and libraries in reaching populations who have not historically benefited from the rich informal learning experiences these institutions provide. We will use the evaluation findings to generate illustrative case studies with qualitative data that will, along with the learnings gained through individual work with state teams, inform future scale-up efforts. As with all National Leadership Grants for Museums projects, the lessons learned and final reports will be made publicly available and disseminated through blogs, articles, and conference presentations.


The following goals will guide the evaluation:

  • Goal 1. Identify institutional capacities and cross-organizational relationships that support model outreach, implementation, and sustainability in order to understand elements and processes that are central to forming, sustaining, and scaling-up the network model in all states.

  • Goal 2. Identify the ways in which the network model prepares and supports hub leaders, key partners, collaborating organizations, and families in promoting academic readiness among young children.

  • Goal 3. Document project activities and implementation of the network model to ensure that the project is on schedule and that activities are being implemented as intended by IMLS and BCM.

The following evaluation questions will guide this work:

  • EQ1: What resources, institutional structures, and cross-organizational relationships support the successful implementation of the existing network model? (Goal 1)

  • EQ2: How do hub leaders, key partners, and collaborating organizations implement the network model? In what ways do they adapt the model to fit their individual contexts and needs, and what successes and challenges do they experience? (Goal 1)

  • EQ3: How do hub leaders, key partners, and collaborating organizations reach families with informal learning opportunities, especially those not currently using museums and libraries? What are the barriers for accessing museums and libraries? (Goal 1)

  • EQ4: What strategies and activities do hub leaders, key partners, and collaborating organizations view as optimal to sustaining existing networks and exponentially growing and adapting the network model to all 50 states? What are some key challenges including internal and external factors that will make it difficult for the current model to sustain and grow? (Goal 1)

  • EQ5: What do hub leaders, key partners, and collaborating organizations view as key factors for school readiness, and what aspects of the network model do they see as supporting their institutions’ capacities for supporting school readiness? (Goal 2)

  • EQ6: In what ways, if any, do families view organizations within state networks as supporting their young children's school readiness? (Goal 2)

  • EQ7. To what extent is the project on schedule and are activities being implemented as intended? (Goal 3)


To address these questions, EDC will use a mixed-methods design, pairing quantitative survey data with qualitative interview data. We summarize the data collection methods in Table 2. In addition to the data collection activities in Table 2, EDC will address EQ7 by documenting BCM’s progress in carrying out the project activities. Note that there are no data collection activities associated with EQ7; rather, EDC will address this evaluation question through updates from BCM via email correspondence and status meetings.


Table 2. Summary of Data Collection Activities

Method | Collection method | Timeline | Participant Group(s) | Sample | Evaluation Question
Document review | Review of reports and documentation from previous grants that funded the existing network model | Year 1 | n/a | n/a | EQ1
Interview | Video conferencing app (e.g., Zoom) | Year 1 | Staff from hub leader organizations | 3 | EQ1, EQ2, EQ3, EQ4, EQ5
Survey | Web-based survey tool (e.g., Qualtrics) | Year 1 | Staff from hub leader organizations; staff from key partner organizations | All | EQ1, EQ2, EQ5
Interview | Video conferencing app (e.g., Zoom) | Year 2, Year 3 | Staff from hub leader organizations; staff from key partner organizations; staff from collaborating organizations | Hub leaders: 8 per year (16 total); key partners: 6 per year (12 total); collaborating: 6 per year (12 total) | EQ2, EQ3, EQ4, EQ5
Survey | Web-based survey tool (e.g., Qualtrics) | Year 2, Year 3 | Staff from hub leader, key partner, and collaborating organizations | ~40 per year; 80 total* | EQ2, EQ5
Focus group | In-person | Year 2, Year 3 | Adult from family participating in the SRP through hub, partner, and collaborating organizations | 240 participants per year; 480 total** | EQ6

*The Year 2 and Year 3 survey will be administered to a representative from each organization participating in the network. We do not yet know the final number of participating organizations but estimate it will be approximately 40.


**The EDC evaluation team will conduct four focus groups (two in Year 2; two in Year 3). During the Year 2 national meeting, EDC will provide a focus group training to hub and partner organizations, who will conduct their own focus groups. Across six states, we anticipate there will be a total of 30 focus groups per year (60 total), with approximately 8 participants per focus group.


Year 1. The research team will interview staff from current hub leader organizations in Massachusetts, Virginia, and South Carolina (n=3; one per state) to begin to identify resources, institutional structures, and cross-organizational relationships that support implementation and to understand how participants implement the existing model (EQ1, EQ2); strategies they use to reach families (EQ3); ideas for sustaining the network model (EQ4); and how the network supports participating organizations in promoting academic readiness (EQ5). EDC will also work with Boston Children’s Museum to administer a survey to existing hub leader and key partner organizations to understand their current practices around promoting school readiness, particularly through local community engagement and coordination with other partner organizations (EQ1, EQ2), and how participating in the network helps organizations promote academic readiness (EQ5).

Years 2 and 3. The evaluation team will continue to document project activities in Year 2 and Year 3. Additionally, each year EDC will conduct an interview study with subsets of hub leader organizations (n=8 each year; 16 total over two years), key partner organizations (n=6 each year; 12 total over two years), and collaborating organizations (n=6 each year; 12 total over two years). Note that our plan is for the participants in the Year 1 interviews to also participate in the Year 2 and Year 3 interviews (they are included in the estimated sample sizes above). These interviews will be conducted via video conferencing applications (e.g., Zoom) and will allow us to gain in-depth insight into how hub leader, key partner, and collaborating organizations are implementing the network model and the successes and challenges they face (EQ2); strategies they use to reach families (EQ3); ideas for sustaining the network model (EQ4); and the aspects of the network model that help their organizations promote academic readiness (EQ5). We will also administer an online survey to all hub leader, key partner, and collaborating organizations (n=~40 per year; ~80 total). The survey will provide higher-level insight into model implementation (EQ2) and will probe which factors organizations view as key to school readiness and which aspects of the model support their organization in promoting school readiness (EQ5). The survey will include performance measures from the IMLS Notice of Funding Opportunity (Goal 2: Build Capacity), the Year 1 survey, and items that the evaluation team develops as a result of findings from the Year 1 survey and interviews. To better understand families’ experiences with participating organizations (EQ6), the evaluation will include focus groups with families from a subset of hub leader, key partner, and collaborating organizations. The evaluation team will conduct four focus groups (two in Year 2, two in Year 3) and will lead a workshop during the national meeting to train staff from hub leader, key partner, and collaborating organizations on how to conduct their own focus groups. The project team anticipates there will be approximately five focus groups per state (~30 focus groups per year; ~60 in total). Each focus group will have approximately 8 participants (240 participants per year; 480 total).


A.3. Use of Information Technology

As displayed in Table 2, the interviews will be conducted via a videoconferencing app such as Zoom. If an interviewee is unable to access or use a videoconferencing app, we will conduct the interview via telephone. The interviews will be conducted at a time that is convenient for the interviewee. Focus groups will be conducted in person and scheduled at a time when families are already participating in an activity at a hub, partner, or collaborating organization. If we are unable to conduct focus groups in person due to COVID-19 or other unforeseen events, we will use a videoconferencing app such as Zoom to conduct individual interviews with families. Interviews and focus groups will be audio recorded and transcribed. Prior to the interviews and focus groups, the evaluation team will collect consent forms from participants, including consent to be audio recorded. The audio recordings will be transcribed by a professional transcription service, and the recordings and transcripts will be shared between the evaluation team and the transcriber using a secure site.


To reduce paperwork, the evaluation team will administer an online survey to staff from hub, partner, and collaborating organizations. The survey will be administered through a web-based survey tool such as Qualtrics. To ease burden and facilitate completion, the survey format will allow respondents to complete the survey using a computer or a smartphone. A link to the survey will be shared via email, and paper copies of the survey will be provided to staff unable to complete the online survey. EDC will work with Boston Children’s Museum to ensure the list of respondents’ email addresses is valid. Prior to beginning the survey, respondents will complete a consent form.


A.4. Efforts to Identify Duplication

This evaluation will generate findings specific to the model of Building a National Network of Museums and Libraries for School Readiness Project (SRP). This is a new data collection effort and the data to be collected are not otherwise available.


A.5. Methods Used to Minimize Burden on Small Businesses

Data collectors will inform all potential participants from all sites that their participation is strictly voluntary. Furthermore, data collectors will inform participants that they may withdraw from the evaluation at any time for any reason with no consequence.


A.6. Consequences of Less Frequent Data Collection

No other evaluation or data collection activities are investigating this effort. If we were not to collect the proposed data, or if we were to reduce the frequency of data collection, developers and decision-makers would miss valuable insight into what is necessary to support scale-up of these networks across the country. To meet the evaluation goals, we will need to collect data from hub leader, key partner, and collaborating organizations in each year of the project, while this information is still fresh for staff and families at each organization.


A.7. Special Circumstances

No special circumstances require the collection to be conducted in a manner inconsistent with the guidelines in 5 CFR 1320.6.


A.8. Consultations Outside the Agency

Public comments solicited through Federal Register

IMLS published a Notice of Proposed Information Collection Request for Comments in the Federal Register on November 4, 2019 (84 FR 59422–59423). Written comments were to be submitted to the Office of Grants Policy and Management, Institute of Museum and Library Services, on or before December 31, 2019. One comment was received and acknowledged.


Consultants outside the agency

IMLS has closely consulted with Boston Children’s Museum and the external evaluator, Education Development Center, in the development of the evaluation plan, data collection, and instruments.


A.9. Payments or Gifts to Respondents

There are no payments or gifts to respondents.


A.10. Assurance of Confidentiality

All data collection activities will be submitted for approval by the Institutional Review Board at Education Development Center. Per EDC’s IRB, data collection will strictly follow 45 CFR 46 (Protection of Human Subjects). Any personally identifiable data collected (e.g., respondent name) will be removed prior to analysis.


In order to ensure that individual responses cannot be traced back to respondents in any publications that result from this work, data will be presented in the aggregate. Furthermore, any cells with fewer than three cases will be suppressed. Qualitative data will be reported thematically, and identifiable data will be removed prior to coding.
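
To make the suppression rule above concrete, the hedged sketch below applies a minimum cell size of three to a small, made-up aggregate table before it would be reported; the variable names and counts are hypothetical and do not come from the evaluation.

```python
# Minimal sketch of the small-cell suppression rule described above:
# any reporting cell with fewer than three cases is withheld.
import pandas as pd

MIN_CELL_SIZE = 3  # threshold stated in this section

# Hypothetical aggregate counts of respondents by state and role.
cells = pd.DataFrame({
    "state": ["MA", "MA", "VA", "SC"],
    "role": ["hub leader", "key partner", "key partner", "collaborating"],
    "n": [6, 2, 5, 1],
})

# Replace counts below the threshold with a suppression marker before reporting.
cells["reported_n"] = cells["n"].where(cells["n"] >= MIN_CELL_SIZE, other="suppressed")
print(cells[["state", "role", "reported_n"]])
```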


A.11. Justification for Sensitive Questions

To understand the backgrounds of staff facilitating the SRP, the survey will include demographic and background questions, such as highest level of education completed. Survey respondents will have the option to skip these questions. Prior to taking the survey, participants will sign a consent form that outlines their rights as research participants, including the options to skip questions and to withdraw from the study at any time with no consequence. The survey will not include any other type of sensitive questions.


A.12. Estimates of Hour Burden to Respondents/Table



Table 3. Estimates of Hour Burden to Respondents


Year | Data collection | Participant group | Estimated # of respondents | Frequency of response | Estimated response time | Estimated total burden hours
Year 1 | Interview | Museum employees | 3 | 1 | 1 hour | 3 hours
Year 1 | Survey | Museum employees | 6 | 1 | 0.5 hour | 3 hours
Year 2 & Year 3 | Survey | Museum employees | 40* | 2 | 0.5 hour | 40 hours
Year 2 & Year 3 | Interview | Museum employees | 20 | 2 | 1 hour | 40 hours
Year 2 & Year 3 | Focus group | Adult from participating families | 480 | 1 | 1.5 hours | 720 hours
TOTAL | - | - | 520** | - | - | 806 hours

*The Year 2 and Year 3 survey will be administered to a representative from each organization participating in the network. We do not yet know the final number of participating organizations but estimate it will be approximately 40.


**We estimate 520 unique participants: 480 unique focus group participants (~240 in Year 2; ~240 in Year 3) and 40 unique interview and survey participants (the same person will complete both the survey and the interview each year).


A.13. Estimates of Cost Burden to Respondents

The estimated total burden is 806 hours (see Table 3 above). Below are the burden estimates for each data collection activity.


Table 4. Estimates of Cost Burden to Respondents: All Years

 

Participant group | Mean hourly wage | Estimated response time (in hrs) | Average cost per respondent | # of respondents | Estimated burden | Total estimated burden
Museum employees | $25.61 | 0.5 | $12.81 | 86 | $1,101.23 | $2,202.46
Museum employees | $25.61 | 1 | $25.61 | 43 | $1,101.23 | (included in $2,202.46 above)
Families | $24.98 | 1.5 | $37.47 | 480 | $17,985.60 | $17,985.60
TOTAL BURDEN | - | - | - | - | - | $20,188.06




Burden estimate (Year 1 interview)

Estimated # of participants: 3


Frequency: 1x


Mean Hourly Wage: $25.61 (Bureau of Labor Statistics1 mean hourly wage of a museum employee)


Estimated response time: Hours x # of Respondents x Frequency

[1 hour x 3 x 1 = 3 hours]


Estimated cost/respondent: Hours x Frequency x Mean Hourly Wage of Respondent

[1 hour x 1 x $25.61=$25.61]


Estimated total burden: Hours x # of Respondents x Frequency x Mean Hourly Wage

[1 hour x 3 x 1 x $25.61= $76.83]


Annualized costs to respondents for the one-time, one-hour interview are estimated at $76.83.


Burden estimate: Year 1 survey

Estimated # of participants: 6


Frequency: 1x


Mean Hourly Wage: $25.61 (Bureau of Labor Statistics mean hourly wage of a museum employee)


Estimated response time: Hours x # of Respondents x Frequency

[0.5 hours x 6 x 1 = 3 hours]


Estimated cost/respondent: Hours x Frequency x Mean Hourly Wage of Respondent [0.5 hours x 1 x $25.61 = $12.81]


Estimated total burden: Hours x # of Respondents x Frequency x Mean Hourly Wage

[0.5 hours x 6 x 1 x $25.61 = $76.83]


Annualized costs to respondents for the one-time, 30-minute Year 1 survey are estimated at $76.83.


Burden estimate: Year 2 & Year 3 interviews

Estimated # of participants: 20


Frequency: 2 interviews (one in Year 2, one in Year 3)


Mean Hourly Wage: $25.61 (Bureau of Labor Statistics mean hourly wage of a museum employee)


Estimated response time: Hours x # of Respondents x Frequency

[1 hour x 20 x 2 = 40 hours]


Estimated cost/respondent: Hours x Frequency x Mean Hourly Wage of Respondent

[1 hour x 2 x $25.61=$51.22]


Estimated total burden: Hours x # of Respondents x Frequency x Mean Hourly Wage

[1 hour x 20 x 2 x $25.61= $1,024.40]


Annualized costs to respondents for completing two, one-hour interviews (one in Year 2, one in Year 3) are estimated at $1,024.40.


Burden estimate: Year 2 & Year 3 surveys

Estimated # of participants: 40


Frequency: 2 (one in Year 2, one in Year 3)


Mean Hourly Wage: $25.61 (Bureau of Labor Statistics mean hourly wage of a museum employee)


Estimated response time: Hours x # of Respondents x Frequency

[0.5 hours x 40 x 2 = 40 hours]


Estimated cost/respondent: Hours x Frequency x Mean Hourly Wage of Respondent

[0.5 hours x 2 x $25.61=$25.61]


Estimated total burden: Hours x # of Respondents x Frequency x Mean Hourly Wage

[0.5 hours x 40 x 2 x $25.61 = $1,024.40]


Annualized costs to respondents for completing two, 30-minute surveys (one in Year 2, one in Year 3) are estimated at $1,024.40.


Burden estimate (Year 2 & Year 3 focus groups with families)

# of estimated participants: 480


Frequency: 1x (participants will be different each year)


Mean Hourly Wage: $24.98 (mean hourly wage for all occupations from Bureau of Labor Statistics2)


Estimated response time: Hours x # of Respondents x Frequency

[1.5 hours x 480 x 1 = 720 hours]


Estimated cost/respondent: Hours x Frequency x Mean Hourly Wage of Respondent

[1.5 hours x 1 x $24.98=$37.47]


Estimated total burden: Hours x # of Respondents x Frequency x Mean Hourly Wage

[1.5 hours x 480 x 1 x $24.98= $17,985.60]


Annualized costs to respondents for the one-time, 90-minute focus group are estimated at $17,985.60.


The annualized costs to respondents across all data collection activities and years are $20,188.06.
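
The burden figures above follow a single formula (hours x respondents x frequency x wage). As an illustrative cross-check only, the short Python sketch below recomputes the hour and cost totals from the values reported in Tables 3 and 4; it is not part of the data collection plan.

```python
# Illustrative cross-check of the burden arithmetic in Tables 3 and 4.
# All inputs (respondents, frequency, hours, wages) are taken from this section.

WAGE_MUSEUM = 25.61  # BLS mean hourly wage, museum employees
WAGE_ALL = 24.98     # BLS mean hourly wage, all occupations (families)

# (activity, respondents, frequency of response, hours per response, hourly wage)
activities = [
    ("Year 1 interview",          3, 1, 1.0, WAGE_MUSEUM),
    ("Year 1 survey",             6, 1, 0.5, WAGE_MUSEUM),
    ("Years 2-3 surveys",        40, 2, 0.5, WAGE_MUSEUM),
    ("Years 2-3 interviews",     20, 2, 1.0, WAGE_MUSEUM),
    ("Years 2-3 focus groups",  480, 1, 1.5, WAGE_ALL),
]

total_hours = total_cost = 0.0
for name, n, freq, hours, wage in activities:
    burden_hours = n * freq * hours      # e.g., 3 x 1 x 1.0 = 3 hours
    burden_cost = burden_hours * wage    # e.g., 3 hours x $25.61 = $76.83
    total_hours += burden_hours
    total_cost += burden_cost
    print(f"{name:<24} {burden_hours:>6.1f} hours  ${burden_cost:,.2f}")

print(f"{'TOTAL':<24} {total_hours:>6.1f} hours  ${total_cost:,.2f}")
# Expected totals: 806.0 hours and $20,188.06, matching Tables 3 and 4.
```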


A.14. Estimates of Cost to Federal Government

The annualized cost to IMLS is estimated at $35,700: $18,000 (300 hours at $60.00 per hour) for IMLS Museum Services staff and $17,700 (300 hours at $59.00 per hour) for IMLS ODIS staff.


A.15. Reason for Program Changes or Cost Adjustments

This is a new submission. There are no program changes or cost adjustments.


A.16. Project Schedule

Table 5 displays the anticipated project schedule. This schedule assumes the work has secured approval from IMLS and OMB.

Table 5. Project Schedule

Project activity | Timeframe
Seek OMB clearance | December 2019–September 2020
Secure IRB approval | February 2020
Develop Year 1 survey protocol | December 2019
Develop interview protocol | December 2019
Develop focus group protocol | December 2019
Interview staff from hub leader orgs* | October 2020
Administer Year 1 survey | October–November 2020
Clean and analyze data | November–December 2020
Submit Year 1 Evaluation Report | December 2020
Revise/update Year 1 survey for use in Years 2 and 3 | January 2021
Interview staff from hub, partner, and collaborating orgs | February–March 2021
Conduct focus groups | March–May 2021
Administer Year 2 survey | May–June 2021
Clean and analyze data | June–July 2021
Submit Year 2 Evaluation Report | September 2021
Interview staff from hub, partner, and collaborating orgs | February–March 2022
Conduct focus groups | March–May 2022
Administer Year 3 survey | June 2022
Clean and analyze data | June–July 2022
Submit Year 3 Final Evaluation Report | September 2022

*The sample size for these interviews is below 10 (n=3).


A.17. Request to Not Display Expiration Date

We are not requesting an exemption from the requirements to display the expiration date for OMB approval. All data collection materials and documentation will include the OMB approval number and expiration date.


A.18. Exceptions to the Certification

No exceptions to the certification statement apply to the Building a National Network of Museums and Libraries for School Readiness Project.

Section B. Description of Statistical Methodology

Overview

The Education Development Center will conduct an evaluation of the Building a National Network of Museums and Libraries for School Readiness Project (SRP) in order to document project progress and to identify factors and processes that are key to establishing and sustaining these networks in six states, as well as to inform the scale-up of networks to all 50 states. The following goals will guide the evaluation:

  • Goal 1. Identify institutional capacities and cross-organizational relationships that support model outreach, implementation, and sustainability in order to understand elements and processes that are central to forming, sustaining, and scaling-up the network model in all states.

  • Goal 2. Identify the ways in which the network model prepares and supports hub leaders, key partners, collaborating organizations, and families in promoting academic readiness among young children.

  • Goal 3. Document project activities and implementation of the network model to ensure that the project is on schedule and that activities are being implemented as intended by IMLS and BCM.

The following evaluation questions will guide this work:

  • EQ1: What resources, institutional structures, and cross-organizational relationships support the successful implementation of the existing network model? (Goal 1)

  • EQ2: How do hub leaders, key partners, and collaborating organizations implement the network model? In what ways do they adapt the model to fit their individual contexts and needs, and what successes and challenges do they experience? (Goal 1)

  • EQ3: How do hub leaders, key partners, and collaborating organizations reach families with informal learning opportunities, especially those not currently using museums and libraries? What are the barriers for accessing museums and libraries? (Goal 1)

  • EQ4: What strategies and activities do hub leaders, key partners, and collaborating organizations view as optimal to sustaining existing networks and exponentially growing and adapting the network model to all 50 states? What are some key challenges including internal and external factors that will make it difficult for the current model to sustain and grow? (Goal 1)

  • EQ5: What do hub leaders, key partners, and collaborating organizations view as key factors for school readiness, and what aspects of the network model do they see as supporting their institution’s capacities for supporting school readiness? (Goal 2)

  • EQ6: In what ways, if any, do families view organizations within state networks as supporting their young children's school readiness? (Goal 2)

  • EQ7: To what extent is the project on schedule and are activities being implemented as intended? (Goal 3)


To address these questions, EDC will use a mixed-methods design, pairing quantitative survey data with qualitative interview data.


B.1 Respondent Universe

The program model for the Building a National Network of Museums and Libraries for School Readiness Project (SRP) will comprise six state networks. Each network will include (1) hub leaders (the children’s museum or library that serves as the leader of the hub network); (2) key partners (organizations that hub leaders currently partner with); and (3) collaborating organizations (new partner organizations that result from this project). Part of the project work for the SRP is recruiting new organizations to participate in both existing and new state networks.


As shown in Table 6, each state network falls into one of three cohorts. Cohort 1 (Massachusetts) and Cohort 2 (Virginia and South Carolina) were established prior to this grant. Cohort 3 consists of the three state networks (Iowa, Mississippi, and New Mexico) that will be established through this grant. During Year 1 of this three-year cooperative agreement, Boston Children’s Museum (BCM) will recruit and onboard hub leader and partner organizations for the state networks in Cohort 3, as well as new organizations for the state networks in Cohort 1 and Cohort 2. Thus, in Year 1, the evaluation team will collect data from organizations currently participating in the Cohort 1 and Cohort 2 state networks. In Year 2 and Year 3, the evaluation team will collect data from organizations in state networks from all cohorts.


Table 6. Timeline of State Network Rollout and Evaluation Activities


Cohort   | State          | Year Established | Data collection timeline
Cohort 1 | Massachusetts  | 2016             | Year 1 – Year 3
Cohort 2 | Virginia       | 2018             | Year 1 – Year 3
Cohort 2 | South Carolina | 2018             | Year 1 – Year 3
Cohort 3 | Iowa           | 2020–2021        | Year 2 – Year 3
Cohort 3 | Mississippi    | 2020–2021        | Year 2 – Year 3
Cohort 3 | New Mexico     | 2020–2021        | Year 2 – Year 3


The sample sizes we report here are based on an estimate of 40 total organizations spread across the six state networks. We estimate that each state network will include at least one hub leader organization, one partner organization, and one collaborating organization. Finally, we anticipate a respondent universe of families that visit and/or participate in programs at the organizations; however, because we do not yet know all of the organizations that will be participating, it is impossible to estimate the total possible universe of families. Across this population, EDC will complete the data collection activities below. Table 7 summarizes each data collection activity.

  • Year 1

    • Document review of reports and documentation from the previous grants that supported the SRP network model

    • Interview staff lead at each of the three hub leader sites

    • Survey the staff lead at each of the three hub leader sites and the staff lead at each of the three key partner sites

  • Year 2

    • Interview subset of staff leads (n=8) from hub leader sites, a subset of staff leads (n=6) from key partner sites, and a subset of staff leads (n=6) from collaborating sites.

    • Survey staff leads at all hub leader sites, all key partner sites, and all collaborating sites. We do not yet know the final number of sites, but we estimate it will be about 40.

    • Conduct focus groups (n=2). During the Year 2 national meeting, EDC will provide focus group training to hub and partner organizations, which will then conduct their own focus groups. Across six states, we anticipate a total of 30 focus groups per year (60 total), with approximately 8 participants per focus group.

  • Year 3

    • Interview subset of staff leads (n=8) from hub leader sites, a subset of staff leads from key partner sites (n=6), and a subset of staff leads from collaborating sites (n=6)

    • Survey staff leads at all hub leader sites, all key partner sites, and all collaborating sites. We do not yet know the final number of sites, but we estimate it will be about 40.

    • Conduct focus groups (n=2 focus groups; 16 participants in total). During the Year 2 national meeting, EDC will provide focus group training to hub and partner organizations, which will then conduct their own focus groups. Across six states, we anticipate a total of 30 focus groups per year (n=60 focus groups total), with approximately 8 participants per focus group (n=480 participants in total).


Table 7. Summary of Data Collection Activities

Eval Question (Goal) | Method | Collection method | Participant Group(s)* | Date of Data Collection | Corresponding Question(s) from instruments**
EQ1 (Goal 1) | Document review | Review of reports and documentation from previous grants | n/a | Year 1 | n/a
EQ1 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q4–Q6; Q21
EQ1 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q4–Q6; Q20
EQ1 (Goal 1) | Survey I | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner organizations | Year 1 | Q10
EQ2 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q2–Q5; Q12; Q14; Q18
EQ2 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q2; Q4–Q5; Q12; Q14; Q17
EQ2 (Goal 1) | Survey I | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner organizations | Year 1 | Q9–Q13; Q18
EQ2 (Goal 1) | Survey II** | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner/collaborating organizations | Year 2 – Year 3 | Q9–Q13; Q18
EQ3 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q9; Q11; Q15
EQ3 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q9; Q11; Q15
EQ4 (Goal 1) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q4–Q5; Q19
EQ4 (Goal 1) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q4–Q5; Q19
EQ5 (Goal 2) | Interview I | Video conferencing app (e.g., Zoom) | Staff from hub leader organizations | Year 1 – Year 3 | Q6; Q8–Q9; Q11; Q13; Q16
EQ5 (Goal 2) | Interview II | Video conferencing app (e.g., Zoom) | Staff from key partner/collaborating organizations | Year 2 – Year 3 | Q6–Q7; Q11; Q13; Q16
EQ5 (Goal 2) | Survey I | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner organizations | Year 1 | Q14–Q17
EQ5 (Goal 2) | Survey II** | Web-based survey tool (e.g., Qualtrics) | Staff from hub leader and key partner/collaborating organizations | Year 2 – Year 3 | Q14–Q17
EQ6 (Goal 2) | Focus group*** | In-person | Adult from family participating in the SRP through hub and partner organizations | Year 2 – Year 3 | All
EQ7 (Goal 3) | n/a**** | n/a | n/a | n/a | n/a

*Hub refers to the statewide partnerships between and across museums, libraries, community organizations, and early care and education provider networks. Hub leaders are the children’s museums or libraries that serve as leaders of the hubs. Key partners are organizations that hub leaders are currently partnering with. Collaborating organizations are new key partner organizations that join the hub as a result of this project.


**Survey II will include items from Survey I, along with additional items we develop as a result of findings from Year 1 data collection. For the purposes of this table, the Survey II question numbers in the last column refer to the question numbers from Survey I.


***The EDC evaluation team will conduct four focus groups (two in Year 2; two in Year 3). During the Year 2 national meeting, EDC will provide a focus group training to hub and partner organizations, who will conduct their own focus groups. Across six states, we anticipate there will be a total of 30 focus groups per year (60 total), with approximately 8 participants per focus group.


****EDC will address EQ7 by documenting BCM’s progress in carrying out the project activities. Note that there are no data collection activities associated with EQ7; rather, EDC will address this evaluation question through updates from BCM via email correspondence.


B.2. Potential Respondent Sampling and Selection Methods

In order to identify institutional capacities and cross-organizational relationships that support successful model implementation and to identify the ways in which the network model prepares and supports organizations and families in promoting academic readiness, we will conduct annual surveys of the staff leads from ALL participating organizations (i.e., the entire universe of respondents). The universe of Year 1 respondents will include organizations in the state networks that are part of Cohort 1 and Cohort 2 (see Table 6). Organizations that are part of the Cohort 3 state networks will be onboarded by Boston Children’s Museum at the end of Year 1, and therefore will not be part of the Year 1 respondent universe. The universe of respondents in Year 2 and Year 3 will include organizations from all state networks across all cohorts. We will survey the same staff lead each year (assuming the staff lead has not left the organization or changed roles). Since we will be surveying all staff leads in all participating organizations, sampling is unnecessary.


To capture variation in model implementation and experiences across state networks, local contexts, and program levels, each year we will also conduct semi-structured interviews with staff leads from a subset of hub leader, key partner, and collaborating organizations across the state networks. The sampling frame for the Year 1 interviews (n=3) will consist of the staff leads at hub leader organizations in the existing state networks (see Cohort 1 and Cohort 2 in Table 6). In Year 2 and Year 3, the sampling frame for the interviews will consist of the staff leads from hub leader organizations (n=8 per year; 16 total), key partner organizations (n=6 per year; 12 total), and collaborating organizations (n=6 per year; 12 total) across all state networks and cohorts. In Year 2 and Year 3, focus groups will be conducted (n=30 focus groups per year; 60 focus groups total) with a subset of families (n=240 families per year; n=480 total). The sampling frame will include families who engage with hub leader, key partner, and collaborating organizations across all states and cohorts. Note that the EDC evaluation team will conduct four of the focus groups (two in Year 2; two in Year 3). During the Year 2 national meeting, EDC will provide focus group training to hub, partner, and collaborating organizations, which will conduct their own focus groups. Across six states, we anticipate a total of 30 focus groups per year (60 total), with approximately 8 participants per focus group. To select the subsets for the interviews and focus groups, we will employ purposive sampling, specifically maximum variation sampling.3 This sampling approach allows us to maximize the diversity of responses and learn about implementation across a heterogeneous group of settings. Boston Children’s Museum (BCM) worked with IMLS to identify a new cohort of states to implement the network model. BCM and IMLS sought states with diversity related to geography, community type (urban, rural, tribal), and populations served (dual-language households). EDC will select the sub-sample for interviews and focus groups based on these three characteristics, making sure the final sub-sample is representative of this diversity, as illustrated in the sketch below. We will make every effort to ensure that our interview sample is representative of these characteristics; however, to account for the possibility of selection bias, we will compare the characteristics of any organization that opted not to participate in interviews with the characteristics of the organizations that did. All analysis and reporting will document the extent and nature of these differences.
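
As an illustration of the maximum variation selection described above (and not the evaluation's actual procedure), the sketch below draws a sub-sample that covers as many combinations of community type and language population as the frame allows; the organization list and attribute values are hypothetical placeholders.

```python
# Illustrative maximum variation (purposive) sampling sketch.
# Organizations and their attributes below are hypothetical placeholders.
import random

frame = [
    {"org": "Org A", "state": "MA", "community": "urban",  "dual_language": True},
    {"org": "Org B", "state": "VA", "community": "rural",  "dual_language": False},
    {"org": "Org C", "state": "SC", "community": "rural",  "dual_language": True},
    {"org": "Org D", "state": "IA", "community": "rural",  "dual_language": False},
    {"org": "Org E", "state": "MS", "community": "urban",  "dual_language": False},
    {"org": "Org F", "state": "NM", "community": "tribal", "dual_language": True},
]

def max_variation_sample(frame, n, seed=2020):
    """Select n organizations, covering as many distinct strata
    (community type x dual-language status) as possible first."""
    rng = random.Random(seed)
    strata = {}
    for record in frame:
        strata.setdefault((record["community"], record["dual_language"]), []).append(record)
    sample = []
    # First pass: one organization from each stratum, until n is reached.
    for members in strata.values():
        if len(sample) < n:
            sample.append(rng.choice(members))
    # Second pass: fill any remaining slots from the rest of the frame.
    remaining = [r for r in frame if r not in sample]
    rng.shuffle(remaining)
    sample.extend(remaining[: n - len(sample)])
    return sample

for org in max_variation_sample(frame, n=4):
    print(org["org"], org["state"], org["community"], org["dual_language"])
```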


B.3. Response Rates and Non-Responses

We anticipate high response rates across all data collection activities (between 85% and 100%), given the close working relationship between Boston Children’s Museum and the participating organizations and the small number of respondents. To reach these response rates, we will follow recommendations from the literature.4,5 For example, to foster increased participation, Boston Children’s Museum and EDC will provide participating organizations with a detailed overview of evaluation activities at the yearly meetings, establish strong channels of communication, and provide adequate notification and time to complete each data collection activity. We recognize that missing data can undermine the findings of an evaluation. If the response rate to the survey falls below 80% (the response rate threshold recommended by OMB), we will conduct a missing data analysis to examine whether the data are missing at random or whether there are differences in the characteristics of organizations that responded and those that did not respond. If we find that there are differences and that the data are not missing at random, we will select appropriate procedures for handling missing data (e.g., weighting).
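
A minimal sketch of the nonresponse check described above, under the assumption of a simple organization-level file with a response flag and one stratifying characteristic; the data and column names are hypothetical and are not the evaluation's actual files or procedures.

```python
# Hedged sketch: compare respondents with non-respondents and derive simple
# inverse-response-rate weights within strata. Data below are hypothetical.
import pandas as pd

orgs = pd.DataFrame({
    "org_id": [1, 2, 3, 4, 5, 6],
    "community": ["urban", "rural", "tribal", "urban", "rural", "tribal"],
    "responded": [1, 1, 0, 1, 0, 1],
})

print(f"Overall response rate: {orgs['responded'].mean():.0%}")

# Compare characteristics of responding vs. non-responding organizations.
print(orgs.groupby("responded")["community"].value_counts())

# If data do not appear to be missing at random, weight respondents by the
# inverse of the response rate within their stratum (here: community type).
stratum_rate = orgs.groupby("community")["responded"].transform("mean")
orgs["nr_weight"] = (1 / stratum_rate).where(orgs["responded"] == 1)
print(orgs[["org_id", "community", "nr_weight"]])
```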


B.4. Tests of Procedures and Methods

In developing the semi-structured instruments for the interviews, focus groups, and Year 1 survey, we drew on and adapted items from existing instruments from current and previous work, creating new items as necessary. In developing the survey, we included the required performance measure items from IMLS.6 Furthermore, we drew from literature related to emergence,7 social innovation,8,9 social network analysis,10,11 and social-emotional learning.12,13 EDC will use findings that emerge from the Year 1 interviews and survey to refine and revise the survey for Years 2 and 3. For example, we will likely analyze open-ended items from the Year 1 survey to develop closed-ended items for the revised survey.


Data analysis

Analysis of quantitative data. We will use statistical software (such as Stata) to conduct descriptive analyses of closed-ended survey items. After data have been cleaned, researchers will calculate means and standard deviations for continuous measures and frequency tables for discrete measures.
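
The descriptive analysis described above reduces to standard summary statistics. The sketch below shows an equivalent computation in Python/pandas (the plan names Stata, so this is an illustrative alternative; the item names and values are made up).

```python
# Illustrative descriptive analysis of closed-ended survey items.
# Item names and responses are made up for demonstration.
import pandas as pd

survey = pd.DataFrame({
    "partner_meetings_attended": [2, 4, 3, 5, 1],  # continuous item
    "org_type": ["museum", "library", "museum", "community org", "library"],  # discrete item
})

# Means and standard deviations for continuous measures.
print(survey["partner_meetings_attended"].agg(["mean", "std"]))

# Frequency table (counts and percentages) for discrete measures.
counts = survey["org_type"].value_counts()
print(pd.DataFrame({"n": counts, "percent": (counts / len(survey) * 100).round(1)}))
```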

Analysis of qualitative data. Data from the document review, interviews, and focus groups will be transcribed and analyzed using qualitative data analysis software (such as Dedoose). We will conduct a content analysis, a systematic analytical technique that is particularly useful for analyzing text data.14 Given that research on the processes and principles for establishing and sustaining networks across libraries and museums is limited, we will follow the conventional approach to content analysis. Using this inductive approach, two researchers will engage in multiple reviews of the data. Through these initial reviews, we will identify overarching themes related to our research questions and generate a coding scheme that we will apply to the data during a second round of review. To ensure consistency across coders, we will double-code a subset of data, discussing and resolving differences as necessary.



B.5. Contact Information for Statistical or Design Consultants


EDC

Project Director: Wendy Martin, Research Scientist, [email protected]

Project Lead: Michelle Cerrone, Senior Research Associate, [email protected]


IMLS

Reagan Moore, Senior Program Officer, [email protected]

Marvin Carr, Evaluation Officer, [email protected]


3 Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of Mixed Methods Research, 1(1), 77–100.

4 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons.

5 Monroe, M. C., & Adams, D. C. (2012). Increasing response rates to web-based surveys. Journal of Extension, 50(6), 6–7.

6 Institute of Museum and Library Services. (2019). National Leadership Grants for Museums: FY 2019 Notice of Funding Opportunity (IMLS-CLR-D-0024). Retrieved from https://reginfo.gov/public/do/DownloadDocument?objectID=84159201

7 Wheatley, M., & Frieze, D. (2006). Lifecycle of Emergence: Using Emergence to Take Social Innovation to Scale. Retrieved from http://www.margaretwheatley.com/articles/emergence.html

8 Ayob, N., Teasdale, S., & Fagan, K. (2016). How social innovation ‘came to be’: Tracing the evolution of a contested concept. Journal of Social Policy, 45(4), 635–653.

9 Mulgan, G., Tucker, S., Ali, R., & Sanders, B. (2007). Social Innovation: What It Is, Why It Matters and How It Can Be Accelerated.

10 Carrington, P. J., Scott, J., & Wasserman, S. (Eds.). (2005). Models and Methods in Social Network Analysis (Vol. 28). Cambridge University Press.

11 Freeman, L. (2004). The Development of Social Network Analysis: A Study in the Sociology of Science.

12 Catalano, R. F., Berglund, M. L., Ryan, J. A., Lonczak, H. S., & Hawkins, J. D. (2004). Positive youth development in the United States: Research findings on evaluations of positive youth development programs. The Annals of the American Academy of Political and Social Science, 591(1), 98–124.

13 Collaborative for Academic, Social, and Emotional Learning [CASEL]. (2005). Safe and Sound: An Educational Leader’s Guide to Evidence-Based Social and Emotional Learning Programs – Illinois Edition. Chicago, IL.

14 Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288.


