
Process and Impact Evaluation of the Minnesota Reading Corps

OMB: 3045-0144

B. Statistical Methods


B.1. Respondent Universe and Sampling Methods

CNCS’ contractor, NORC at the University of Chicago, will collect information for the Process and Impact Evaluation of the Minnesota Reading Corps (MRC) on behalf of CNCS. The contractor is responsible for the design and administration of the feasibility and process site visits and the baseline web-based survey of AmeriCorps member applicants.


As presented in Part A, data collection is being conducted for the first two phases of the project (Phase I - Feasibility Study and Phase II - Process Assessment) and includes the following data collection instruments to be reviewed:

  • Feasibility Interview Protocols: Site visits to 50 sampled MRC programs to assess their suitability for participation in the later impact evaluation and to inform them about its implementation. The selected sites would be evaluated for participation in the random assignment study to take place in Phase III of the study;

  • Process Assessment Interview Protocols and Focus Group Guide: Site visits to up to 20 MRC programs to better understand variations in the MRC model and how to replicate the program in other locations; and

  • Baseline Member Applicant Survey: A baseline survey with new applicants to the 2012-13 MRC program. Approximately 2,000 subjects will be asked to participate in the survey, of which approximately 1,000 will be MRC members (treatment group) and 1,000 will be individuals who were selected as alternates (comparison group) by the program. Of this number, we anticipate obtaining responses from 80 percent of the sample (1,600 respondents).


Phase III - Impact Evaluation will be funded in FY2013 based on the findings from the current feasibility study.


In the 2011-12 school year, the MRC program served nearly 15,000 students across 450 schools with 800 AmeriCorps members (literacy tutors), making it one of the largest AmeriCorps programs in the country. For the 2012-13 school year, MRC plans to recruit 1,000 AmeriCorps members to serve as literacy tutors in close to 500 schools and other educational institutions (Head Start centers, etc.). The program annually receives over 3,000 applications to serve as AmeriCorps members.


B.2. Information Collection Procedures

The feasibility and process evaluation site visit protocols will be administered face-to-face during in-person site visits with the program's internal coach, the school principal, teachers, and AmeriCorps members. MRC will first contact and notify the schools selected for the site visits. The project team will then contact the internal coach, who serves as the program coordinator, by phone to schedule the site visit and the individual interviews. Interviews will take place at the school in a private and quiet area, such as a classroom, office, library, or small conference room. Verbal consent will be obtained prior to starting each interview. Interviews will last approximately 30 to 75 minutes. Survey participants and interviewees will not receive any form of monetary or tangible compensation for their participation in the study. A teacher focus group lasting 45 minutes may be conducted if several teachers at one site are available to participate in discussions; if a focus group is conducted, it will replace individual interviews with teachers. Feasibility site visits are anticipated to take place in summer 2012 and process site visits in fall 2012. Each feasibility site visit will take approximately one half day to complete, and each process assessment site visit will take approximately one full day on-site.

For the feasibility site visits, the contractor will interview one principal, one internal coach, up to three teachers, and up to four AmeriCorps members at approximately 50 sites. For the process evaluation site visits, the contractor will interview one principal, one coach, up to five teachers (interviewed individually or in a focus group), and up to four AmeriCorps members at approximately 20 sites, for a total of 220 individuals. We will conduct at most one focus group with teachers at each site, and only if several teachers are available at the same time for a discussion.


In addition, individuals selected to serve as an MRC literacy tutor or an alternate will be asked to complete a member survey. In July 2012, CNCS will send an email invitation containing a user ID and password and asking applicants to complete the survey. If they are interested in completing the survey, they can click on a survey link. The first screen will ask them to enter their user ID and password. Upon doing so, the next screen provides a brief overview of the study, asks for their voluntary participation, informs participants about confidentiality and privacy, provides a frequently asked questions link, provides a link to a detailed description of the project, and provides a toll-free telephone number and email address for any questions about the survey. By clicking "Next" at the bottom of the consent screen, the survey participant provides voluntary consent to participate in the survey. Actual time to complete the web-based baseline survey may vary; however, on average, we estimate it will take approximately 20 minutes. Survey participants who complete the baseline member survey will not receive any form of monetary or tangible compensation for their participation in the study. A follow-up survey will be conducted in a later phase of the study, which is yet to be funded.



B.2.1. Statistical Methodology for Stratification and Sample Selection


Selecting Feasibility Sites. Although the sampling frame must be inclusive of many sites, not only those best implementing the program model, we will restrict it so that the sites eligible for selection are appropriate for the evaluation while remaining representative of the program. We consulted several individuals familiar with the MRC sites, including MRC staff, in developing our sampling plan. With their assistance, we decided to limit the final sampling frame to fully implemented sites within a three-hour radius of the Minneapolis/St. Paul area and to stratify sites by urbanicity.


Because school environments vary greatly due to the urbanicity of their location, we will stratify sites by urban, suburban, and rural location. The only areas in the state of Minnesota that can be categorized as “urban” and “suburban” are in and immediately around the cities of Minneapolis and St. Paul, so the Metro region of the MRC program that encompasses these areas would be included in the study sampling frame. In addition, we were advised by MRC staff that the rural areas in the state of Minnesota, both immediately outside the Metro region and in the farther reaches of the state, are very similar. So we will limit the sampling frame for rural areas to those locations just outside the Twin Cities area (Central, Southwest, and Southeast regions). Limiting the sample to sites within a three-hour radius will better enable the project team to conduct initial screening visits and provide regular monitoring during the random assignment process. Finally, we will only consider schools that have fully implemented the MRC program. Therefore, new schools that will join the MRC program for the 2012-13 school year and schools that have not had an MRC program operating for at least two consecutive years will not be eligible for selection.


Once the sampling frame is finalized, we will select up to 50 sites for inclusion in the study and 50 alternate sites. During the site selection process, we will use Probability Proportional to Size (PPS) whereby larger schools (defined as the number of students served by MRC) have a higher probability of selection to ensure we meet our sample size requirements.
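The PPS step can be illustrated with a short sketch. The routine below is a generic systematic PPS selection, not the contractor's actual selection program; the school list and its measure of size (students served by MRC) are hypothetical placeholders.

```python
import random

def pps_systematic_sample(sites, n, seed=None):
    """Systematic probability-proportional-to-size (PPS) selection.

    `sites` is a list of (site_id, size) pairs, where size is the
    measure of size (here, students served by MRC at the school).
    Larger sites receive a proportionally higher selection probability.
    """
    rng = random.Random(seed)
    total = sum(size for _, size in sites)
    interval = total / n                      # sampling interval
    start = rng.uniform(0, interval)          # random start point
    targets = [start + k * interval for k in range(n)]

    chosen, cum, i = [], 0.0, -1
    for t in targets:
        while cum < t:                        # walk the cumulative size totals
            i += 1
            cum += sites[i][1]
        chosen.append(sites[i][0])
    return chosen

# Hypothetical frame: 120 schools serving 20-60 MRC students each
frame = [(f"school_{i}", 20 + (i * 7) % 41) for i in range(120)]
sample = pps_systematic_sample(frame, 50, seed=1)
```

Under this design, a list of alternate sites could be drawn the same way from the schools remaining in the frame after the initial selection.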


Selecting Process Sites. Using information provided by MRC on several demographic variables on individual schools and other education institutions participating in the program, we have selected a diverse group of sites based on the following information:


  • Regional membership. We have ensured that our selection of institutions is distributed across the four MRC regions.

  • Geographic diversity. Consistent with our effort to represent the different MRC regions, we have selected schools from a mix of rural, urban, and suburban districts.

  • Program type. We have identified institutions across each of the program types, pre-K and K-3.

  • Institution type. For the pre-K programs, we have included some Head Start programs as well as programs based in public schools.

  • Enrollment levels. We have selected schools that are both small and large in terms of the size of their student body.

  • Concentration of poverty. Although the majority of the institutions where MRC programs are currently implemented report high percentages of students eligible for free or reduced-price lunch (FRPL), we have attempted to include a mix of schools with low (<34%), medium (34-67%), and high (>67%) percentages of FRPL students.

  • Previous experience teaching MRC-endorsed content domains. We would like to explore whether there appears to be greater change in schools that did not previously emphasize the "Big Five" reading components on which the MRC program is built.

  • Level of experience. We have selected a mix of newer and more established programs, both to address implementation issues and to observe mature programs.

  • Proximity to other programs. Although each visit will focus primarily on a single school, where possible we will combine visits to multiple institutions in one central location so that more than one site visit can take place over several days.


This demographic information was supplemented by discussions with the regional coordinators for each of the four MRC regions where site visits will take place. During these discussions, coordinators were asked to identify schools in their region for potential site visits that meet one or more of the following criteria:

  • Recently implemented their programs successfully and are currently serving students;

  • Recently struggled to implement their programs (and can speak to the reasons why);

  • Face particular environmental challenges common to many schools with MRC programs (e.g., a high percentage of immigrant students, low parental involvement, a high FRPL population);

  • Serve as an example of a school where the components of the MRC curricula were not previously emphasized or implemented; or

  • Have incorporated different or innovative strategies in their learning environments to facilitate the MRC program.


Using both the list of schools generated by the MRC regional coordinators and the administrative data provided by MRC, the project team narrowed the list to 20 schools to visit in fall 2012.


Member Selection for Baseline Survey. All applicants selected as tutors and alternates for the 2012-2013 MRC program will be asked to participate in the baseline member survey. Approximately 2,000 subjects will be asked to participate in the baseline member survey, of which approximately 1,000 will be MRC members (treatment group) and 1,000 will be individuals who were selected as alternates (comparison group). A follow-up survey will be conducted in a future phase of the study. All respondents will be over the age of 18. No students will be part of any phase of the data collection. Respondents will be male and female and will consist of all races and ethnicities.

The table below provides an estimated timeline of data collection activities.


Activity                                                        Estimated Start Date    Estimated End Date
Launch AC member baseline web-survey                            July 2012               September 1, 2012
Send reminder emails                                            July 2012               July 2012
Phone prompting 1                                               July 2012               August 2012
Phone prompting 2                                               August 2012             August 2012
Conduct site visits and recruitment for the feasibility study   July 2012               August 2012
Conduct site visits for the process assessment                  September 2012          October 2012



B.2.2. Estimation Procedure


B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification


For the feasibility study, CNCS' contractor will conduct site visits with one principal, one coach, up to three teachers, and up to four AmeriCorps members at 50 sites, for a total of 450 individuals. For the process assessment, the contractor will conduct interviews with one principal, one coach, up to five teachers, and up to four AmeriCorps members at 20 sites, for a total of 220 individuals.

Sample Size Needs. An important consideration prior to developing our sampling plan for selecting the feasibility study sites was the number of students required to detect a difference between the treatment and control groups. The outcome used to assess the impact of the program will be student test scores. The pre-K and K-3rd programs that make up MRC are very different in terms of both their activities and the outcomes used to measure impacts; therefore, we will conduct two analyses, one for each program type. To determine the number of sites to sample for each program type, we conducted a power analysis using the student, member, and school population numbers reported by MRC for the 2009-2010 school year. Assuming a minimum detectable effect (MDE) of 15% within each major program type (pre-K and K-3rd), the study would require 600 treatment and 600 control students within the K-3rd program and 990 treatment students within the pre-K program. This analysis assumed an intraclass correlation coefficient (ICC) of 0.15, based on current literature on ICC values for sample designs with school-based clusters.
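The role the ICC plays in a calculation of this kind can be sketched with the standard design-effect formula. The function below is illustrative only: it assumes a two-arm comparison of a standardized mean difference at 80 percent power and a 5 percent two-sided significance level, and it does not reproduce the study's actual power analysis, which drew on MRC population counts and assumptions not detailed here.

```python
from statistics import NormalDist

def per_arm_n(mde, icc, cluster_size, alpha=0.05, power=0.80):
    """Approximate per-arm sample size to detect a standardized mean
    difference `mde`, inflated by the design effect for the clustering
    of students within schools."""
    z = NormalDist().inv_cdf
    n_srs = 2 * ((z(1 - alpha / 2) + z(power)) / mde) ** 2  # unclustered size
    deff = 1 + (cluster_size - 1) * icc                     # design effect
    return n_srs * deff

# With ICC = 0.15 and 30 students per school, clustering inflates the
# required sample by a factor of 1 + 29 * 0.15 = 5.35 relative to
# simple random sampling.
n_clustered = per_arm_n(0.15, 0.15, 30)
n_simple = per_arm_n(0.15, 0.0, 30)
```

The design effect grows with both the ICC and the number of students per school, which is why the number of clusters (schools), not just the number of students, drives power in school-based designs.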


Based on information provided by MRC that AmeriCorps members tutor between 25 and 40 students per year, and that a minimum of one member serves in each school, we have assumed that 30 treatment group students (and 30 control group students) on average will be available per site for the K-3rd program, requiring approximately 25 schools to be sampled. For the pre-K program, we assumed 45 students per school on average, requiring 40 schools to be sampled. Thus, we would require roughly 50 sites to reach our goal of 2,190 treatment and control students. We will select 15 schools that have both a pre-K and a K-3rd program, 10 schools with only a K-3rd program, and 25 schools with only a pre-K program. We will also sample 50 alternate schools in case any of the initial 50 schools cannot participate in the study for any reason.


For the web-based survey, a targeted sample size of 2,000 was selected to enable us to conduct our analysis. As described in B.1, eligible respondents are subjects who newly apply to be an MRC tutor and are either selected to serve in the program (treatment) or identified as an alternate tutor (comparison). The resulting sample, which will be representative of the applicant population, will be used to generate frequencies and means. We anticipate an 80 percent response rate for the web-based survey. We consider this a reasonable estimate because the survey takes a short amount of time to complete (20 minutes, based on a pretest conducted with current AmeriCorps members), respondents are well-immersed in the use of email and the Web, and there is a high level of enthusiasm among applicants for the MRC program. However, if a lower than expected response rate results, we will conduct non-response bias tests to determine whether any bias resulted from the lower response rate. If these tests provide evidence of bias, we will adjust our results through weight adjustments and/or response imputation.
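One common form of weight adjustment for nonresponse is a weighting-class adjustment, in which respondents' base weights are inflated so that each class (for example, treatment vs. alternate status) still represents its full share of the sample. The sketch below is a generic illustration with made-up classes and weights, not the study's actual adjustment procedure.

```python
from collections import defaultdict

def weighting_class_adjust(cases):
    """`cases`: list of (weighting_class, base_weight, responded) tuples.
    Returns (class, adjusted_weight) pairs for respondents only: each
    respondent's base weight is multiplied by the class ratio
    (sum of all base weights) / (sum of respondent base weights),
    so respondents also stand in for their class's nonrespondents."""
    total = defaultdict(float)
    resp = defaultdict(float)
    for cls, w, r in cases:
        total[cls] += w
        if r:
            resp[cls] += w
    return [(cls, w * total[cls] / resp[cls])
            for cls, w, r in cases if r]

# Hypothetical data: 4 treatment cases (3 respond), 4 alternates (2 respond)
cases = ([("treatment", 1.0, True)] * 3 + [("treatment", 1.0, False)]
         + [("alternate", 1.0, True)] * 2 + [("alternate", 1.0, False)] * 2)
adjusted = weighting_class_adjust(cases)
```

After adjustment, the respondent weights within each class sum to that class's original total, preserving each class's representation in weighted frequencies and means.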


B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.


B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles

There are no periodic data collection cycles associated with this study. The feasibility study, process assessment, and baseline survey are one-time data collections.


B.3. Methods to Maximize Response Rates


MRC will first contact and notify the principal and internal coach at the schools selected for the feasibility and process assessment site visits by email. This introductory email will convey the importance of the site visits in several areas: to better understand the MRC program and its ability to improve literacy among students, to allow for its replication in other areas of the country so that it may serve more students, and to determine whether the MRC program is furthering CNCS's mission of strengthening national and community service. The MRC staff have established working relationships with the schools in which they place tutors, and having the initial email come from a respected and familiar source will help establish the legitimacy and importance of the site visits. Shortly after MRC sends the initial email, the project team will contact the internal coach, who serves as the program coordinator, by phone to schedule the site visit and individual interviews. This initial contact will rely on a script that reiterates the importance of this research and why the school's participation is critical to the objectives of the study. The site visits and individual interviews will be scheduled so as to minimize any inconvenience to the schools and the interviewees. For example, discussions with teachers will be scheduled over a break so as not to interfere with teaching. Moreover, a teacher focus group lasting 45 minutes may be conducted, rather than individual interviews, if doing so would ease participation for the teachers at a site. Given the enthusiasm for and popularity of the MRC program on the part of the schools, we anticipate a high level of cooperation in scheduling the site visits.


As discussed in section B.2.3, we anticipate an 80 percent response rate for the web-based survey; if a lower than expected response rate results, we will conduct non-response bias tests and, if these provide evidence of bias, adjust our results through weight adjustments and/or response imputation.


To ensure a high response rate, CNCS will send an email invitation containing a user ID and password to individuals who have applied to be MRC literacy tutors, asking them to complete the survey in July 2012. This email will convey the importance of the survey and how the information will be used; the survey link, login screen, and consent screen will function as described in section B.2. After two weeks, and again after three weeks, a reminder email will be sent to those who have not yet responded, reminding them of the importance of their participation, explaining why the results are important and how they will be used, and including the information necessary for them to complete the survey. At four weeks, reminder telephone calls will be placed to prospective respondents who have not yet responded. During the final two weeks of data collection, prospective participants who have still not responded will be called again and prompted that the survey is about to close and that their participation is very important. Interviewers highly experienced in gaining cooperation will make these calls.


B.4. Tests of Procedures


The baseline survey instrument has been drafted and has undergone two reviews: (1) an internal review conducted by NORC's Institutional Review Board and (2) a pretest with five AmeriCorps members currently serving in the MRC. To accurately determine the burden placed on respondents and to further test the clarity of the survey questions, the five pretest participants, who come from diverse backgrounds, completed the survey so that the reliability of the instrument could be assessed. Slight revisions were made to the order and wording of a small number of questions based on comments received from both of these reviews.

Modifications to the length, content, and structure of the baseline survey have been made based on the results of the survey pretest interviews. Respondents provided generally positive feedback indicating that they could readily answer the questions and that the time to complete the survey was not onerous (approximately 20 minutes).


The feasibility protocols, process evaluation protocols, and web-survey instrument have been drafted, reviewed and approved by all project staff and have undergone an internal review conducted by NORC’s Institutional Review Board. After programming the instrument for the web, the survey instrument will undergo beta testing by all project staff prior to the launch of the questionnaire.



B.5. Statistical Consultants

The information for this study is being collected by NORC, a research organization, on behalf of CNCS. With CNCS oversight, the contractor is responsible for the study design, instrument development, data collection, analysis, and report preparation.


The instruments for this study and the plans for statistical analyses were developed by CNCS and its contractor. The staff team is composed of Dr. Carrie Markovitz and Dr. Marc Hernandez, Co-Principal Investigators, and a team of senior-level staff including Dr. Carol Hafford, Task Leader for the Process Evaluation, and Heidi Whitmore, Task Leader for the Web Survey. Contact information for these individuals is provided below.



Name                    Phone Number
Carrie Markovitz, PhD   301-634-9388
Marc Hernandez, PhD     773-256-6152
Carol Hafford, PhD      301-634-9491
Heidi Whitmore, MS      763-478-6725

