OMB: 1505-0252

Request for OMB Approval for Generic Clearance for Collection of Information

Evaluation of the My Classroom Economy Program



B. STATISTICAL METHODS

  1. Universe and Respondent Selection

The research team will recruit one or two medium to large school districts to participate in the study. The team is targeting a study sample size of approximately 2,000-3,000 students. No sampling or other statistical methods will be used to select the survey respondents or extrapolate or generalize the study results to a wider population. All eligible students in the interventions will be invited to participate in the study, and all study participants will be asked to respond to baseline and follow-up surveys.

The research team will administer a baseline survey to the parents of participating students, which will also be used to obtain consent for each student's participation in the study. Based on past experience, the team expects a 40-50% response rate for the parent survey/consent form.

Study participation will be completely voluntary, and students who choose to participate will be surveyed in person on-site at school. While all students will take the in-class surveys, only those whose parents give consent will have their data analyzed for this research. The research team is targeting a student population of 2,000-3,000 students so that it will have sufficient data after accounting for the consent rate. The team will analyze results (along with available administrative data) for consenting students only to provide information on the impact of MCE. The findings will therefore be specific to the interventions and populations tested.

  2. Procedures for Collecting Information

Research Design and Implementation

This study compares outcomes for students who do and do not participate in MCE. Because MCE is an interactive classroom activity, it must be implemented at the classroom rather than the student level; although the classrooms participating in the MCE evaluation will be randomly selected, access to MCE is randomized at the classroom level, not at the student level. The evaluation will therefore not be a pure randomized controlled trial, and the research team will use statistical strategies to control for baseline differences between the treatment groups as well as for any correlation in the outcomes caused by the program being delivered at the classroom level.

Measurement

The student surveys will be conducted in class. They are financial assessments evaluating financial knowledge, attitudes, and behaviors relevant to the participants’ age level. A first assessment will be administered at the beginning of the school term. A second assessment will be administered towards the end of the school term. Thus, the primary outcomes of interest will be changes in responses associated with the intervention.

In addition, the research team will collect administrative data from the schools about participating students’ grades, attendance, and behavioral issues. Finally, the team will collect records of how students who participate in MCE spend their “money.” We will be most interested in how these outcomes change throughout the program.

The research team will also survey parents of students in the study at baseline to gather data on family demographics and financial attitudes and behaviors.

The team will also collect information from the teachers in the participating classrooms. The teachers will complete a brief survey about their experience delivering financial education and impressions of MCE.

Data Analysis

The research team will use a difference-in-differences approach to compare changes in student outcomes between the treatment and control groups, assessing whether MCE is associated with gains in financial knowledge and changes in attitudes and behaviors. Subgroup analysis may allow the research team to examine whether impacts differ across subpopulations, provided the sample size is sufficient.
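In its simplest form, this comparison reduces to the standard two-group, two-period contrast shown below; this is a schematic summary of the approach rather than the exact estimating equation the team will use:

\[
\widehat{\delta}_{DiD} = \left(\bar{Y}_{T,\text{post}} - \bar{Y}_{T,\text{pre}}\right) - \left(\bar{Y}_{C,\text{post}} - \bar{Y}_{C,\text{pre}}\right)
\]

where T and C denote the treatment (MCE) and control groups, and pre and post refer to the baseline and follow-up assessments.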

The research team will likely explore multiple estimation strategies, including ordinary least squares (using classroom fixed effects and clustering standard errors at the school level) and potentially hierarchical linear modeling. The team will also control for baseline school and child characteristics.
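As an illustration of this estimation strategy, a minimal sketch using Python and statsmodels follows; the data layout and column names (score, post, treated, classroom_id, school_id) are hypothetical placeholders rather than the study's actual variable names.

```python
# Illustrative sketch only: assumes a long-format dataset with one row per
# student per survey wave; column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def estimate_did(df: pd.DataFrame):
    """OLS difference-in-differences with classroom fixed effects and
    standard errors clustered at the school level."""
    # Classroom fixed effects absorb the treatment-group main effect, so the
    # coefficient of interest is the post x treated interaction (the DiD term).
    # Baseline child characteristics could be added as further covariates.
    model = smf.ols("score ~ post + post:treated + C(classroom_id)", data=df)
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

# Hypothetical usage:
# df = pd.read_csv("mce_student_waves.csv")
# print(estimate_did(df).summary())
```

A hierarchical linear model could be specified analogously with smf.mixedlm, using random intercepts for classrooms in place of fixed effects.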

The evaluation includes five instruments; instruments 1-3 are variations on a common set of questions:

  1. Student Baseline Pre-Test (see Exhibit B)

  2. Student Follow-Up Post-Test: Treatment

  3. Student Follow-Up Post-Test: Control

  4. Teacher Survey (see Exhibit C)

  5. Parent Survey (see Exhibit D)


We expect the following response rates to the surveys:


Instrument                                   Predicted Response Rate    Predicted Number of Responses
Student Baseline Pre-Test (Exhibit B)        95%                        2,470
Student Post-Test, Treatment (Exhibit B)     95%                        2,470
Student Post-Test, Control (Exhibit B)       95%                        2,470
Teacher Survey (Exhibit C)                   95%                        114
Parent Survey (Exhibit D)                    80%                        2,080


Non-response bias is minimized because the student and teacher surveys are mandated by the District. Some students will change classrooms, leave their school between the baseline and follow-up surveys, or be absent on the days of survey administration; student non-response is therefore estimated at 5%. Non-response to the teacher surveys is also expected to be minimal. Parent non-response is estimated at 20% based on past studies.

With the target sample size of 2,000-3,000 students and the expected consent rate, the research team assumes that the treatment and control groups will each have between 500 and 1,500 students. As the table below shows, the study is powered to detect economically meaningful treatment effects.

Minimum Detectable Effect (sigma units), two-sided test, size = 0.05

Students per Group    Power = 0.8    Power = 0.9
500                   0.177          0.205
750                   0.144          0.167
1,000                 0.125          0.145
1,250                 0.112          0.130
1,500                 0.102          0.118
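The minimum detectable effects above are consistent with the standard two-sample formula MDE = (z_{1-alpha/2} + z_{power}) * sqrt(2/n), expressed in standard-deviation (sigma) units and assuming equal group sizes, a two-sided test, and no adjustment for classroom-level clustering or baseline covariates. A short illustrative sketch in Python that reproduces the table under those assumptions:

```python
# Reproduces the MDE table under simplifying assumptions: equal group sizes,
# two-sided test at alpha = 0.05, outcomes in sigma units, and no adjustment
# for classroom-level clustering or baseline covariates.
from math import sqrt
from scipy.stats import norm

ALPHA = 0.05

def mde(n_per_group: int, power: float) -> float:
    """Minimum detectable effect (sigma units) for a two-sample comparison
    of means with n_per_group students in each arm."""
    z_alpha = norm.ppf(1 - ALPHA / 2)  # critical value for the two-sided test
    z_power = norm.ppf(power)          # quantile for the desired power
    return (z_alpha + z_power) * sqrt(2 / n_per_group)

for n in (500, 750, 1000, 1250, 1500):
    print(n, round(mde(n, 0.8), 3), round(mde(n, 0.9), 3))
# e.g., 500 -> 0.177 at power 0.8 and 0.205 at power 0.9, matching the table
```

If outcomes are correlated within classrooms, adjusting for that clustering would inflate these values by the usual design effect, so the figures above are best read under the stated simplifications.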



  3. Methods to Maximize Response

In order to achieve our desired sample size, the research team will need to recruit participants effectively from the pool of potential participants and maximize participants’ likelihood of responding to surveys. Parents of potential study subjects will receive consent forms in the mail, and forms will also be sent home from school with children. In past studies of youth financial education, the research team has found that parental consent rates increase when parents are asked to complete a brief survey rather than only sign and return a consent form. We suspect this is because parents feel more engaged with and informed about the study when they are asked to participate as well. Thus, the team will use the parent survey both to boost study consent and because the parent data are valuable for research purposes. The team will also provide a small monetary incentive (approximately $5) for completing the survey.

In addition, the research team will improve response rates by informing parents of the study and building interest in advance of the mailing; parents will learn about the study through school newsletter articles and letters sent home with students. The team will also use local media to build awareness of and excitement for the research project. All students will complete the pre/post assessments (though the data will be used only when parents have consented). Teachers will be provided with detailed instructions for administering the assessments to improve response rates.

  4. Testing of Procedures

The survey questions were mainly taken or adapted from existing literature and financial literacy assessments. Sources include the Financial Fitness for Life curriculum, the PISA Financial Literacy Assessment, the Jump$tart Coalition, Junior Achievement Economics for Success, the Generalizable Scale of Propensity to Plan, the Financial Self-Efficacy Scale, Money Matters on Campus, the Federal Reserve financial literacy test, Financial Football, and CUNY Student Financial Dollars and $ense.



  5. Contacts for Statistical Aspects and Data Collection

Dr. J Michael Collins
Faculty Director, Center for Financial Security at the University of Wisconsin-Madison
608-262-0396


