

PARENT INFORMATION AND SCHOOL
CHOICE EVALUATION

Request for OMB Clearance
OMB# 1850-0927

Supporting Statement Part A

National Center for Education Evaluation

U.S. Department of Education

Institute of Education Sciences

Washington, DC


January 2016

CONTENTS

Preface

A. JUSTIFICATION

A.1 Importance of information

a. Overview of study

b. Experiment and data collection

A.2 Purposes and uses of data

A.3 Improved information technology (reduction of burden)

A.4 Efforts to identify duplication

A.5 Minimizing burden for small entities

A.6 Consequences of not collecting data

A.7 Special circumstances

A.8 Federal Register announcement

a. Federal Register announcement

b. Consultations outside the agency

A.9 Payments or gifts to respondents

A.10 Assurance of confidentiality

A.11 Sensitive questions

A.12 Estimates of burden

A.13 Estimates of Cost Burden to Respondents

A.14 Annualized Cost to Federal Government

A.15 Program changes or adjustments

A.16 Plans for tabulation and publication

A.17 Display OMB expiration date

A.18 Exceptions to certification statement


TABLES

Table A.1. Factors to be tested, mapped to research questions

Table A.2. Types of School Choice Information to Display

Table A.3. Estimates of recruitment burden for parents of school-age children

Table A.4. Framework for testing factor combinations and contrasts in an illustrative 2 x 2 x 2 design


Preface

Sponsored by the Institute of Education Sciences (IES), U.S. Department of Education, the Parent Information and School Choice Evaluation (PISCE) is an important first step toward filling the wide gap in knowledge about how to present school choice information to parents. This research is needed to provide guidance to districts where school choice is expanding. PISCE seeks to identify the format, amount, and organization of information that is most comprehensible and usable to parents. The study will target low-income parents of school-age children and will evaluate perceptions of different presentations of school information. The results of the study will be used to create a reader-friendly guide for school districts.1

IES has contracted with Mathematica Policy Research to conduct the needed research. Most of the experiment will be conducted with members of a standing panel who already complete surveys on a regular basis for a variety of purposes. This approach provides a low-cost, quick-turnaround method for obtaining findings on the understandability of school choice information, and it does not require respondents to be making actual school choices for their children. To enhance what can be learned from the standing panel, the research team also intends to recruit a sample of low-income parents of school-age children from locations where a public school choice marketplace with unified enrollment has been active for at least two years. Parents who have experienced public school choice, or are at least exposed to open enrollment in their district, may experience the experiment differently than the standing panel members, for whom considering schools other than one’s default neighborhood school may be unfamiliar. This augmented sample of, presumably, less survey-savvy low-income parents will provide a sensitivity check on findings based on the standing panel alone. IES is submitting this clearance package, which requests approval for the study’s recruitment and survey activities.



A. JUSTIFICATION

A.1 Importance of information

School choice has increased dramatically in recent years through the expansion of charter schools and open enrollment in traditional districts. School choice can only be effective policy if parents are able to navigate school choice systems, follow application procedures, and process large amounts of complex information to make informed choices about the best schools for their children. The rise of new technology and data systems has led to an explosion of such information, but the school choice marketplace has yet to determine the best ways to curate and present this information to parents. In particular, there is scant research-based guidance that school districts and related entities can rely on when making school choice information available to parents, and each new district that enters this policy arena has had to muddle through the process using trial and error. Further, ED’s school choice programs supporting magnets, charter schools, and vouchers have identified parent information as a barrier to greater participation in these programs and potentially to improved student outcomes.

The proposed experiment and data collection will be an important first step toward addressing this need for evidence-based guidance. The comprehensibility of school choice information is a particular concern for low-income parents and those who might struggle to navigate the technical systems used to display this information. The experiment will present parents with school information in various formats, amounts, and organizational layouts and allow us to evaluate how well parents understand and use the information.

a. Overview of study

Within the field of education, researchers have mostly focused on discovering what parents value in schools, rather than on how best to organize and present information on school attributes. The most common approach has been to conduct focus groups or surveys, asking participants about the factors that drive their school choices (Fossey 1994; Armor and Peiser 1998; Collins and Snell 2000; Klute 2012; Kelly and Scafidi 2013; Great Schools 2013; Jochim et al. 2014). However, this approach has been criticized for eliciting socially desirable responses and failing to capture the role of race, class, and other demographics (Stein et al. 2010). Other researchers have used Internet search terms (Schneider and Buckley 2002) or conducted statistical analyses of actual rankings submitted by families in real-world school choice settings (Glazerman 1997; Hastings et al. 2008; and Harris and Larsen 2015, using data from Minneapolis; Charlotte-Mecklenburg, NC; and New Orleans, respectively). These studies, which provide estimates of the relative importance of various school attributes to parents, are useful because they highlight the dimensions along which choice information might be most important, such as academic achievement of the school, demographics of the student body, distance and convenience of the school location, and school safety and climate.

Very little research has been done on how best to organize and present information about school attributes. Among the few studies available, Jacobsen et al. (2014) studied the effect of information formats on parents’ perceptions of schools. Other researchers have estimated the impact that information presentation has on school choice attitudes and behavior. For example, Valant (2014) used quick-turnaround online experiments and a regression discontinuity design to examine how parents update their opinions of local public schools after receiving various types of information. Valant and Loeb (2014) also conducted field experiments with families choosing schools in Milwaukee; Washington, DC; and Philadelphia to test how information affects school choosers’ attitudes, beliefs, and behaviors. Although directly relevant to the proposed study, Valant and Loeb’s experiments did not explore as many different types of information presentations as proposed here, and they focused on parents’ ratings of schools rather than on whether they actually understood the information and found it easy to use.

The study for which clearance is being sought will collect and analyze data to address three specific questions:

  1. What is the optimal way to present school choice information?

  2. What is the right amount of school choice information to present?

  3. How is school choice information best organized?

To answer these questions, Mathematica will conduct an online experiment with low-income parents of school-age children. The sample will mostly be drawn from a market research standing panel that is commercially available. To address concerns about the ability to generalize from such a panel, Mathematica will augment the panel with members of the public recruited from targeted locations within cities such as Washington, DC, and New Orleans, where low-income families are immersed in an open-enrollment school marketplace with many public schools to choose from, including traditional district schools and charter schools.

Both the panel sample and the augmented sample members will complete a web-based baseline survey that collects basic demographic information and an endline survey that measures how well participants understood, used, and perceived the ease of use of school choice information. After completing the baseline survey, each respondent will be randomly assigned to one of several different ways of presenting school choice information (treatment arms). Random assignment allows us to assume that differences in endline survey responses, on average, are attributable to differences in the ways information was presented to respondents.

b. Experiment and data collection

To generate most of the experimental data, Mathematica will work with a market research firm—the leading candidate is Survey Sampling International (SSI)—which will identify 3,300 parents of school-age children in the U.S. who are low income, defined as having an annual household income below $40,000. To check the sensitivity of the standing panel findings with a less internet-savvy group, Mathematica will recruit 150 volunteers who are low-income parents of school-age children in low-income areas where school choice is particularly salient (the augmented sample). We plan to focus our recruiting efforts on Washington, DC, and New Orleans, but we may consider other cities if necessary.

All eligible study participants will be asked to complete a 10-minute baseline survey. They will then be randomly assigned to one of 72 different variations on a school information website and asked to complete a 20- to 30-minute endline survey, for a total of 30 to 40 minutes. The baseline survey will measure demographic characteristics, such as income and whether the respondents have school-age children, as well as digital literacy. The endline survey will measure how respondents use the information, how well they understand it, and how easy or difficult it was to use.

The treatments being studied consist of different ways to present information about a set of fictitious schools. The information will be presented in one of 72 different ways for each respondent, with the 72 variations constructed by crossing five factors, each with two or three levels (3 x 2 x 2 x 3 x 2 = 72); see the sketch following Table A.1. Table A.1 lists each of the five factors and maps them to the study’s research questions. Table A.2 lists the information domains, the specific attributes that will be presented, and the presentation variations by format and source of information. For example, to address the question of presentation, we will test the format (factor A.1) in which safety information is presented and the source of that information. Safety information will be presented as a numerical rating; a graphical presentation, such as a bar chart; or an icon, such as a letter grade of “A” to indicate that the school has high safety ratings. Sources of the information will vary in that one source will be a more objective indicator, the percentage of students with no school suspensions, and another will be a more subjective indicator, results from a parent survey on the school’s safety. The specific factors, domains, and attributes that make up the experiment (treatment arms) have been selected based on a review of research on information presentation across several fields, including health and marketing.

Table A.1. Factors to be tested, mapped to research questions

| Research Question | Factor | Level 1 | Level 2 | Level 3 |
|---|---|---|---|---|
| A. Presentation | 1. Format | Numbers | Graphs | Icons |
| A. Presentation | 2. Source of information | Objective indicator | Both objective and subjective indicators | n.a. |
| B. Amount | 1. Reference point | No reference point | District reference point | n.a. |
| B. Amount | 2. Attributes and disclosure | Low information (one attribute per domain) | High information (multiple attributes per domain, all shown at once) | Progressive disclosure (high information, with drawer closed by default) |
| C. Organization | 1. Sort | Default = distance | Default = academic rating | n.a. |

n.a. = not applicable
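
To make the factorial structure concrete, the following Python sketch enumerates the 72 treatment arms and randomly assigns a respondent to one. It is purely illustrative; the factor and level names are shorthand for the entries in Table A.1, not the study’s actual implementation.

```python
# Illustrative sketch: enumerate the 72 treatment arms by crossing the five
# factors in Table A.1, then randomly assign a respondent to one arm.
# Names are shorthand for Table A.1, not the study's actual implementation.
import itertools
import random

FACTORS = {
    "format": ["numbers", "graphs", "icons"],                 # A.1 (3 levels)
    "source": ["objective", "objective+subjective"],          # A.2 (2 levels)
    "reference_point": ["none", "district"],                  # B.1 (2 levels)
    "amount": ["low", "high_all_shown", "high_progressive"],  # B.2 (3 levels)
    "sort": ["distance", "academic_rating"],                  # C.1 (2 levels)
}

# Cross all factor levels: 3 x 2 x 2 x 3 x 2 = 72 treatment arms.
arms = [dict(zip(FACTORS, combo)) for combo in itertools.product(*FACTORS.values())]
assert len(arms) == 72

# Assign a respondent to one arm, uniformly at random.
respondent_arm = random.choice(arms)
print(respondent_arm)
```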



Table A.2. Types of School Choice Information to Display

| Domain | Attributes | Variations in format (graphics or icons to be shown in addition to numbers) | Variations in source (subjective indicator to be shown in addition to main attributes) |
|---|---|---|---|
| Distance | Straight-line distance from home to school; *walking time; *driving time | No variation | No variation |
| Academics | % proficient on 2016 achievement test; *% proficient on the 2016 math test; *% proficient on the 2016 reading test; *average 2015-2016 academic growth, 0-100 index; *average 2015-2016 academic growth in math; *average 2015-2016 academic growth in reading | Graph: horizontal bar graphic for each indicator, no additional text. Icon: letter-grade icon with color coding (green indicating better grades), no additional text | Percentage of parents agreeing with statement that they are highly satisfied with the school’s academic quality |
| Safety | % of students with no suspensions; *attendance rate; *yes/no: school won a blue-ribbon award for anti-bullying efforts | Graph: horizontal bar graphic for suspensions and attendance indicators, no additional text. Icon: letter-grade icon with color coding for suspensions and attendance indicators, no additional text | Percentage of parents agreeing with statement that the school is a safe place for their child |
| Resources | Number of laptops or tablets per 100 students; *year of most recent school renovation; *yes/no on 4 items: school has dedicated art studio, library, computer lab, music program | No variation | No variation |

* Attribute will only appear in the open drawer.

A.2 Purposes and uses of data

The purpose of this study is to produce an evidence-based guide for school districts on how to present school choice information to parents of school-age children. The guide is a component of ED’s efforts to promote ongoing improvement in its school choice programs, such as the Magnet Schools Assistance Program (MSAP), the Public Charter School Program (PCSP), and the Opportunity Scholarship Program (OSP).

A.3 Improved information technology (reduction of burden)

Where feasible, available technology will be used to reduce burden and improve efficiency and accuracy. For example, all data collection will be done via a web-based instrument, and the in-person sample members will be provided laptops on which to complete their survey. This will reduce burden and make it easy for respondents to complete the survey in a timely manner. Because we are targeting a low-digital-literacy population for the in-person recruitment, we will have trained staff available on site to administer the web-based survey, handle any technical complications, and answer questions respondents may have about the technical functions of the instrument, increasing ease of completion.

A.4 Efforts to identify duplication

The PISCE study team will strive to minimize data duplication. The data collection will focus on gaps in the existing literature regarding best practices for presenting school choice information to parents. As reviewed in section A.1.a, this study is designed to fill a gap in the existing literature by focusing on how best to organize and present information about school attributes. Further, to our knowledge, this information is not currently available from existing standing panel data or from other publicly available data sources. The instruments are designed to ensure that we are not repeating questions that are already asked through the course of panel participation and will include only items that are necessary to the study.

A.5 Minimizing burden for small entities

No small entities will be involved in this study.

A.6 Consequences of not collecting data

The data collection plan described in this submission is necessary to study parents’ perception and use of school choice information and to help fill the gap in our understanding of the school choice process. If the study were not conducted at all, the Department of Education would not be able to provide any evidence-based recommendations to school districts and other entities overseeing school choice systems about how to present school information to parents.

A.7 Special circumstances

There are no special circumstances associated with this data collection. The proposed data collection is a one-time effort that does not require respondents to retain any records and which complies with applicable regulations.

A.8 Federal Register announcement

a. Federal Register announcement

A 60-day notice to solicit public comments was published in the Federal Register on January 19, 2016. To date, no comments have been received.

b. Consultations outside the agency

Experts consulted include members of the study team listed below.

Steven Glazerman, Mathematica Policy Research

(202) 484-4834

Jon Valant, Education Research Alliance for New Orleans

(504) 274-3617

Lisbeth Goble, Mathematica Policy Research

(312) 994-1016

Ira Nichols-Barrer, Mathematica Policy Research

(617) 674-8364

Jesse Chandler, Mathematica Policy Research

(734) 205-3088

Mariel Finucane, Mathematica Policy Research

(617) 715-6935


Additional experts outside of the study team (listed below) will provide guidance during the 60-day comment period, through the convening of a technical working group (TWG). This TWG will include practitioners with expertise on the study population itself as well as experts on information and behavioral science, school choice, and the experimental methods that are part of the study design.

Practitioners

Sujata Bhat, NewSchools Venture Fund

(202) 609-8150

Aesha Rasheed, New Orleans Parent Guide

(504) 684-4512

School Choice Content Experts

Paul Teske, University of Colorado

(303) 315-2805

Rebecca Jacobsen, Michigan State

(517) 353-1993

Experts in Methodology and Cognitive/Behavioral Science and Decision-Making

Robert J. Meyer, Wharton Business School

(215) 898-1826

Peter Bergman, Columbia Teachers College

(212) 678-3932

Ellen Peters, Ohio State University

(614) 688-3477

Judith Hibbard, University of Oregon

(541) 346-3364



A.9 Payments or gifts to respondents

Because the data collection is focused on a specific type of respondent—low-income, low-digital-literacy parents of students—that is generally considered a hard-to-reach population (Bonevski et al. 2014), the validation sample that will be recruited in person is at particular risk for low participation. To address this concern, the study team will provide in-person participants with a monetary incentive of $30 for completing the 40-minute survey.2 The research team will provide payment in the form of a gift card after the respondent has completed the survey. The use of incentives to encourage participation has been shown to bolster participation rates and produce cost savings (Kulka 1995; Link et al. 2001; Martinez-Ebers 1997; Singer et al. 1999). Further, the proposed amount is within the incentive guidelines for a high-burden parent survey outlined in the March 22, 2005, memo prepared for OMB, “Guidelines for Incentives for NCEE Evaluation Studies.”

A.10 Assurance of confidentiality

All information from this study will be kept confidential as required by the Education Sciences Reform Act of 2002 (Title I, Part E, Section 183). To ensure privacy, identification numbers will be used on the surveys rather than names. All of the information that is collected will be stored separately from school records in a secure location. Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies the respondent, the school, or the district to anyone outside the study team, except as required by law.

The survey will be completely anonymous. The study team will not collect or store any personally identifiable information (PII). The provider of the survey sample will have access to the identifying information for respondents to ensure that they meet study criteria, but will not have access to any survey data. Despite the lack of PII, the data will be stored on secure servers with access rights carefully restricted to users who need it based on the research plan and in accordance with the data security terms in the contract governing the work.

A.11 Sensitive questions

The survey will ask respondents about their income and whether they have school-age children. These items are necessary to screen out respondents who are not in the population of interest. The team will not ask participants for any other information of a sensitive nature. Survey questions about school choice behavior will center on fictional schools and hypothetical choices.

A.12 Estimates of burden

Table A.3 shows our estimated burden hours for the augmented sample (in-person data collection) and the online panel data collection. The sample will consist of low-income parents of school-age children. The in-person participants will complete a 40-minute web-based data collection with trained Mathematica staff. The burden table is based on these expectations and on our assumption that we will recruit 3,300 participants from the standing panel and 225 participants for the validation sample, of whom 150 will be eligible to complete the full experiment. We estimate the total burden to be 2,306 hours ($39,414).

Table A.3. Estimates of recruitment burden for parents of school-age children

| Recruitment | Number contacted | Annual number of responses | Average burden time (minutes) | Total burden over entire study (hours) | Annual burden (hours) | Respondents’ average hourly wage | Estimated respondent labor cost | Annual respondent labor cost |
|---|---|---|---|---|---|---|---|---|
| In-person sample (screen only) | 75 | 37.5 | 5 | 6.3 | 3.15 | $17.09^a | $107 | $53.50 |
| In-person sample (full experiment) | 150 | 75 | 40 | 100 | 50 | $17.09^a | $1,709 | $854.50 |
| Online panel sample | 3,300 | 1,650 | 40 | 2,200 | 1,100 | $17.09^a | $37,598 | $18,799 |
| Total | 3,525 | 1,762.5 | 39 | 2,306 | 1,153 | $17.09^a | $39,414 | $19,707 |

^a $17.09 an hour is the median wage in May 2014 across all occupations in the United States (http://www.bls.gov/oes/current/oes_nat.htm).

The annual responses are 1,175; the annual burden hours are 769.
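
As a cross-check, the burden totals in Table A.3 follow from simple arithmetic: hours = respondents × minutes ÷ 60, and labor cost = hours × the $17.09 median wage. A minimal, purely illustrative Python sketch reproducing those totals:

```python
# Illustrative cross-check of the burden totals in Table A.3:
# hours = respondents x minutes / 60; labor cost = hours x median wage.
WAGE = 17.09  # median hourly wage, May 2014 (BLS OES)

rows = [
    # (group, respondents contacted, minutes per response)
    ("In-person sample (screen only)", 75, 5),
    ("In-person sample (full experiment)", 150, 40),
    ("Online panel sample", 3300, 40),
]

total_hours = 0.0
for group, n, minutes in rows:
    hours = n * minutes / 60
    total_hours += hours
    print(f"{group}: {hours:,.2f} hours, ${hours * WAGE:,.0f}")

# Matches the table's totals: 2,306 hours and $39,414.
print(f"Total: {total_hours:,.0f} hours (${total_hours * WAGE:,.0f})")
```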


A.13 Estimates of Cost Burden to Respondents

There are no start-up costs for respondents.

A.14 Annualized Cost to Federal Government

The cost of the study is $1,194,337; the estimated average annual cost of the study over two years is $597,169.

A.15 Program changes or adjustments

This is a new information collection.

A.16 Plans for tabulation and publication

We will answer the three research questions mentioned previously by estimating the effects of the factors related to presentation mode, amount of information, and organization of the information, respectively. Our analytical approach allows us to efficiently test a large number of factor combinations; as described in section A.1.b, the experiment crosses five factors, each with two or three levels, for a total of 72 factor combinations (3 x 2 x 2 x 3 x 2 = 72). Study participants will each be randomly assigned to one of these 72 configurations. In Table A.4, we simplify the design to a three-factor experiment, with each factor having two levels (2 x 2 x 2), and include examples of factor levels we might examine. For example, a parent may be randomly assigned to receive school information presented with one indicator per domain in numerical form, with schools sorted by distance (A1B1C1). Another parent may also receive school information presented with one indicator per domain in numerical form, but in this case, the schools will be sorted by academic rating (A1B1C2).



Table A.4. Framework for testing factor combinations and contrasts in an illustrative 2 x 2 x 2 design

| Factor and factor level | A = 1 (number), B = 1 (one indicator/domain) | A = 1 (number), B = 2 (two indicators/domain) | A = 2 (graph or icon), B = 1 (one indicator/domain) | A = 2 (graph or icon), B = 2 (two indicators/domain) |
|---|---|---|---|---|
| C (sort) = 1: distance | A1B1C1 | A1B2C1 | A2B1C1 | A2B2C1 |
| C (sort) = 2: academic rating | A1B1C2 | A1B2C2 | A2B1C2 | A2B2C2 |

Note: Contrasts include main effects (the impacts of Factors A, B, and C) and interaction effects (AB, BC, and AC).

We will answer the first research question by estimating the main-effect contrast of Factor A set to 1 (the first two columns of cells in the table) versus Factor A set to 2 (the last two columns), averaging over the values of Factors B and C. In other words, will a graphic be more usable and understandable than a numeric rating, all things being equal in terms of the number of indicators reported per domain and the organization of the information? We will answer the second and third questions by estimating the main-effect contrasts for Factors B and C in the same way. We will also consider interaction effects in a separate analysis, which will enable us to answer more nuanced questions such as, “Does the optimal way to present school choice information differ when there is a larger versus a smaller amount of information provided in each domain?” This question corresponds to the interaction of Factors A and B, estimated by comparing the difference in outcomes between columns 1 and 3 to the difference in outcomes between columns 2 and 4. This approach to deriving the contrasts will be extended to a more general and ambitious set of detailed questions by specifying more factor levels within A, B, and C.
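
To illustrate these contrasts, the following Python sketch computes the Factor A main effect and the A × B interaction from the eight cells of Table A.4. The cell means are fabricated purely for illustration; they are not study data.

```python
# Illustrative contrast calculations for the 2 x 2 x 2 design in Table A.4.
# The cell means below are fabricated for illustration; not study data.
import numpy as np

# Mean outcome (e.g., a comprehension score) for each cell, indexed [a, b, c]
# with levels coded 0/1: A (number vs. graph/icon), B (one vs. two
# indicators per domain), C (sorted by distance vs. academic rating).
y = np.empty((2, 2, 2))
y[0, 0, 0], y[0, 1, 0], y[1, 0, 0], y[1, 1, 0] = 0.62, 0.58, 0.70, 0.66
y[0, 0, 1], y[0, 1, 1], y[1, 0, 1], y[1, 1, 1] = 0.64, 0.59, 0.71, 0.68

# Main effect of A: mean outcome at A = 2 minus mean outcome at A = 1,
# averaging over the levels of B and C.
main_effect_A = y[1].mean() - y[0].mean()

# A x B interaction: the effect of A at B = 2 minus the effect of A at B = 1
# (columns 2 vs. 4 compared with columns 1 vs. 3 in Table A.4), averaged over C.
effect_A_at_B1 = (y[1, 0] - y[0, 0]).mean()
effect_A_at_B2 = (y[1, 1] - y[0, 1]).mean()
interaction_AB = effect_A_at_B2 - effect_A_at_B1

print(f"Main effect of A: {main_effect_A:+.3f}")
print(f"A x B interaction: {interaction_AB:+.3f}")
```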

A fourth factor will represent categories of school choice information. We identified four categories of information that are meaningful to parents when choosing schools: (1) academic achievement of the school; (2) school discipline and safety; (3) distance and convenience of the school location; and (4) school resources. We will analyze the interactions between these types of school information categories and the presentation, organization, and amount of information. This will allow us to understand the optimal ways to present each type of school information that is meaningful to parents.

The augmented sample will be used to examine the extent to which results can be replicated when we include sample members who are more experienced with school choice but less experienced with online surveys. If findings from the experiment with the augmented sample differ from those generated by the standing panel alone, we will include cautionary language in the guide and will use that information to inform future research, including recommendations regarding the value of a field trial.

The study team will analyze data from the experiment and summarize key findings in tables and bullet points, which will be delivered in the form of a district guide in March 2017.

A.17 Display OMB expiration date

The OMB expiration date will be displayed on all recruitment and data collection materials.

A.18 Exceptions to certification statement

No exceptions to the certification statement are requested or required.



References

Armor, David J., and Brett M. Peiser. “Interdistrict Choice in Massachusetts.” In Learning from School Choice, edited by Paul E. Peterson and Bryan C. Hassel. Washington, DC: Brookings Institution Press, 1998.

Bonevski, B., M. Randell, C. Paul, K. Chapman, L. Twyman, J. Bryant, I. Brozek, and C. Hughes. “Reaching the Hard-to-Reach: A Systematic Review of Strategies for Improving Health and Medical Research with Socially Disadvantaged Groups.” BMC Medical Research Methodology, vol. 14, no. 42, 2014.

Collins, Alan, and Martin Snell. “Parental Preferences and Choice of School.” Applied Economics, vol. 32, no. 7, 2000, pp. 803–813.

Fossey, Richard. “Open Enrollment in Massachusetts: Why Families Choose.” Educational Evaluation and Policy Analysis, vol. 16, no. 3, September 1994, pp. 320–334.

Glazerman, Steven. “Determinants and Consequences of Parental School Choice.” Unpublished working paper, University of Chicago, Harris School of Public Policy, December 21, 1997.

Great Schools. “How Do Parents Research and Choose Schools? Parent Attitudes and Behaviors When Choosing Schools, 2013.” Available at http://www.greatschools.org/catalog/pdf/How_Do_Parents_Research_and_Choose_Schools.pdf.

Harris, Douglas N., and Matthew Larsen. “What Schools Do Families Want and Why?” Technical Report. New Orleans, LA: Education Research Alliance for New Orleans, January 15, 2015.

Jacobsen, Rebecca, Jeffrey Snyder, and Andrew Saultz. “Information or Shaping Public Opinion? The Influence of School Accountability Data Format on Public Perceptions of School Quality.” American Journal of Education, vol. 121, no. 1, November 2014, pp. 1–27.

Jochim, Ashley, Michael DeArmond, Betheny Gross, and Robin Lake. “How Parents Experience Public School Choice.” Making School Choice Work Series. Seattle, WA: Center on Reinventing Public Education, December 2014.

Kelly, James, and Benjamin Scafidi. “More Than Scores: An Analysis of Why and How Parents Choose Private Schools.” Indianapolis, IN: The Friedman Foundation for Educational Choice, November 2013.

Klute, Mary. “Understanding How Parents Choose Schools: An Analysis of Denver’s School Choice Form Questions.” Denver, CO: Buechner Institute for Governance, December 12, 2012.

Kulka, R. “The Use of Incentives to Survey ‘Hard-to-Reach’ Respondents: A Brief Review of Empirical Research and Current Research Practice.” Seminar on New Directions in Statistical Methodology, Federal Committee on Statistical Methodology, Statistical Policy Working Paper 23, 1995, pp. 256–287.

Link, M.W., A.G. Malizio, and T.R. Curtin. “Use of Targeted Monetary Incentives to Reduce Nonresponse in Longitudinal Surveys.” Paper presented at the annual conference of the American Association for Public Opinion Research, Montreal, Quebec, Canada, 2001.

Martinez-Ebers, V. “Using Monetary Incentives with Hard-to-Reach Populations in Panel Surveys.” International Journal of Public Opinion Research, vol. 9, 1997, pp. 77–86.

Schneider, Mark, and Jack Buckley. “What Do Parents Want from Schools? Evidence from the Internet.” Educational Evaluation and Policy Analysis, vol. 24, 2002, pp. 133–144.

Singer, E., J. Van Hoewyk, N. Gebler, T. Raghunathan, and K. McGonagle. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics, vol. 15, no. 2, 1999, pp. 217–230.

Stein, Marc, Ellen Goldring, and Xiu Cravens. “Choosing Indianapolis Charter Schools: Espoused Versus Revealed Academic Preferences.” Prepared for School Choice and School Improvement: Research in State, District and Community Contexts, Vanderbilt University, August 2010.

Valant, Jon. “Better Data, Better Decisions: Informing School Choosers to Improve Education Markets.” Washington, DC: American Enterprise Institute, November 2014.

Valant, Jon, and Susanna Loeb. “Information, Choice, and Decision-Making: Field Experiments with Adult and Student School Choosers.” Working paper. 2014.



1 ED is also interested in the effects of providing better information on actual school choices and student outcomes. The Department will consider a field trial of strategies to disseminate school choice information after the current study has narrowed down the way information is best presented.

2 Panel participants are compensated as part of their ongoing participation in the panel.

