Amendment to PEELS Part A (12/21/06) OMB Supporting Statement


Pre-Elementary Education Longitudinal Study (PEELS) (SC)

OMB: 1850-0809


Supporting Statement



INTRODUCTION



The Pre-Elementary Education Longitudinal Study (PEELS), funded by the U.S. Department of Education, was designed to examine the preschool and early elementary school experiences of children with disabilities and their performance over time. The study will follow a nationally representative sample of children through 2009. Five broad descriptive research questions guide the data collection, analysis, and reporting for this multiyear study.


  • What are the characteristics of children receiving preschool special education?

  • What preschool programs and services do they receive?

  • What are their transitions like—between early intervention and preschool and between preschool and elementary school?

  • How do these children function and perform in preschool, kindergarten, and early elementary school?

  • Which child, service, and program characteristics are associated with children's performance over time on assessments of academic and adaptive skills?


Clearance for Waves 3 and 4 of PEELS was provided in early 2006. This submission requests several changes to the design of the Wave 4 data collection. The first change is the elimination of two Wave 4 data collection instruments, the Early Childhood Program Director Questionnaire and the Elementary School Principal Questionnaire, based on concerns about response rates and data quality. During the PEELS Wave 1 data collection in spring 2004, the contractor for the National Center for Special Education Research (NCSER), Westat, sent questionnaires to the principals and program directors of the schools that the PEELS children attended. Despite our best efforts, only 40 percent of the principals and program directors returned completed questionnaires. In spring 2005, Westat contacted these organizations again and increased the overall response rate for the Wave 1 organizations to 76 percent. Also in spring 2005, we sent questionnaires to the principals and program directors of schools into which PEELS children had moved. The preliminary response rate for the Wave 2 principals and program directors was 65 percent. We reopened the field period for those questionnaires again in spring 2006 to bolster response rates.


During the two data collection periods, Westat identified concerns about the quality of responses to the principal and program director questionnaires. During the editing process, coders frequently found inconsistencies between items. For example, respondents would report the number of students in each racial/ethnic category, but the sum of this breakdown would differ from the total number of students reported in a previous item. A similar inconsistency was found when respondents reported the disability breakdown for their students. Respondents sometimes reported the total number of children receiving special education services rather than the total number of students enrolled in their school. In other instances, respondents did not include prekindergarten enrollment in their totals. In both Waves 1 and 2, nearly 20 percent of the questionnaires required additional data retrieval to clarify inconsistencies or to obtain missing data.
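For illustration only, the following minimal sketch shows the kind of enrollment consistency edit described above; the field names are hypothetical and do not correspond to the actual questionnaire items.

```python
# Hedged sketch of an enrollment consistency edit; field names are hypothetical.
RACE_ETHNICITY_FIELDS = ("white", "black", "hispanic", "asian", "american_indian", "other")

def enrollment_consistent(response: dict) -> bool:
    """True if the race/ethnicity breakdown sums to the reported total enrollment."""
    breakdown = sum(response.get(field, 0) for field in RACE_ETHNICITY_FIELDS)
    return breakdown == response.get("total_enrollment", 0)

# Example: this response would be flagged for data retrieval (breakdown sums to 410, not 425).
example = {"total_enrollment": 425, "white": 200, "black": 120, "hispanic": 60,
           "asian": 20, "american_indian": 5, "other": 5}
needs_retrieval = not enrollment_consistent(example)
```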


NCSER is concerned about the response rates for these questionnaires. Although Westat improved the response rate for the Wave 1 organizations by reopening the field period in spring 2005, the rate remained below 80 percent and lower than the rates for the other PEELS data collection instruments. In conducting follow-up activities with nonrespondents, we found that our mailings to principals and program directors were frequently discarded unopened. The level of effort required to obtain a completed principal questionnaire is nearly twice that required to obtain a completed teacher questionnaire.


One possible way to address this ongoing problem is to use data from Quality Education Data (QED) and discontinue the principal/program director questionnaires after Wave 3. While the QED files include only a limited subset of the items on the principal/program director questionnaires, the data may be more reliable than what is obtained through the questionnaires. The 2004 QED School file contains over 100,000 public and private schools. In addition, QED sells an Early Childhood data file, which contains over 100,000 child care centers, including preschools, Head Start centers, and Montessori schools. Using address and location information, Westat could link the schools and programs in PEELS to the records in the QED files.
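As an illustration of the kind of linkage described above, the sketch below matches records on a normalized address key. The file layouts and field names (address, city, state, zip) are hypothetical assumptions, and an actual linkage would likely also require fuzzier matching on school or program name.

```python
# Hedged sketch of address-based record linkage; file layouts and field names are hypothetical.
import csv
import re

def normalize(text: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace for matching."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", text.lower())).strip()

def address_key(row: dict) -> tuple:
    """Build a simple match key from street address, city, state, and 5-digit ZIP."""
    return (normalize(row["address"]), normalize(row["city"]),
            row["state"].strip().upper(), row["zip"].strip()[:5])

def link_schools(peels_csv: str, qed_csv: str) -> list:
    """Return (PEELS school, QED record) pairs whose address keys match exactly."""
    with open(qed_csv, newline="") as f:
        qed_index = {address_key(row): row for row in csv.DictReader(f)}
    matches = []
    with open(peels_csv, newline="") as f:
        for school in csv.DictReader(f):
            record = qed_index.get(address_key(school))
            if record is not None:
                matches.append((school, record))
    return matches
```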


NCSER believes the QED data to be more reliable because the QED files are updated regularly, and some items, including total school enrollment, are verified by telephone to ensure accuracy. While responses to the subjective items in the questionnaires are not available in the QED, this data source does provide data for many of the demographic items. Tables A-1 and A-2 in Attachment A present a comparison between the PEELS questionnaire items and the data available in the QED.


This submission also includes revisions to the Wave 4 data collection, including a shortened version of the parent interview, replacements for several of the direct assessment instruments, and the elimination of 59 items from the teacher questionnaires that were previously part of the Vineland Motor Skills scale and the Social Skills Rating System. From the outset, the PEELS design called for a shortened parent interview in Wave 4. Attachment B includes a list of the items we propose retaining; they are the items considered most critical to the data analysis or that drive skip patterns and are therefore essential to the flow of the interview. The shortened instrument is reflected in the burden estimates for Wave 4.


In Wave 4, the oldest of the PEELS children will be eight years old. Some of the subtests used in Waves 1, 2, and 3 will no longer be appropriate for them, and new tests will be required to capture their emerging academic skills. Attachment C includes a list of the proposed assessments. All are widely used, off-the-shelf assessments, selected in consultation with the PEELS Advisory Panel. The changes in assessments are also reflected in the burden estimates for Wave 4.


We propose eliminating two sets of items from the Early Childhood Teacher, Kindergarten Teacher, and Elementary School Teacher questionnaires. In Waves 1-3, the teacher questionnaires included the Motor Skills scale from the Vineland Adaptive Behavior Scales. The Motor Skills scale was normed for children up to age six. The developers found that motor development reaches its peak by age six and that raw scores show little variability beyond this age. Because about two-thirds of the children in the study will be older than six in Wave 4, we recommend removing this scale.


The Social Skills scale of the Social Skills Rating System (SSRS) uses two types of ratings. Respondents first rate how often the child exhibits a particular behavior and then indicate how important the behavior is for success in the classroom. When creating the scores for this measure for Wave 3, we learned that the importance ratings are not used in calculating the actual scale score. The importance ratings serve only to help identify behaviors in need of change and to guide the development of intervention plans. Because that is not within the PEELS analysis plan, we propose eliminating the importance ratings from the Social Skills scale. The shortened questionnaires are reflected in the burden estimates for Wave 4.


The original PEELS design specified that Cohort B would be excluded from the Wave 4 data collection because cross-sectional data for this age cohort were not considered critical at age 7. NCSER would like to include all participating children in Wave 4 to enhance analysis capabilities. Modeling growth in achievement is one of the key dimensions of the project, and those models become increasingly reliable with more data points.


Attachment D includes a data collection schedule for PEELS, which OMB requested in early 2006 following the previous clearance process.


A. JUSTIFICATION


1. Authority


Responsibility for PEELS was transferred from the Office of Special Education Programs (OSEP) to NCSER within the Institute of Education Sciences (IES). NCSER supports a comprehensive research program to promote the highest quality and rigor in research on special education and related services and to address the full range of issues facing children with disabilities, parents of children with disabilities, school personnel, and others. NCSER was authorized on December 3, 2004, with the President's signing of the reauthorization of the Individuals with Disabilities Education Act (IDEA). NCSER/IES continues to support PEELS as part of the comprehensive research program designed to meet the mandated reporting requirements of IDEA as described in the 2004 Supporting Statement. The authorization for this collection of data is found in Public Law 105-17, Section 674 (20 U.S.C. 1474), which permits the conduct of studies to measure and evaluate the impact of IDEA and the effectiveness of States’ efforts to provide a free, appropriate public education to all children with disabilities.


2. Use of Information


NCSER/IES has a variety of ongoing needs for information about the implementation and outcomes of special education for children ages 3-5 with disabilities across the nation. These include:

  • Information requested by Congress in regular reauthorizations of IDEA.


  • Data that serve as indicators of Government Performance and Results Act (GPRA) objectives. In particular, PEELS addresses IDEA, Part B, Indicator 1.2, which states, “The percentage of preschool children receiving special education and related services who have readiness skills when they reach kindergarten will increase.” The primary data source on children’s early literacy and early numeric skills is the PEELS direct assessment. Direct assessments of (pre-)reading and (early) mathematics skills are conducted in each wave of PEELS. The final preschool assessment can be used to gauge academic readiness for kindergarten a few months later.

  • Information to respond to the many questions about children with disabilities, their families, and the programs that serve them that are raised by policymakers, advocates, practitioners, parents, and researchers.


Data collected from PEELS will supply much-needed information for all of these purposes. Specifically, the following groups of individuals are likely to benefit from collection of the information:

  • Federal policymakers, who make decisions about special education and related services for young children with disabilities and the critical interfaces among these programs and other federally funded services and systems that affect children with disabilities and their families.

  • State early childhood special education policymakers (e.g., 619 coordinators) who make decisions regarding state implementation of special education, state funding levels for special education, and other issues about programs and services for children with disabilities.

  • LEA and school administrators, who are responsible for implementing programs and services at the local level.

  • Practitioners and administrators in early childhood special education and related service systems, who will better understand the participation of young children with disabilities in those systems and the contribution of services to achievement.

  • Parents of children with disabilities who can use information on special education and related services and achievement to increase their own capacity to advocate effectively for their children.

  • Higher education faculty who conduct preservice training of special education teachers and related service personnel, who can use information on service and program characteristics that facilitate positive outcomes for children to improve the capabilities of future educators and practitioners.

  • Researchers who have access to this rich data source to conduct a variety of secondary analyses, develop comparable local or statewide follow-up studies, review the technical methods, or use the data for publication.


3. Use of Technology


PEELS uses computer-assisted telephone interviewing (CATI) to conduct all parent interviews, which results in a more efficient interview for the respondent and more efficient data cleaning. PEELS makes maximum use of e-mail when corresponding with district contacts and with respondents willing to provide an e-mail address. E-mail communications are less burdensome and less intrusive for respondents and are a cost-effective means of communicating. District Site Coordinators are encouraged to use the secure PEELS fax number when updating their Child Status Report (CSR). Fax is also used for data retrieval when asking teachers and principals or program directors to review and/or update questionable responses to mail surveys. PEELS maintains a public website where respondents and other interested persons can learn about PEELS and contact PEELS staff directly. PEELS also maintains a restricted-use website for Site Coordinators that contains training materials and frequently asked questions.


4. Avoidance of Duplication


For 8-year-olds, PEELS Wave 4 will use four of the same Woodcock-Johnson III subtests (Letter-Word, Applied Problems, Calculation, and Passage Comprehension) that are administered to same-age students in the Special Education Elementary Longitudinal Study (SEELS). The purpose of this duplication is to provide an analytic continuum of performance and achievement measures across the age groups of the two studies. In the previous submission, three partial scales included in the SEELS student interview were proposed for inclusion in PEELS. These three scales are no longer being considered for PEELS because of concerns about the length of the assessment and because in SEELS they were not found to be highly correlated with other key measures of interest.


5. Small Business Impact


No small businesses will be involved as respondents in this data collection. Therefore, there will be no small business impacts.



6. Consequences of Not Collecting Information


In the absence of the data collection for PEELS, Federal policy regarding early childhood special education and related services will continue to be made without a solid base of information from which to address such fundamental questions as the nature of the children served, the instructional programs and services they receive, and the achievements of children receiving early childhood special education and related services. Questions raised in the context of recent Federal reauthorizations for which data were unavailable will continue to be raised, again without satisfactory responses.


The timing and frequency of data collection for PEELS are rooted in the nature of both the PEELS population and the early childhood programs they attend. Developmentally, the children in PEELS change at a more rapid rate than the children in SEELS or the National Longitudinal Transition Study-2 (NLTS2). Because preschool is not governed by traditional American compulsory education, the early childhood programs that the children in PEELS attend differ dramatically from each other and from the more standard formal school system that characterizes elementary and secondary schools. As a result, it is necessary to conduct data collections immediately and repeatedly to capture these vast differences and rapid changes. The schedule of data collection is considered the minimum number and maximum spacing of waves needed to obtain accurate information on children’s outcomes. Data collection on school-based programs is timed to permit appropriate analytic linkages to children’s elementary school outcomes.


7. Special Circumstances


The proposed data collection is consistent with 5 CFR 1320.6 and therefore involves no special circumstances.


8. Consultation Outside the Agency


Study design work was conducted by SRI International, and Westat was contracted to conduct data collection, data cleaning, analysis, and reporting. The design phase involved extensive input from experts in the content areas and methods used by PEELS. First, a stakeholder advisory panel that included representatives from many of the audiences keenly interested in PEELS helped develop the conceptual framework and define and prioritize the research questions. The panel met once in person for a day-long meeting and engaged in a priority-setting exercise for the research questions through an exchange of materials and a voting process.


Second, a technical work group (TWG) of researchers experienced in child-based and longitudinal studies, early childhood education, and special education advised on multiple aspects of the design, including the child sampling approach and data collection procedures. TWG members also received all the data collection instruments. The TWG held six phone conferences, and members reviewed all materials produced in the design process. Each member supplied PEELS staff with written comments and notes and provided verbal feedback through telephone conferences.


In addition, four nationally recognized experts in early childhood special education served as consultants to the PEELS design process. They provided advice in all areas, with particular attention to the data collection instruments and administration timeframe. Four additional consultants provided advice on the selection of assessment instruments.


In the data collection/analysis phase, new technical review and stakeholder panels were secured to provide expertise on study design, data analysis, and data interpretation (see Exhibit 1). Several members served on both the design panels and the data collection/analysis panels. On August 23, 2005, the current technical review panel was convened via teleconference to consider issues related to the test of Early Math Skills. Members of the review panel also participated in conference calls held on June 26 and 27, 2006, to provide input on the selection of Wave 4 assessments.



Exhibit 1. Technical Consultant/Report Review Panel


Sally M. Atkins-Burnett

University of Toledo

3140 Snyder Memorial

Toledo, OH 43606-3390

Work: (419) 530-4307

Email: [email protected]


Peg Burchinal

Frank Porter Graham Institute

University of North Carolina

Campus Box 8185
521 South Greensboro Street
Carrboro, NC 27510

Work: (919) 966-5059

Email: [email protected]


Stephen Elliott

Vanderbilt University

Peabody #328

230 Appleton Place

Nashville, TN 37203

Work: (615) 322-2538

Email: [email protected]


Sam Odom

Indiana University School of Education

201 North Rose Avenue
Bloomington, IN 47405-1006
Work: (812) 856-8174

Email: [email protected]

Mabel Rice

University of Kansas

1082 Dole Center

Lawrence, KS 66045

Work: (785) 864-4570

Email: [email protected]

Beth Rous

University of Kentucky

Human Development

330 Mineral Industries Building 0051

Lexington KY 40506

Work: (859) 257-9115

Email: [email protected]

Rosa Milagros Santos

University of Illinois

Special Education

284D Education Building

1310 S. 6th Street, MC 708

Champaign, IL 61820

Work: (217) 333-0260

Email: [email protected]

Patricia Snyder

Professor of Pediatrics

Center for Child Development

Vanderbilt University Medical Center

415 Medical Center South

2100 Pierce Ave.

Nashville, TN 37232-3578

Work: (615) 936-6739

Email: [email protected]

Patricia Addison

Fairfax County Public Schools

Belle Willard Administrative Center
10310 Layton Hall Drive
Fairfax, Virginia 22030
Work: (703) 246-7780

Email: [email protected]


Lynn Busenbark

Arizona Department of Education

1535 West Jefferson Bin 24

Phoenix, AZ 85007

Work: (602) 542-4013

Email: [email protected]

Anne Devaney

Baltimore City Public Schools

Armistead Gardens Elementary School

5001 East Eager Street

Baltimore, MD 21205

Work: (410) 396-9090




9. Reimbursement of Respondents


Table 1 outlines a revised incentive structure for PEELS participants. It removes the cost of incentives for the Elementary School Principal and Early Childhood Program Director Questionnaires, which were $20 gift certificates to Amazon.com.


Table 1. Revised Incentive Structure for PEELS Participants—Waves 3 and 4

Data collection | Incentive | Administration procedures
Parent CATI | $20 check | Enclosed with advance letter
Early childhood teacher/kindergarten teacher/elementary school teacher questionnaire | $10 cash | Included with questionnaire
Direct child assessment | $1 toy | Provided at time of assessment
Direct child assessment | $15 gift certificate | Given at the time of assessment to families who allowed assessments to be conducted in their homes or who transported children to another location for assessment
Child Status Report | $10-$30 check | Sent to Site Coordinators upon receipt of completed CSR


10. Assurances of Confidentiality


PEELS respondents are assured that confidentiality will be maintained, except as required by law. The design of the study addresses state and local concerns regarding the Family Educational Rights and Privacy Act (FERPA) and operates in accordance with the Privacy Act of 1974, as amended, (5 U.S.C. 552a). Specific steps to guarantee confidentiality include the following:



  • Information gleaned from rosters (e.g., respondent name, address, and telephone number) is not entered into the analysis data file, but is kept separate from other data and is password protected. A unique identification number for each respondent is used for building raw data and analysis files.

  • Information that can be used to identify an individual, including name, contact information, school name, or unique identifier, will not be included in data files provided to the public.

  • In public reports, findings are presented in aggregate by type of respondent (e.g., parents' perceptions of service delivery) or for subgroups of interest (e.g., academic performance of students with learning disabilities). No reports identify individual respondents, local programs, or schools.

  • Access to the student sample files is limited to authorized study staff.

  • All members of the study team are briefed regarding confidentiality of the data. Each person involved in the study on all participating research teams is required to sign a written statement attesting to his/her understanding of the significance of the confidentiality requirement. Those affidavits of nondisclosure are on file.

  • A control system is in place, which began at sample selection, to monitor the status and whereabouts of all data collection instruments during transfer, processing, coding, and data entry.

  • All data are stored in secure areas accessible only to authorized staff members. Computer-generated output containing identifiable information is maintained under the same conditions.

  • As approved by the IES Disclosure Review Board (DRB), PEELS micro-level data will be released only through a restricted-use data set and a data analysis system (DAS). The restricted-use data set (which will also run behind the DAS) will undergo data swapping to protect the confidentiality of respondents.


11. Sensitive Items


There are no questions of a sensitive nature included in any of the data collections. Parents/guardians were asked to respond concerning their experiences with special education and other education programs and special services, nonschool experiences, their demographic characteristics, and the abilities of their children. Parents/guardians were informed that they could decline to answer any item during the telephone interview. Administrators and teachers were asked to report on specific activities, programs, and services for sample children, children’s classroom experiences, and their own demographic characteristics.



12. Estimates of Burden


This request pertains only to burden for Wave 4. The burden estimates are lower than previously estimated because standardized tests, which were reflected in prior burden estimates, have been removed from the estimates. Exhibit 2 has been revised to reflect the deletion of the Early Childhood Program Director and Elementary School Principal Questionnaires for Wave 4, which previously accounted for 20 minutes per completed questionnaire for 83 questionnaires (1,660 minutes). It also reflects the request to include Cohort B in Wave 4; Cohort B had previously been omitted from the Wave 4 data collection. In addition, it reflects the proposed revisions to the teacher questionnaires for Wave 4 (i.e., eliminating the Vineland Motor Skills scale and the importance ratings for the SSRS). In Waves 1-3, the burden estimate for these questionnaires was 30 minutes; with the proposed items removed, we estimate that they will take 23 minutes to complete. Finally, the burden estimate reflects the modification to the parent interview for Wave 4, which is expected to average 15 minutes, down from 60 minutes in Waves 1-3. While this package specifies components of a new Wave 4 assessment, this does not change the burden, because the assessment used in Waves 1-3 and the assessment proposed for Wave 4 have the same length, 45 minutes.


13. Estimated Annual Cost Burden to Respondents


Respondent costs result from the investment of time in completing questionnaires (e.g., school staff completing mail questionnaires, families responding to telephone interviews). Estimates of response time for each data collection instrument are presented in Exhibit 2 in response to Item 12 above. No dollar costs have been associated with the time estimates because salaries of school personnel vary widely, and no standard valuation of parent time is available.


14. Estimated Annual Cost Burden to the Federal Government


The final cost for Wave 2 was $1,379,426, and the final cost for Wave 3 was $1,583,076. We continue to estimate that costs for Wave 4 will be slightly lower than those for Wave 3.


15. Program Changes in Cost Burden


The estimated number of completed administrations for Wave 4, as reported in Item 13 of Form OMB 83-I, is higher than previously estimated for the following reasons:



  • Response rates were higher than anticipated in Wave 2 for all modes of data collection. For planning purposes, we continue to assume a 5 percent annual attrition rate for future waves, based on experience from other national studies. However, this rate is now applied to a larger base.

  • There were plans to exclude children in Cohort B from Wave 4, which was reflected in previous burden estimates. However, NCSER would like to include all children in Wave 4 to enhance analysis capabilities.


  • A shortened version of the parent interview will reduce the length of that data collection from 60 minutes to 15 minutes.


  • Eliminating 59 items from the Early Childhood Teacher, Kindergarten Teacher and Elementary School Teacher questionnaires will reduce the estimated time needed to complete the questionnaires from 30 minutes to 23 minutes.


Item 14 of OMB Form 83-I does not reflect program changes from what was previously reported.


16. Plans/Schedules for Tabulation and Publication


Similar descriptive analyses, reports, and publications are proposed for each wave of PEELS data collection. A revised version of Table 2 is provided that contains planned completion dates for the Wave 3 and 4 reports.


Beginning with Wave 2 data, we will use change scores in achievement to examine the relationship between predictors and performance for children with disabilities. When data are available for three or more waves, the longitudinal dimension will add another level to the Hierarchical Linear Modeling (HLM), making it a three-level model with longitudinal data points at the lowest level nested within children and children nested within districts. HLM will allow us to address change within children and between children simultaneously (Singer and Willett, 2003). Each child’s growth curve will be estimated as an individual trajectory over the waves.
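For readers unfamiliar with the three-level specification, a minimal sketch of an unconditional growth model of the kind described above (in the spirit of Singer and Willett, 2003) follows; the notation and the linear time trend are illustrative assumptions rather than the final PEELS model.

```latex
% Level 1: assessment occasion t for child i in district j
Y_{tij} = \pi_{0ij} + \pi_{1ij}\,\mathrm{TIME}_{tij} + e_{tij}

% Level 2: children within districts
\pi_{0ij} = \beta_{00j} + r_{0ij}, \qquad \pi_{1ij} = \beta_{10j} + r_{1ij}

% Level 3: districts
\beta_{00j} = \gamma_{000} + u_{00j}, \qquad \beta_{10j} = \gamma_{100} + u_{10j}
```

Child-, service-, and program-level predictors would enter at Levels 2 and 3 to model differences in the growth parameters between children and between districts.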


Beginning with Wave 2, Westat also proposes to begin a series of analyses to explore which children leave special education (through declassification) and which children change disability categories (through reclassification). On an annual basis, we will be able to describe the number of children who received special education services, the number who left during the year, and the number of children whose declassification status or timing was unknown. In addition, we will describe the proportion of children who left in any given year and the proportion remaining.




Exhibit 2. Revised Estimates of Waves 3 and 4 Respondent Burden

Instrument | Respondent | Actual number completed in Wave 1 (a) | Actual number completed in Wave 2 (b) | Anticipated number completed in Wave 3 (c) | Anticipated number completed in Wave 4 (d) | Minutes per completion (e) | Waves 3 & 4 burden in minutes ((c+d) x e)
Family/Parent Interview | Parents and guardians | 2,802 | 2,893 | 2,748 | 2,611 | 54 for Wave 3; 15 for Wave 4 | 187,557
Teacher Questionnaire (Early Childhood, Kindergarten, and Elementary) | Teachers | 2,180 | 2,381 | 2,262 | 2,149 | 23 | 101,453
Child Status Report | Site Coordinators | ---- | 205 | 223 | 223 | 30 | 13,380
PEELS Direct Assessment | Participating children | 2,437 | 2,704 | 2,569 | 2,440 | 45 | 225,405
PEELS Alternate Assessment | Teachers | 355 | 228 | 217 | 206 | 15 | 6,345
TOTAL BURDEN | | | | | | | 534,140


Notes: An additional 15 districts and 198 children were added to the study in Wave 2.

To calculate the annual reporting and recordkeeping burden on the 83-I form, we used an average of the anticipated numbers for Waves 3 and 4.
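As a simple arithmetic check on Exhibit 2, the sketch below re-multiplies the anticipated Wave 3 and Wave 4 completions by the minutes per completion and reproduces the total burden; it uses only the figures shown in the exhibit.

```python
# Hedged arithmetic check of the Exhibit 2 burden figures (minutes).
# Each entry is ((Wave 3 completions, minutes), (Wave 4 completions, minutes)).
rows = {
    "Family/Parent Interview":    ((2748, 54), (2611, 15)),
    "Teacher Questionnaire":      ((2262, 23), (2149, 23)),
    "Child Status Report":        ((223, 30), (223, 30)),
    "PEELS Direct Assessment":    ((2569, 45), (2440, 45)),
    "PEELS Alternate Assessment": ((217, 15), (206, 15)),
}

total = 0
for name, ((n3, m3), (n4, m4)) in rows.items():
    burden = n3 * m3 + n4 * m4  # combined Wave 3 and Wave 4 minutes
    total += burden
    print(f"{name}: {burden:,} minutes")
print(f"TOTAL BURDEN: {total:,} minutes")  # 534,140
```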

Table 2. Revised Schedule of Waves 3 and 4 Reporting Activities

Task: Final Wave Methods Reports
Estimated completion dates: 12/15/06 (Wave 3); 12/15/07 (Wave 4)
Comments/outlines:

Proposed Title: Study Methods From PEELS Wave 3/4

Including the following chapters:

  • Study Design

  • Sampling

  • Instrumentation

  • Data Collection

  • Data Preparation (including weighting and imputation)

  • Data Analysis

Appendices including:

  • Instruments

  • Sampling Allocation

Task: Final Report of CATI Data
Estimated completion dates: 4/14/07 (Wave 3); 4/14/08 (Wave 4)
Comments/outlines:

Proposed Title: Parents’ Perspectives on their Young Children with Disabilities and the Services They Receive

Including the following chapters:

  • Background Characteristics

  • Health and Disability

  • Child Behavior

  • (Pre)School Programs and Services

  • Special Education Services

  • Child Care

  • Out-of-School Activities

  • Summary and Implications

Task: Final Report of Questionnaire Data
Estimated completion dates: 4/14/07 (Wave 3); 4/14/08 (Wave 4)
Comments/outlines:

Proposed Title: Teachers’ and Administrators’ Perspectives on Young Children with Disabilities and the Services They Receive

Including the following chapters:

  • Child/Family Characteristics

  • School and Program Characteristics

  • Classroom Characteristics

  • Special Education Services

  • Teacher Characteristics

  • State and Local Policies

  • Children’s Outcomes

  • Summary and Implications

Task: Final Report of Assessment Data
Estimated completion dates: 4/14/07 (Wave 3); 4/14/09 (Wave 4)
Comments/outlines:

Proposed Title: Assessment Results for Young Children with Disabilities

The report will be organized by subtest or scale. Results from the indirect assessments will also be included.

Task: Briefing Booklet/Slides
Estimated completion dates: 4/15/07 (Wave 3); 4/15/08 (Wave 4)
Comments/outlines:

Including a sample of slides used in various conference presentations for NCSER to use on an as-needed basis.

Task: Thematic Reports
Estimated completion dates: 2/20/07 (Wave 3); 2/20/08 (Wave 4)
Comments/outlines:

Westat will prepare a series of thematic reports that focus on a specific topic of interest to the field. These will be suitable for submission to a refereed journal.



17. Approval for Omission of Expiration Date


Not applicable.


18. Exceptions


No exceptions are taken.



