

United States Food and Drug Administration

Center for Tobacco Products


The Real Cost Campaign Outcomes Evaluation Study: Cohort 3 (Outcomes Study)


OMB Control Number 0910-0915


Supporting Statement Part B

B. Statistical Methods


  1. Respondent Universe and Sampling Methods


The Outcomes Study is a longitudinal survey of a probability sample of approximately 6,000 youth and a supplemental sample of 1,500 youth, assessed at baseline and five follow-up waves to evaluate The Real Cost health education campaign. This longitudinal design allows us to calculate baseline-to-follow-up changes in campaign-targeted outcomes for each study participant. We hypothesize that if the campaign is effective, the baseline-to-follow-up changes in outcomes will be larger among individuals who report greater exposure to the campaign (i.e., dose-response effects). Eligible youth are aged 11 to 20 at baseline and will be 14 to 24 by the end of data collection, including respondents recruited at subsequent follow-up waves. For the Outcomes Study, age is the only screening criterion. The survey is being conducted by RTI International (RTI).


The main data collection of the Outcomes Study will survey approximately 6,000 youth ages 11-17 at baseline. We will recruit a replenishment sample of approximately 2,160 youth ages 11-20 at follow-ups 2 and 4, resulting in a total of 6,000 completed surveys at each of those waves (main data collection plus replenishment sample). Additionally, at baseline, we will recruit a supplemental sample of approximately 1,500 youth ages 14-20 who identify as LGBTQ+ or have a mental health disorder.

The main data collection and replenishment samples of the Outcomes Study will be recruited from a stratified random sample of addresses selected from RTI's national address-based sampling (ABS) frame. The ABS frame is maintained in-house and based on the U.S. Postal Service's Computerized Delivery Sequence (CDS) file, which we receive as monthly updates, keeping the frame as current as possible. Prior to selecting the address sample, the national frame will be stratified by each address's predicted probability of containing youth ages 11-17 and its predicted probability of response. The sample will be allocated to these strata to balance data collection costs and statistical precision.
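For illustration, the sketch below applies a cost-adjusted Neyman-style allocation across strata of this kind. The stratum names, frame counts, variability assumptions, and relative cost figures are hypothetical placeholders, not the study's actual design parameters.

import math

# Illustrative only: cost-adjusted allocation of the 330,000 sampled addresses
# across strata defined by predicted youth presence and predicted response
# propensity. All stratum parameters below are hypothetical.
strata = {
    # name: (frame addresses, outcome std. dev., relative cost per address)
    "high_youth_high_response": (18_000_000, 0.9, 1.0),
    "high_youth_low_response":  (22_000_000, 1.0, 1.4),
    "low_youth_high_response":  (45_000_000, 1.1, 1.1),
    "low_youth_low_response":   (55_000_000, 1.2, 1.6),
}
total_sample = 330_000  # selected addresses (Exhibit 1)

# Allocate n_h proportional to N_h * S_h / sqrt(c_h).
raw = {name: N * S / math.sqrt(c) for name, (N, S, c) in strata.items()}
scale = total_sample / sum(raw.values())
for name, value in raw.items():
    print(f"{name}: {round(value * scale):,} addresses")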

The supplemental sample will be a convenience sample recruited online through social media. We will place advertisements on social media websites. Respondents who click on the ads will be directed to the screening instrument and then to the main assent or consent form if they qualify for the study. Eligible youth will be aged 14 to 20 and identify as LGBTQ+ and/or have a mental health disorder.

Exhibit 1. Addresses and the Associated Assumptions to Yield the Needed Number of Completes

Activity | National Sample (All Youth)
Selected addresses | 330,000
Occupied housing units | 316,800 (96%)
Screened households | 66,000 (20% of selected addresses)
Households with eligible youth | 16,500 (25%)
Eligible youth responses (baseline completes) | 6,000 (36%)
Wave 2 (1st follow-up) completes | 4,800 (80%)
Wave 3 (2nd follow-up) total completes | 6,000 (3,840 + 2,160)
Wave 3 (2nd follow-up) completes | 3,840 (80% of Wave 2)
Wave 3 (2nd follow-up) replenishment addresses | 145,000
Wave 3 (2nd follow-up) replenishment screened households | 72,500 (50%)
Wave 3 (2nd follow-up) replenishment completes | 2,160 (3%)
Wave 4 (3rd follow-up) completes | 4,800 (80%)
Wave 5 (4th follow-up) total completes | 6,000 (3,840 + 2,160)
Wave 5 (4th follow-up) completes | 3,840 (80% of Wave 4)
Wave 5 (4th follow-up) replenishment addresses | 145,000
Wave 5 (4th follow-up) replenishment screened households | 72,500 (50%)
Wave 5 (4th follow-up) replenishment completes | 2,160 (3%)
Wave 6 (5th follow-up) total completes | 4,800 (80%)

Baseline Supplemental Convenience Sample
Baseline supplemental screened respondents | 5,000
Baseline supplemental completes | 1,500 (30%)
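As a check on Exhibit 1, the short calculation below recomputes the implied rates from the counts in the table (percentages in the exhibit are rounded).

# Recompute the implied rates from the Exhibit 1 counts.
selected_addresses = 330_000
occupied_units = 316_800
screened_households = 66_000
eligible_households = 16_500
baseline_completes = 6_000
replenishment_addresses = 145_000
replenishment_screened = 72_500
replenishment_completes = 2_160

print(f"Occupancy rate: {occupied_units / selected_addresses:.0%}")                 # 96%
print(f"Screening rate: {screened_households / selected_addresses:.0%}")            # 20%
print(f"Eligibility rate: {eligible_households / screened_households:.0%}")         # 25%
print(f"Baseline completion rate: {baseline_completes / eligible_households:.0%}")  # 36%
print(f"Replenishment screening rate: {replenishment_screened / replenishment_addresses:.0%}")   # 50%
print(f"Replenishment completion rate: {replenishment_completes / replenishment_screened:.0%}")  # 3%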

  2. Procedures for the Collection of Information

Outcomes Study – Baseline Data Collection

At baseline, RTI will mail recruitment and screening materials to approximately 330,000 households. We expect to receive approximately 200,000 completed screeners and to identify 4,000 eligible households with 6,000 eligible youth who will complete a baseline survey. The recruitment and study materials will consist of a sealed invitation letter and two sealed postcards containing login credentials that invite an adult in the household to access the study web page, learn more about the study, and complete an online screener. An adult household member will complete the online screener, which will determine eligibility. RTI received a waiver of parental permission from Advarra IRB for youth ages 14 to 17 (or 14 to 18 in Alabama and Nebraska, in accordance with state law); these youth will be routed directly to the youth assent screen. After parental permission and youth assent are obtained, RTI will invite youth to participate in the study by routing them to the baseline survey. Households will be assigned login credentials, which will be sent to them through the invitation postcard or invitation letter. Following parental permission, parents and selected youth will be emailed links and credentials that youth can use to complete the survey, log back in to finish it, or start it if they were not available at the time of screening (Attachment 23). The survey will be hosted on RTI's secure servers using Blaise. Data are encrypted in transit using HTTPS and stored in secure Structured Query Language (SQL) databases.
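As a simplified illustration of the assent routing described above, the sketch below encodes the waiver rule for youth ages 14 to 17 (14 to 18 in Alabama and Nebraska). The function name, screen labels, and structure are hypothetical; the fielded instrument is programmed in Blaise.

WAIVER_STATES = {"AL", "NE"}  # waiver of parental permission extends through age 18

def next_screen(age: int, state: str, parental_permission_obtained: bool) -> str:
    """Return the next screen for an eligible rostered youth (simplified)."""
    waiver_upper = 18 if state in WAIVER_STATES else 17
    if 14 <= age <= waiver_upper:
        return "youth_assent"          # routed directly under the waiver
    if parental_permission_obtained:
        return "youth_assent"          # younger youth reach assent after permission
    return "parental_permission"

print(next_screen(15, "MD", parental_permission_obtained=False))  # youth_assent
print(next_screen(12, "MD", parental_permission_obtained=False))  # parental_permission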


In addition to the main data collection for the Outcomes Study, RTI will recruit an additional 1,500 participants ages 14 to 20 through social media (e.g., Facebook, Instagram) to complete the online self-administered survey. RTI has received a waiver of parental permission for participants ages 14 to 17. Participants who click on the social media ads will be directed to a screener to determine eligibility. If eligible, they will be routed to the assent or consent form and then to the baseline survey hosted on Qualtrics' secure servers. Screener data and survey data will be stored separately on Qualtrics servers and encrypted at rest. RTI will use a Secure Sockets Layer (SSL) connection to download data from Qualtrics servers to RTI servers. The final study sample at baseline, combining the main and supplemental data collections, is approximately 7,500 youth. RTI will administer the online surveys to subpopulations shown to be at higher risk of initiating use of cigarettes and electronic nicotine delivery systems (ENDS), such as youth who identify as LGBTQ+ and youth who have a mental health disorder.


In addition to the extensive and growing body of literature showing tobacco use disparities among LGBTQ+ populations, the White House issued the Executive Order on Advancing Equality for Lesbian, Gay, Bisexual, Transgender, Queer, and Intersex Individuals, which includes obligations for federal agencies to collect sexual orientation and gender identity (SOGI) data. The order states that "advancing equity and full inclusion for LGBTQI+ individuals requires that the Federal Government use evidence and data to measure and address the disparities that LGBTQI+ individuals, families, and households face." It also states that federal agencies must "describe disparities faced by LGBTQI+ individuals that could be better understood through Federal statistics and data collection" (White House, 2022).


Along with these requirements to collect SOGI data from the highest levels of the federal government, LGBTQ+ community advocacy organizations do not discourage the collection of SOGI data from youth and routinely conduct surveys on the health of LGBTQ+ youth. These organizations rely on data to serve, support, and advocate for LGBTQ+ populations. Two such surveys are The Human Rights Campaign LGBTQ+ Youth Report (The Human Rights Campaign Foundation, 2018) and The Trevor Project National Survey on LGBTQ Youth Mental Health (The Trevor Project, 2020). From these and other data collection efforts, researchers and LGBTQ+ advocates are using data to gain important insights into the disparities that these youth experience and to identify opportunities to address those disparities. To remain in line with community advocacy organizations, the federal government must not exclude LGBTQ+ youth from federal data collections.


Outcomes Study – Follow-up Data Collection

All respondents who complete the baseline survey will be invited to participate in each of five follow-up surveys, which will occur approximately every six months over a four-year period. As the cohort will be aging over the study period, the data collected throughout the study will reflect information from youth ages 11 to 24.


We estimate that we will lose approximately 20% of respondents at each wave of data collection. Therefore, at follow-ups 2 and 4, we will replenish the sample by sending additional "baseline" screeners to new households. We will mail recruitment and study materials to an additional 145,000 households and estimate that we will receive 72,500 completed screeners. For eligible households, we will ask the parent/guardian to list all eligible youth in the household who can be selected for participation in the study, a process called rostering. Replenishing the sample will allow us to obtain 6,000 youth respondents at follow-ups 2 and 4 (3,840 from the original main sample and 2,160 from the replenishment sample) and to maintain a minimum study sample of 4,800 respondents at all study waves. Additionally, we will have a convenience sample of 1,500 youth at baseline. We estimate that we will lose approximately 20% of the baseline supplemental sample at each follow-up wave, resulting in 1,200 participants at follow-up 1, 960 at follow-up 2, 768 at follow-up 3, 614 at follow-up 4, and 492 at follow-up 5.
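The projected supplemental-sample sizes can be verified with the short calculation below (20% loss per wave from 1,500 baseline completes, rounded).

# Projected supplemental convenience sample size at each follow-up,
# assuming 20% attrition per wave from 1,500 baseline completes.
baseline_supplemental = 1_500
for followup in range(1, 6):
    print(f"Follow-up {followup}: {round(baseline_supplemental * 0.80 ** followup)}")
# Prints 1200, 960, 768, 614, 492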


The youth surveys will include the same set of items at baseline and follow-up, with the exception of items revised to reflect changes in campaign messaging over time. The youth survey instrument includes measures of demographics; tobacco use behavior; intentions to use tobacco; media use and awareness; environmental questions; awareness of and exposure to the campaign materials; and outcome constructs. Outcome constructs include beliefs targeted by messages; the impact of the campaign on psychosocial predictors and precursors of tobacco use behavior; health and addiction risk perceptions; perceived loss of control or threat to freedom expected from tobacco use; anticipated guilt, shame, and regret from tobacco use; tobacco use susceptibility; intention or willingness to use tobacco; and intention to quit and/or reduce daily consumption.


Additionally, the youth survey instrument will include items that measure self-reported exposure to other tobacco use prevention media campaigns. Data from these items, along with demographics and other potential confounding influences, will be included in regression models as controls to help isolate the campaign effect, similar to published analyses evaluating previous cohorts of The Real Cost and other longitudinal media studies (Duke et al., 2018; Duke et al., 2019; Farrelly et al., 2009; Farrelly et al., 2017; MacMonegle et al., 2022).
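As an illustration of the kind of model described above, the sketch below regresses a campaign-targeted belief on self-reported campaign exposure with controls for demographics and exposure to other campaigns. The data file, variable names, and clustering variable are hypothetical placeholders; the published evaluations also account for the complex survey design and weights.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file with one row per respondent-wave.
df = pd.read_csv("outcomes_analysis_file.csv")

model = smf.ols(
    "belief_index ~ campaign_exposure + other_campaign_exposure"
    " + age + C(gender) + C(race_ethnicity) + baseline_belief_index",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["psu_id"]})  # cluster-robust errors

print(model.summary())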


The attachments are provided in both English and Spanish. We will not recruit separate English-speaking and Spanish-speaking samples for this study; we are simply providing Spanish-language consent/assent forms and surveys for participants who prefer them over the English-language versions. Regardless of the language in which respondents complete the consent/assent forms and surveys, the estimated burden hours are identical.


Power

We determined the effect sizes for two different research questions:


  1. Is there a relationship between the amount of exposure to campaign advertising and the average score on outcome constructs?

  2. Is the effect of awareness of advertising on perceived severity moderated by the data collection period (i.e., is there a data collection wave-by-treatment interaction)?


To determine the effect sizes, we simulated data that had the structure and effective sample size of the proposed design. We used data from the evaluation of The Real Cost campaign, Cohort 2, to estimate patterns of treatment effects and correlations within primary sampling units and across individuals.


Analysis 1. The proposed study has 80% power to detect a relationship between awareness of advertising and perceived severity if the effect size is 0.12, which is considered a small effect size. Exhibit 2 displays the detectable relationship between perceived severity and the level of advertising awareness, expressed as mean values. The population standard deviation within each level of awareness is 1. Adding a constant to each value will not affect the power.


Exhibit 2. Mean value of perceived severity given the level of advertising awareness.


Awareness of advertising | 0 | 1 | 2 | 3 | 4
Perceived severity | 2.938 | 2.969 | 3.000 | 3.031 | 3.062
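The simulation sketch below illustrates how a power calculation of this kind can be checked: perceived severity is generated from the Exhibit 2 means (a slope of about 0.031 per awareness level, standard deviation 1) and a simple regression is tested across replications. The effective sample size is a placeholder chosen for illustration; the study's own calculation used the effective sample size and correlation structure estimated from Cohort 2.

import numpy as np
from scipy import stats

rng = np.random.default_rng(20240915)
n_effective = 4_000      # placeholder effective sample size (illustration only)
replications = 1_000
alpha = 0.05

rejections = 0
for _ in range(replications):
    awareness = rng.integers(0, 5, size=n_effective)                    # levels 0-4
    severity = 2.938 + 0.031 * awareness + rng.standard_normal(n_effective)
    result = stats.linregress(awareness, severity)
    rejections += result.pvalue < alpha

print(f"Estimated power: {rejections / replications:.2f}")   # roughly 0.8 under these assumptions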

Analysis 2. Simulation approach

The proposed study has 80% power to detect an interaction between awareness of advertising and data collection wave in a model that predicts perceived severity when the coefficient of the interaction is 0.03 and the population standard deviation of the outcome (perceived severity) is 1. The following formula describes the relationship that has 80% power:


Perceived severity = 3 + 0.03 × wave × awareness + N(0,1)


where N(0,1) is a random variable from a standard normal distribution. Adding a constant to this equation will not affect the power. Exhibit 3 displays the mean perceived severity, by level of advertising awareness and wave, corresponding to the relationship we can detect with 80% power.


Exhibit 3. Mean value of perceived severity based on the value of awareness and wave

Mean perceived severity
Wave | Awareness 0 | Awareness 1 | Awareness 2 | Awareness 3 | Awareness 4
Baseline | 3.00 | 3.00 | 3.00 | 3.00 | 3.00
Follow-up 1 | 3.00 | 3.03 | 3.06 | 3.09 | 3.12
Follow-up 2 | 3.00 | 3.06 | 3.12 | 3.18 | 3.25
Follow-up 3 | 3.00 | 3.09 | 3.18 | 3.28 | 3.37
Follow-up 4 | 3.00 | 3.12 | 3.22 | 3.37 | 3.49
Follow-up 5 | 3.00 | 3.15 | 3.28 | 3.46 | 3.61
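The sketch below illustrates the simulation approach for Analysis 2: data are generated from the formula above and the wave-by-awareness interaction is tested across replications. The per-wave sample size is a small placeholder with independent observations so the example runs quickly; the study's own simulation used the design's effective sample size and the correlations estimated from Cohort 2.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(20240916)
n_per_wave = 250        # placeholder sample size (illustration only)
replications = 500

def one_replicate() -> bool:
    waves = np.repeat(np.arange(6), n_per_wave)               # baseline + 5 follow-ups
    awareness = rng.integers(0, 5, size=waves.size)           # levels 0-4
    severity = 3 + 0.03 * waves * awareness + rng.standard_normal(waves.size)
    data = pd.DataFrame({"wave": waves, "awareness": awareness, "severity": severity})
    fit = smf.ols("severity ~ wave * awareness", data=data).fit()
    return fit.pvalues["wave:awareness"] < 0.05

power = np.mean([one_replicate() for _ in range(replications)])
print(f"Estimated power: {power:.2f}")   # approximately 0.8 under these simplified assumptions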


Data Suppression Techniques


An additional approach to securing sensitive data will be to employ data suppression techniques to protect survey respondents' personally identifiable information (PII). Data suppression is a readily applied technique in which estimates are not reported if they could result in disclosure of a participant's identity or are deemed unstable (i.e., of low precision).


Consistent with well-established guidelines followed by the Center for Tobacco Products (CTP) Office of Science (OS), as well as the National Center for Health Statistics (NCHS), data suppression will be used if either of the following conditions is met: (1) the coefficient of variation of the proportion or estimate is greater than 30%, or (2) n < 50, where n is the unweighted sample size in the denominator of the estimated proportion or the denominator used for calculating the estimate.


To further reduce disclosure risk, we will follow additional established guidance from CTP/OS (a minimal illustration of these checks follows the list):

  • Each estimate (or table cell) must be generated based on an unweighted numerator of at least 3. This includes means, totals, numerators of proportions, all table cell counts, and marginal counts. If an estimate is based on a numerator of 1 or 2, it will be combined with another category.

  • We will work to ensure table differencing (i.e., calculating the sample size of a small cell from cells of another related table) does not occur by using consistent categories across tables.

  • Continuous/ordered variables will be presented so that extreme values pertaining to an individual are not evident.
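A minimal sketch of these suppression checks is shown below. The function is illustrative only; in practice, estimates with numerators of 1 or 2 are combined with another category rather than simply flagged.

def flag_for_suppression(estimate: float, standard_error: float,
                         unweighted_numerator: int, unweighted_denominator: int) -> bool:
    """Return True if the estimate should be suppressed under the rules above."""
    if unweighted_denominator < 50:
        return True                                   # denominator rule: n < 50
    if unweighted_numerator < 3:
        return True                                   # numerator rule: fewer than 3 cases
    if estimate != 0 and abs(standard_error / estimate) > 0.30:
        return True                                   # coefficient of variation > 30%
    return False

print(flag_for_suppression(0.12, 0.05, 40, 600))   # True (CV is about 42%)
print(flag_for_suppression(0.12, 0.02, 40, 600))   # False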

  3. Methods to Maximize Response Rates and Deal with Nonresponse

The ability to recruit potential respondents for the baseline survey and maintain their participation across all survey waves will be important to the success of this study.


At baseline and at each wave of follow-up data collection, youth respondents who participate in the main data collection will be offered a $30 incentive to complete the survey during an early release period that will run for approximately three weeks, and a $25 incentive to complete the survey after the early release period. For the supplemental baseline data collection, youth will receive a $25 incentive. Youth in the supplemental baseline data collection will participate in the main data collection at follow-up and will receive a $30 incentive (or a $25 incentive if they complete the survey more than three weeks after the start of the data collection wave). Studies suggest that this incentive approach can increase response rates and reduce costs and nonresponse. In addition, the study will use procedures designed to maximize respondent participation. For example, e-mail and text message reminders will be sent to encourage participants to complete the survey, and we will direct respondents to a website that may be updated at each wave to show progress and encourage engagement.


For longitudinal analyses, the sample is limited to respondents who have completed each wave. Probability weights are generated for the longitudinal sample as well as for the full sample in each wave and are calibrated to help mitigate nonresponse bias. Methods such as data imputation may also be used to make fuller use of the data and address nonresponse bias.
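A minimal sketch of a weighting-class nonresponse adjustment of the kind referred to above is shown below; the grouping variable, base weights, and toy data are hypothetical, and the production weights also reflect selection probabilities and calibration to external population totals.

import pandas as pd

# Toy frame: base (design) weights and response indicators by weighting class.
frame = pd.DataFrame({
    "weight_class": ["A", "A", "A", "B", "B", "B"],
    "base_weight":  [10.0, 10.0, 10.0, 20.0, 20.0, 20.0],
    "responded":    [1, 1, 0, 1, 0, 0],
})

# Weighted response rate within each class.
sums = (frame.assign(resp_weight=frame.base_weight * frame.responded)
             .groupby("weight_class")[["resp_weight", "base_weight"]].sum())
response_rate = sums["resp_weight"] / sums["base_weight"]

# Inflate respondents' weights by the inverse response rate in their class;
# nonrespondents receive zero weight.
frame["adjusted_weight"] = frame["base_weight"] / frame["weight_class"].map(response_rate)
frame.loc[frame["responded"] == 0, "adjusted_weight"] = 0.0
print(frame)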

  4. Test of Procedures or Methods to be Undertaken

Prior to launching the baseline or follow-up surveys, we may field a nine-case usability test of the screening procedure and instrument. This usability testing will provide the study team with feedback from respondents who are similar to the target sample and help us determine whether the procedure needs adjustment to improve screening response rates. Based on this feedback, we may add instructions to the screener or adjust how documents are presented on the web.


Additionally, with a separate sample, we will field a nine-case cognitive interview pretest of selected items from the survey instrument, supplemented by a few additional prompting questions, to assess the overall clarity of the instrument questions and gather respondents' opinions on aspects of the survey that are unclear. The purpose of the cognitive interviews is to identify areas of the survey that are unclear or difficult to understand.


In addition to usability testing and cognitive interviews, RTI staff will conduct rigorous internal testing of the online screener and survey instrument prior to fielding at baseline. Evaluators will review the online test version of the instrument to verify that skip patterns function properly, that multimedia included in the survey functions properly, and that all survey questions are worded correctly and in accordance with the instrument approved by OMB. We will review diagnostic data on average survey completion time, survey completion patterns (e.g., whether there are any concentrations of missing data), and other aspects related to the proper functioning of the survey.
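The sketch below illustrates the kind of diagnostic review described above, run against a hypothetical test-case file; the file and column names are placeholders, not the production data.

import pandas as pd

test_cases = pd.read_csv("instrument_test_cases.csv")   # hypothetical test deck

# Flag unusually short or long completion times.
minutes = test_cases["completion_minutes"]
print(minutes.describe())
print(test_cases[(minutes < 5) | (minutes > 90)])

# Look for concentrations of missing data by item (possible skip-logic errors).
item_missing = test_cases.filter(like="q_").isna().mean().sort_values(ascending=False)
print(item_missing.head(10))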


Finally, minor revisions to the survey may be necessary given the media development process and possibility of changes in campaign implementation. We may remove a small number of items or response options from the survey if we find they are no longer relevant at the time of data collection. For example, items pertaining to a particular ad that is no longer on air may be removed. Other examples include if a particular tobacco product is no longer on the market or if a particular type of streaming service is no longer available; these items would be removed from the survey as they are no longer relevant. However, every effort will be made to minimize changes to the survey.

  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The following individuals inside the agency have been consulted on the design and statistical aspects of this information collection as well as plans for data analysis:


Debra Mekos

Social Scientist

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 301-796-8754

E-mail: [email protected]



Lindsay Pitzer

Senior Scientist

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-620-9526

E-mail: [email protected]


Hibist Astatke

Social Scientist

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 301-796-1038

E-mail: [email protected]


The following individuals outside the agency have been consulted on the survey development, statistical aspects of the design, plans for data analysis, and will conduct data collection and analysis:


Anna MacMonegle

Public Health Manager

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-990-8427

E-mail: [email protected]


Nathaniel Taylor

Public Health Program Manager

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-316-3523

E-mail: [email protected]


LeTonya Chapman

Research Public Health Analyst

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC 27709

Phone: 770-407-4928

E-mail: [email protected]



James Nonnemaker

Senior Research Economist

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7064

E-mail: [email protected]


Chris Ellis

Senior Survey Director

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6480

E-mail: [email protected]


Patty LeBaron

Survey Methodologist

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC 27709

Phone: 312-777-5204

E-mail: [email protected]


Burton Levine

Senior Research Statistician

RTI International

3040 E. Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-1252

E-mail: [email protected]

References


Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the survey of income and program participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.

Castiglioni, L., Pforr, K., & Krieger, U. (2008). The effect of incentives on response rates and panel attrition: Results of a controlled experiment. Survey Research Methods, 2(3), 151–158.

Centers for Disease Control and Prevention. (2012). Youth Risk Behavior Surveillance—United States, 2011. Morbidity and Mortality Weekly Report, 61(4), 1–162.

Davis, K. C., Nonnemaker, J., Duke, J., & Farrelly, M. C. (2013). Perceived effectiveness of cessation advertisements: The importance of audience reactions and practical implications for media campaign planning. Health Communication, 28(5), 461–472. doi:10.1080/10410236.2012.696535

Davis, K. C., Uhrig, J., Bann, C., Rupert, D., & Fraze, J. (2011). Exploring African American women’s perceptions of a social marketing campaign to promote HIV testing. Social Marketing Quarterly, 17(3), 39–60.

Dillard, J. P., Shen, L., & Vail, R. G. (2007). Does perceived message effectiveness cause persuasion or vice versa? 17 consistent answers. Human Communication Research, 33, 467–488.

Dillard, J. P., Weber, K. M., & Vail, R. G. (2007). The relationship between the perceived and actual effectiveness of persuasive messages: A meta-analysis with implications for formative campaign research. Journal of Communication, 57, 613–631.

Duke, J. C., Farrelly, M. C., Alexander, T. N., et al. (2018). Effect of a national tobacco public education campaign on youth's risk perceptions and beliefs about smoking. American Journal of Health Promotion, 32(5), 1248–1256. doi:10.1177/0890117117720745

Duke, J. C., MacMonegle, A. J., Nonnemaker, J. M., et al. (2019). Impact of The Real Cost media campaign on youth smoking initiation. American Journal of Preventive Medicine, 57(5), 645–651. doi:10.1016/j.amepre.2019.06.011

Farrelly, M. C., Davis, K. C., Haviland, M. L., Messeri, P., & Healton, C. G. (2005). Evidence of a dose-response relationship between "truth" antismoking ads and youth smoking prevalence. American Journal of Public Health, 95(3), 425–431. doi:10.2105/AJPH.2004.049692

Farrelly, M. C., Nonnemaker, J., Davis, K. C., et al. (2009). The influence of the national truth campaign on smoking initiation. American Journal of Preventive Medicine, 36, 379–384. doi:10.1016/j.amepre.2009.01.019

Farrelly, M. C., Duke, J. C., Nonnemaker, J., et al. (2017). Association between The Real Cost media campaign and smoking initiation among youths—United States, 2014–2016. Morbidity and Mortality Weekly Report, 66(2).

The Human Rights Campaign Foundation (2018). 2018 LGBTQ Youth Report. Retrieved from https://hrc-prod-requests.s3-us-west-2.amazonaws.com/files/assets/resources/2018-YouthReport-NoVid.pdf

Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology, 34(1), 105–117.

Janega, J. B., Murray, D. M., Varnell, S. P., Blitstein, J. L., Birnbaum, A. S., & Lytle, L. A. (2004). Assessing the most powerful analysis method for schools intervention studies with alcohol, tobacco, and other drug outcomes. Addictive Behaviors, 29(3), 595–606.

MacMonegle, A. J., Smith, A. A., Duke, J., et al. (2022). Effects of a national campaign on youth beliefs and perceptions about electronic cigarettes and smoking. Preventing Chronic Disease, 19, E16. doi:10.5888/pcd19.210332

McMichael, J., & Chen, P. (2015). Using census public use microdata areas (PUMAs) as primary sampling units in area probability household surveys. In JSM Proceedings, Survey Research Methods Section, pp. 2281–2288. Alexandria: American Statistical Association.

Murray, D. M., & Blitstein, J. L. (2003). Methods to reduce the impact of intraclass correlation in group-randomized trials. Evaluation Review, 27(1), 79–103.

Murray, D. M., & Short, B. J. (1997). Intraclass correlation among measures related to tobacco-smoking by adolescents: Estimates, correlates, and applications in intervention studies. Addictive Behaviors, 22(1), 1–12.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15, 231–250.

Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse (p. 163–177). New York, NY: Wiley.

Snyder, L. B., Hamilton, M. A., Mitchell, E. W., Kiwanuka-Tondo, J., Fleming-Milici, F., & Proctor, D. (2004). A meta-analysis of the effect of mediated health communication campaigns on behavior change in the United States. Journal of Health Communication, 9, 71–96.

Substance Abuse and Mental Health Services Administration (SAMHSA). (2012). Results from the 2011 National Survey on Drug Use and Health: Summary of national findings. NSDUH Series H-44, HHS Publication No. (SMA) 12-4713. Rockville, MD: Substance Abuse and Mental Health Services Administration.

The Trevor Project (2020). National Survey on LGBTQ Youth Mental Health 2020. Retrieved from https://www.thetrevorproject.org/survey-2020/

U.S. Department of Health and Human Services (USDHHS). (2006). The health consequences of involuntary exposure to tobacco smoke: A report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, Coordinating Center for Health Promotion, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.

Wakefield, M. A., Spittal, M. J., Yong, H-H., Durkin, S. J., & Borland, R. (2011). Effects of mass media campaign exposure intensity and durability on quit attempts in a population-based cohort study. Health Education Research, 26(6), 988–997.

The White House. (2022). Executive Order on Advancing Equality for Lesbian, Gay, Bisexual, Transgender, Queer, and Intersex Individuals. Retrieved from https://www.whitehouse.gov/briefing-room/presidential-actions/2022/06/15/executive-order-on-advancing-equality-for-lesbian-gay-bisexual-transgender-queer-and-intersex-individuals/



