Evaluation of the Food and Drug Administration's 'Fresh Empire' Multicultural Youth Tobacco Prevention Campaign

OMB: 0910-0788


B. Statistical Methods

  1. Respondent Universe and Sampling Methods


The primary outcome study consists of a pre-test survey and post-test cross-sectional surveys, with an embedded longitudinal cohort in campaign and control cities, beginning approximately 6 months after campaign launch. The primary data collection strategy uses address-based sampling (ABS) to screen households for eligible youth, followed by field interviews with eligible youth. All youth who participated in a previous survey wave will be re-contacted at each follow-up wave up to the age of 18. As needed to account for attrition, these longitudinal data will be supplemented with cross-sectional data in order to achieve our target sample size for each wave (n = 2,100). Based on attrition observed in the ExPECTT study, both overall and among racial/ethnic minority youth, we expect that 65% of youth who completed a survey will complete the next survey.

The pre-test wave of data collection included 2,194 multicultural youth ages 12 to 17 who were predominantly African American, Hispanic, and Asian/Pacific Islander. We collected half of the sample from 15 campaign cities and the other half from 15 comparison cities. The post-test data collection sample size will be 10,500, equally split between campaign and comparison cities.

The goal is to complete this data collection through ABS. However, this strategy will be supplemented by recruitment of youth through social media, such as Twitter, Facebook, and Instagram, followed by parental permission and online data collection. We originally estimated that between 2% and 7% of youth screened via social media would complete the screener, obtain parental permission, and complete the survey. These estimates were based on the scant literature on this topic, which focused entirely on Facebook for recruitment. We originally planned to recruit up to 500 youth ages 13 to 17 through social media (Twitter and Facebook) for the pre-test survey. In actuality, fewer participants were recruited through social media than anticipated for the pre-test survey. We have had much greater success recruiting participants via social media in the subsequent post-test surveys. In the post-test surveys, we use additional social media platforms, such as Instagram, to recruit youth. We are no longer using Twitter as part of the recruitment strategy because of its higher advertising costs and lower survey completion rates compared with Facebook and Instagram. In the second post-test survey, we found that 47% of youth who screened as eligible completed the full survey. Of the 10,500 post-test surveys, approximately 3,500 will be completed by youth recruited through social media, such as Facebook and Instagram.


The media contractor developed a list of 60 potential cities for the campaign. From the 60 potential cities, 15 were randomly selected to serve as comparison cities, leaving 45 cities for the campaign. Subsequently, two of these cities were merged to form one site, resulting in 44 campaign cities. The comparison cities were selected excluding the 15 largest cities so as not to remove a large segment of the campaign’s target audience. The 15 campaign cities included in the evaluation were then selected from the remaining 30 markets; these 15 were chosen to match the comparison cities in terms of region of the country and city size.


Address-based Sampling and Field Data Collection

Within each city, we select an equal number of census block groups; these block groups serve as the areas from which our address sample is selected. To obtain the 2,100 completed interviews for the pre-test data collection, we started with 118,520 sampled addresses from RTI’s Enhanced Address-based Sample (ABS) Frame. This frame starts with addresses in the U.S. Postal Service (USPS) Computerized Delivery Sequence File and adds complementary data from commercial sources to better characterize household members’ demographics and lifestyles. We use this frame to identify households likely to have eligible youth. Nationally, 6.3% of households have at least one eligible youth. In addition, based on formative research by those developing the Fresh Empire Campaign, we expect that about 20% of multicultural youth ages 12 to 17 will affiliate with a Hip Hop peer crowd. As a result, approximately 1.3% of households would have an eligible youth (6.3% × 20% ≈ 1.3%). Given the challenge of reaching such a small population, we rely on available indicators in RTI’s Enhanced ABS Frame to oversample households with a greater likelihood of having eligible youth. Our goal is to make comparisons between the campaign and comparison cities, not to claim that these results are fully representative of the target audience.
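
To illustrate how such oversampling might work in practice, the sketch below stratifies a hypothetical address frame by a modeled likelihood-of-eligibility flag and samples the high-likelihood stratum at a higher rate. This is a minimal sketch only: the frame fields, the flag, and the sampling fractions are illustrative assumptions, not the actual RTI frame variables or rates.

import random

# Hypothetical frame records: each address carries a modeled flag indicating
# whether the household is likely to contain an eligible youth. The flag name
# and the sampling fractions below are illustrative assumptions.
frame = [{"address_id": i, "likely_eligible": random.random() < 0.10}
         for i in range(500_000)]

def stratified_oversample(frame, frac_high=0.60, frac_low=0.05, seed=2015):
    """Sample high-likelihood addresses at a higher rate than the rest."""
    rng = random.Random(seed)
    high = [a for a in frame if a["likely_eligible"]]
    low = [a for a in frame if not a["likely_eligible"]]
    sample = (rng.sample(high, int(len(high) * frac_high))
              + rng.sample(low, int(len(low) * frac_low)))
    # Keep base weights (inverse selection probabilities) so analyses can
    # account for the unequal sampling rates if needed.
    for a in sample:
        a["base_weight"] = 1 / (frac_high if a["likely_eligible"] else frac_low)
    return sample

selected = stratified_oversample(frame)

Retaining base weights is consistent with the stated goal of comparing campaign and comparison cities rather than producing fully representative national estimates.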


Based on prior experience using this frame to identify youth, we expect that 35.4% of selected households will have at least one youth in the eligible age range and of an eligible race/ethnicity. We also assume that 95% of the mailings sent to these addresses will be successfully delivered. In addition, based on recent RTI studies and published studies, we expect that 35% of households with eligible youth will return the screener, which will be used to identify youth who affiliate with a Hip Hop peer crowd (an expected 20% of eligible youth). Of these, we expect that 75% will complete the outcome survey. Exhibit 5 illustrates the sample selection estimates.


Exhibit 5. Addresses and the Associated Assumptions to Yield the Needed Number of Completes for Field Data Collection for the Pre-Test Survey


Activity                                                    Pre-test
Selected addresses                                          118,520
Mail delivered                                              112,594 (95%)
Age and race eligible housing unit                          39,894 (35.4%)
Age and race eligible youth who returns screener            13,963 (35%)
Eligible youth who affiliate with a Hip Hop peer crowd      2,793 (20%)
Completed interviews                                        2,100 (75%)


The mail-based screener will be used again during the fourth post-test survey. Based on our experiences with the pre-test survey and improved techniques for modeling demographics, we have updated some of the assumptions for the fourth post-test survey in Exhibit 6. We expect that 7.4% of households will return the screener and that 19% of returned screeners will be from eligible youth who affiliate with a Hip Hop peer crowd. Of these, we expect that 80% will complete the outcome survey. We estimate that approximately 1,500 cross-sectional youth recruited via this method will complete the fourth post-test survey.


Exhibit 6. Addresses and the Associated Assumptions to Yield the Needed Number of Completes for Field Data Collection (Fourth Post-Test)


Activity                                                    Fourth Post-Test
Selected addresses                                          133,357
Returned screeners                                          9,869 (7.4%)
Eligible youth who affiliate with a Hip Hop peer crowd      1,875 (19%)
Completed interviews                                        1,500 (80%)
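
The same arithmetic applies to the fourth post-test assumptions in Exhibit 6; again, a minimal check using only the rates stated above.

# Chain the Exhibit 6 assumptions for the fourth post-test mail screener.
selected_addresses = 133_357
returned_screeners = selected_addresses * 0.074   # 7.4% screener return rate
hip_hop_affiliated = returned_screeners * 0.19    # 19% Hip Hop peer crowd
completed_interviews = hip_hop_affiliated * 0.80  # 80% complete the survey
print(round(returned_screeners), round(hip_hop_affiliated), round(completed_interviews))
# Approximately 9,868 returned screeners, 1,875 eligible youth, and 1,500 completes.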




Power Analysis

Statistical power estimates provide guidance on reasonable expectations for observing statistically significant change in outcomes of interest. This process requires an understanding of the study design, the planned analyses (i.e., the statistical model), expectations about the minimum detectable effect (MDE), and characteristics of the population and measures involved.


For the purpose of estimating statistical power for the Fresh Empire Campaign, we assume data collection will reflect a cross-sectional design among 30 cities, with 15 cities receiving Fresh Empire Campaign messages and 15 cities serving as a comparison group with minimal exposure to national television advertisements. The proposed impact analysis accounts for the repeated cross-sectional data collection using a generalized linear hierarchical regression model that assesses change in the proportion of youth who agree with a belief statement related to smoking tobacco (e.g., perceived approval, perceived prevalence, and perceived popularity). The test statistic will involve a two-tailed hypothesis test with a Type I error rate of 0.05 and a Type II error rate of 0.20, yielding 80% statistical power. Our parameter estimates include an intraclass correlation coefficient (ICC) of 0.01 to account for the geographic clustering of respondents and a variance inflation factor of 1.25 to account for potential imbalance across conditions. To some extent, these factors are offset by parameters that serve to reduce variation: an over-time correlation correction of 0.55 at the cluster level, which accounts for repeated measures in the same designated market areas (DMAs), and a 0.20 variance reduction at the individual level for the inclusion of demographic and socio-economic covariates. These parameter estimates are available in the published literature and are supported by our experience conducting similar studies (Murray & Short, 1997; Murray & Blitstein, 2003; Janega, Murray, et al., 2004; Farrelly, Davis, et al., 2005).
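
For illustration only, one simplified way to operationalize a comparison of this kind is a population-averaged (GEE) logistic model with city-level clustering and a condition-by-period interaction. This is a marginal alternative to the hierarchical model described above, shown as a sketch; the data file and variable names (agree, campaign, post, city, and the covariates) are hypothetical, not the study's actual codebook.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per respondent, with a binary belief
# outcome, a campaign/comparison indicator, a pre/post indicator, the city
# (cluster) identifier, and demographic covariates.
df = pd.read_csv("fresh_empire_analysis_file.csv")

# Population-averaged logistic model with exchangeable within-city correlation.
# The campaign:post interaction captures the pre-post change in campaign
# cities relative to comparison cities (a difference-in-differences contrast).
model = smf.gee(
    "agree ~ campaign * post + age + C(race_ethnicity) + C(sex)",
    groups="city",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())

In the hierarchical formulation described above, the city-level clustering would instead be modeled with random effects; the marginal specification is shown here only because it is compact.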


The campaign evaluation’s goal is to be able to identify a change of 10 percentage points or greater as statistically significant. There are few data available in the peer-reviewed literature on the level of agreement we can anticipate at baseline (i.e., the pre-test survey). Accordingly, we rely on the conservative assumption that 50% of youth will agree with belief items that align with campaign messages at baseline.


Given the parameters and assumptions detailed above, the impact evaluation of the Fresh Empire Campaign required data from 70 self-identified Hip Hop multicultural youth in each of the 30 media markets (a target of N = 2,100; 2,194 interviews were completed) at the pre-test. In addition, we expect to accumulate 2,100 completed interviews approximately every 6 months during the post-test period, for a total of 10,500 interviews. Data collection will end approximately 48 months following the campaign launch. The primary comparison for statistical power calculations is between the pre-test period and the final 2,100 completed interviews. However, the interim data collections will permit multiple comparisons between the pre- and post-test periods to accommodate evolving campaign strategies and to monitor campaign exposure and receptivity. The pre-post sample size of 2,100 is predicated on the assumption that agreement with campaign messages is 50% at baseline and increases to 60% at the end of data collection. This change corresponds to an odds ratio of approximately 1.50, meaning that youth exposed to the campaign would have roughly 1.5 times the odds of agreeing with campaign-targeted beliefs and perceptions compared with youth not exposed to the campaign. If actual agreement at baseline is either higher or lower than 50%, statistical power improves and smaller program impacts can be detected with the same sample of respondents.
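
The core of this calculation can be approximated with the standard two-proportion sample-size formula inflated by a clustering design effect. This is a minimal sketch using only the baseline and endpoint proportions, the Type I/II error rates, the ICC of 0.01, and the 70 respondents per market stated above; it omits the variance inflation factor, over-time correlations, and covariate adjustment in the full hierarchical model, so it will not reproduce the exact study figures.

from statistics import NormalDist

def n_per_arm_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Unadjusted sample size per arm for a two-sided test of two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2

# Detect a change in agreement from 50% at pre-test to 60% at post-test.
n_unadjusted = n_per_arm_two_proportions(0.50, 0.60)   # about 385 per arm

# Inflate for geographic clustering: design effect = 1 + (m - 1) * ICC,
# with m = 70 respondents per media market and ICC = 0.01 as stated above.
design_effect = 1 + (70 - 1) * 0.01                     # 1.69
n_clustered = n_unadjusted * design_effect              # about 650 per arm

# Implied odds ratio for the 50% -> 60% shift.
odds_ratio = (0.60 / 0.40) / (0.50 / 0.50)              # 1.50

print(round(n_unadjusted), round(n_clustered), round(odds_ratio, 2))

With these inputs, the unadjusted requirement is roughly 385 respondents per arm, rising to roughly 650 per arm once the clustering design effect is applied; the larger wave targets described above reflect the additional variance adjustments that this simplified check omits.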


Data Collection Via Social Media

To supplement the ABS strategy above, we will recruit additional multicultural youth ages 13 to 17 through social media platforms, such as Facebook and Instagram. We will post advertisements to social media and invite youth to complete the brief online screener to determine their eligibility. For this brief screener, we will require only youth assent. We will ask eligible youth to provide their parent or guardian’s contact information (name and telephone number) so that we can obtain parental permission prior to inviting youth to complete the pre- or post-test survey. However, for 15- to 17-year-old youth recruited via social media for the post-test surveys, we will not require parental permission. For youth 13 to 14 years old, we will continue to require parental permission; for those youth, an interviewer will call the parent to obtain permission. A link to the survey will then be sent to the email address or mobile number indicated by the youth on the screener. If a parent has a question about the study, we will email or mail them the Question & Answer Fact Sheet (Online) (Attachment 14b). All youth, regardless of recruitment method, will be advised of the privacy of their data and asked to provide their assent to participate before encountering the first survey question. All data will be disassociated from names, addresses, and other identifying information to ensure respondent privacy to the fullest extent of the law, and all data will be stored on secure RTI servers.


  2. Procedures for the Collection of Information

2.1 ABS Field Data Collection

This section describes the procedures for field data collection. Data collection began in late 2015, prior to the launch of the campaign, and will end 48 months post campaign launch. This data collection will allow timely feedback on the target audience’s awareness of and receptivity to campaign activities. Eligible youth will be screened by mail and interviewed in person. This approach provides a number of methodological advantages, including efficiency in identifying this hard-to-reach population, increased accuracy in measurement of key variables of interest, and reduced burden on study participants.


Sample Selection

An address-based sample will be drawn with the goal of oversampling households with youth ages 12 to 17 who are African American, Hispanic, or Asian/Pacific Islander. Sampled dwelling units (SDUs) will receive a lead letter and a brief mail screener along with a $2 bill as an incentive. The lead letter will describe the study and be addressed to a parent or guardian. The parent will first indicate whether there are any youth ages 12 to 17 in the household. If not, they will be asked to indicate as much and return the screener in an enclosed, postage-paid envelope. If there is at least one child ages 12 to 17, we will ask the parent or guardian to provide their name and phone number and then pass the screener to their child to complete and return. If there are multiple youth ages 12 to 17 in the household, we will ask the parent to share the screener with the child who has the next birthday. The youth will complete the brief screener and return it in an enclosed, postage-paid envelope. To encourage participation, we will send a reminder postcard (see Attachment 9) followed by a final mailing with a follow-up cover letter and another copy of the screening instrument (Attachment 10).


Screening

The pre-test wave of data collection began with a mail screening survey to identify eligible youth (Attachment 2). A mail screening survey will also be used in the fourth post-test survey. In the post-test waves, an online screener advertised on social media will be used to identify new sample members to replace those lost to follow-up from the longitudinal sample. Each mail screening survey is accompanied by a lead letter explaining the study and the process for completion (Attachment 8). To be eligible, youth must match the target audience of the Fresh Empire Campaign; that is, they must be ages 12 to 17 and must affiliate with a Hip Hop peer crowd. The sample will be predominantly African American, Hispanic, and Asian/Pacific Islander. To ensure that the youth who participate in the outcome evaluation match this target audience, eligible youth will be screened using the same method used by Rescue, the agency implementing the Fresh Empire Campaign. This is accomplished by presenting two arrays of photos, one of males and one of females, representing various peer crowds. Respondents will be asked to evaluate the arrays and to rank order the three images in each array depicting individuals who best represent their friend group and the three who least represent it. Survey participants will be categorized as members of the Hip Hop peer crowd based on this exercise.


Completed screeners will be analyzed to determine each youth’s eligibility. Eligible youth will then be contacted in person and invited to complete the outcome survey.


Recruitment

Before the interviewer’s arrival at the SDU, a Welcome to the Study letter (see Attachment 11) will be mailed to the selected addresses. This letter will briefly explain the purpose of the survey and request the cooperation of a parent or legal guardian aged 18 or older in each household. This letter will be printed on project-specific letterhead with the signature of RTI’s project Data Collection Task Leader.


Upon arrival at each SDU, the interviewer will refer an adult resident to this letter and answer any questions the person might have about the study. If the resident has no knowledge of the lead letter, the interviewer will provide another copy, explain that one was previously sent, and then answer any questions the person might have. If no one is home during the initial visit to the SDU, the interviewer will have the option to leave a card (see Attachment 12) to inform the residents that the interviewer plans to visit the household at a different time. Further visits will be made as soon as feasible after the initial visit. Interviewers will make at least four additional visits beyond the initial visit to each SDU to complete the interview.


If the interviewer is unable to contact a parent or legal guardian aged 18 or older at the SDU after repeated attempts, the field supervisor may send an unable-to-contact letter (see Attachment 13) to reiterate information provided in the lead letter and ask for participation in the study. If the interviewer is still unable to contact anyone at an SDU, the interviewer might send an additional call-me letter (see Attachment 13) to the SDU. The call-me letter will request that the residents call the field supervisor to set up an interview appointment.


When contact is made with an adult member of an SDU and introductory information about the study is communicated, the interviewer will present a Questions & Answers Fact Sheet (see Attachments 14a and 14c) for in-person interviews that provides answers to commonly asked questions. When a potential respondent refuses to cooperate in the interview, the interviewer will rely on their training and experience to accept the refusal in a positive manner. This technique will reduce the potential for creating an adversarial relationship between the residents and the interviewer that could preclude future visits. The supervisor might then request a refusal letter (see Attachment 13) be sent to the residence. The refusal letter will be tailored to the specific concerns expressed by the potential respondent and ask him or her to reconsider participating in the study. Refusal letters will also include the supervisor’s telephone number, in case the potential respondent has questions or would like to set up an appointment with the interviewer. Unless the respondent calls the supervisor or RTI’s office to refuse participation in the study, one further attempt to enlist the household’s cooperation will be made by specially selected interviewers with experience in addressing initial refusals. Specially trained interviewers will also be selected based on their proximity to the case to minimize travel costs.


All youth who participated in a previous survey wave will be re-contacted at each follow-up wave up to the age of 18. Households of youth who previously completed an in-person survey will receive a lead mailing reminding them of their initial participation and requesting further participation. The lead mailing will include a web address and a request that the selected youth participate in a web survey. During the third follow-up data collection period, we offered an additional $5 “early bird” incentive for longitudinal respondents who originally completed an in-person survey to encourage them to complete the survey online. We plan to offer this early bird incentive in the fourth and fifth post-test surveys. Youth who have not participated via the web within a few weeks of receipt of the lead mailing will be followed up in person by an interviewer.


Interview Procedures

When an adult resident of a household agrees to cooperate with the study procedures, the interviewer will begin the interview procedures with the eligible youth participant. For the youth selected to complete the survey, the interviewer will follow these steps:


    • The interviewer will obtain verbal permission from the parent or legal guardian for the selected youth before approaching the youth for participation in the study (Attachments 5 and 5a).

    • After obtaining parental permission, the interviewer will make every attempt to secure an interview setting in the respondent’s home that is as private as possible. In addition, the interview process, by design, includes techniques to afford privacy to the respondent. The self-administered portion of the interview maximizes privacy by giving control of the interview directly to the respondent, allowing the respondent to read the questions directly from the computer screen and key his or her own responses into the computer via the keyboard.

    • The interviewer will obtain verbal assent or consent from the selected youth respondent. The assent or consent form, which will appear as the first visible screen on the laptop, will be designed to communicate the goals and procedures to youth aged 12 to 17 (Attachments 5 and 5a) and youth who have turned 18 (Attachment 5b). The interviewer will also read the assent or consent language to the youth before beginning the interview, to assure them that what they report will be kept confidential and to communicate the voluntary nature of participation and their right to refuse to answer any question asked.

    • When parental permission and youth assent or consent have been obtained, the interviewer will arrange for the youth respondent to self-administer the survey. The interviewer will turn the computer over to the youth to read the survey questions and enter responses to the questions directly into the computer (Attachment 1).


The purpose of the outcome survey is to measure youth’s awareness of the campaign; attitudes toward smoking; tobacco-related behavior; intentions; self-efficacy; cessation intentions, motivation, and behavior; attitudes, beliefs, risk perceptions, and social norms; media use and awareness; and environment.


Incentives and Verification

After the interview is completed and before the verification information is collected, youth respondents will be given $25 for participation. Youth will receive an incentive receipt (Attachment 15). For verification purposes, one parent of some youth respondents will be contacted via telephone after the interview. Immediately following the interview, the interviewer will collect the parent’s contact information to be used for this verification call. The verification call will ask a few questions to confirm that the interview took place, that proper procedures were followed, and that the amount of time required to administer the interview was within the expected duration (Attachments 17 and 17a).


Data Security

All interview data will be transmitted within 48 hours via secure encrypted data transmission to RTI’s offices, where the data will be subsequently processed and prepared for analysis, reporting, and data file delivery. After transmission to RTI and confirmation of data receipt, all data will be wiped from all data collection devices used in the field.


2.2 Recruitment Via Social Media

To supplement this sample, RTI will place ads on social media platforms, such as Facebook and Instagram. Examples of these ads are included in Attachment 3a. As much as possible, these ads will be targeted toward potentially eligible respondents: youth ages 13 to 17 who live in the data collection cities and who may affiliate with a Hip Hop peer crowd. When clicked, the ads will direct the user to a web-based screener instrument (Attachment 3). Respondents who are deemed eligible following completion of the screener will then go on to obtain parental permission, provide youth assent (Attachments 5f and 5g), and complete the post-test survey (Attachment 1), which will be administered online. For cross-sectional respondents ages 15 to 17 recruited via social media for post-test survey rounds, parental permission is not required; those youth will complete youth assent (Attachment 5e) and go on to complete the survey online.


All respondents who complete this survey online will receive a virtual gift card valued at $25. This includes respondents recruited via social media and members of the longitudinal cohort who opt to take the post-test web survey instead of having a field interviewer come to their house. During the third, fourth and fifth follow-up data collection periods, longitudinal respondents who originally completed an in-person survey and complete the survey online before the specified early bird date will receive an additional $5 “early bird” incentive.


  3. Methods to Maximize Response Rates and Deal with Nonresponse


The ability to obtain the cooperation of potential respondents in the baseline (pre-test) survey and maintain their participation across all survey waves will be important to the success of this study. In preparation for launching the baseline data collection, we will review procedures for enlisting respondent cooperation across a wide range of surveys, incorporate best practices from those surveys into the data collection procedures, and adapt the procedures through continuous improvement across the survey waves.


In addition to the $25 youth incentive, the study will use procedures designed to maximize respondent participation. Data collection procedures will begin with assignment of SDUs to specific interviewers at the start of data collection. When assigning cases, supervisors will take into account which interviewers are in closest proximity to the work, interviewer skill sets, and basic information such as demographics and size of each sampled area. Supervisors will assign cases to interviewers in ways designed to maximize production.


In post-test survey rounds, respondents who have completed previous surveys will have the option to take the survey online or in-person. Offering these two options may encourage potential non-responders to participate. The additional $5 early bird incentive will also facilitate timely data collection and promote online completion of the surveys.


To further improve response rates in post-test surveys, eligible 15- to 17-year-old respondents recruited via social media will no longer require parental permission. As a result of these new consenting procedures, sensitive questions have been removed from the post-test survey questionnaire.


When interviewers transmit their data from completed household screenings and interviews, the data will be summarized in daily reports posted to a Web-based case management system accessed by field supervisors and RTI’s data collection managers. On a daily basis, supervisors will use these reports to review response rates, production levels, and record of call information. This information will allow supervisors to determine each interviewer’s progress toward weekly production goals, when interviewers should attempt further contacts with SDUs, and how to handle challenging situations such as households that initially refuse to participate or households where the interviewer has been unable to contact anyone. Supervisors will discuss information and challenges with their interviewers each week. When feasible, cases will be transferred to other interviewers with different skill sets to assist with converting initial refusals into participating households. Cases might also be transferred among interviewers to improve production in areas where the original interviewer is not meeting response rate goals.


As noted in Section B.2, interviewers will use a Sorry I Missed You Card (Attachment 12) and the Question and Answer Fact Sheets (Household) (Attachments 14a and 14c) when needed to contact respondents and encourage participation. To assist efforts to convert households that initially refuse to participate, refusal letters (Attachment 13) tailored to specific refusal reasons will be used. Similarly, an unable-to-contact letter (Attachment 13) will be sent to an SDU if the interviewer has been unable to contact an adult resident after multiple attempts. When interviewers have been unable to gain access to one or more SDUs because of an access barrier, such as a locked gate or doorperson, controlled access letters (Attachment 13) will be sent to the appropriate person or organization to obtain assistance in gaining access to these SDUs.


  4. Test of Procedures or Methods to be Undertaken


RTI will conduct rigorous internal testing of the survey instrument prior to its fielding. Evaluators will review the online test version of the instrument to verify that skip patterns are functioning properly, that delivery of campaign media materials is working properly, and that all survey questions are worded correctly and in accordance with the instrument approved by OMB.


  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The following individuals inside the agency have been consulted on the design and statistical aspects of this information collection as well as plans for data analysis:


April Brubach

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

9200 Corporate Boulevard

Rockville, MD 20850

Phone: 301-796-9214

E-mail: [email protected]


Gem Benoza

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-0088

E-mail: [email protected]


David Portnoy

Office of Science

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 301-796-9298

E-mail: [email protected]


Matthew Walker

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-3824

E-mail: [email protected]


Leah Hoffman

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-743-1777

E-mail: [email protected]


Janine Delahanty

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-9705

E-mail: [email protected]


Ollie Ganz

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-5389

E-mail: [email protected]


The following individuals outside of the agency have been consulted on questionnaire development. Additionally, input has been solicited and received from FDA on the design of this study, including participation by FDA in meetings with OMB:


Matthew Farrelly

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6852

E-mail: [email protected]


Jennifer Duke

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-485-2269

E-mail: [email protected]


Jane Allen

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-597-5115

E-mail: [email protected]


Youn Lee

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-485-5536

E-mail: [email protected]


Amy Henes

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7293

E-mail: [email protected]


Jamie Guillory

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-316-3725

E-mail: [email protected]


Patricia LeBaron

RTI International

230 W Monroe Suite 2100

Chicago, IL 60606

Phone: 312-777-5204

E-mail: [email protected]


Azucena Derecho

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7231

E-mail: [email protected]


Stephen King

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-8094

Email: [email protected]


Pamela Rao

Akira Technologies, Inc.

1747 Pennsylvania Ave NW Suite 600

Washington, DC 20002

Phone: (202) 517-7187

Email: [email protected]


Xiaoquan Zhao

Department of Communication

George Mason University

Robinson Hall A, Room 307B

4400 University Drive, 3D6

Fairfax, VA 22030

Phone: 703-993-4008

E-mail: [email protected]


Jeff Jordan

Rescue

3436 Ray Street

San Diego, CA 92104

Phone: 619-231-7555 x 150

Email: [email protected]


Mayo Djakaria

Rescue

660 Pennsylvania Avenue SE, Suite 400

Washington, DC 20003

Phone: 619-231-7555 x 120 

Email: [email protected]


Dana Wagner

Rescue

660 Pennsylvania Ave SE, Suite 400

Washington, DC 20003

Phone: 619-231-7555 x 331

Email: [email protected]
