2017 and 2019 NATIONAL YOUTH RISK BEHAVIOR SURVEY
OMB# 0920-0493 exp. 09/30/2015
SUPPORTING STATEMENT
PART B
Submitted by:
Nancy Brener, PhD, Project Officer
Division of Adolescent and School Health
National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention
Centers for Disease Control and Prevention
Department of Health and Human Services
Corporate Square Office Park, 8 Corporate Square
Atlanta, GA 30329
404-718-8133 (voice); 404-718-8010 (fax)
[email protected]
TABLE OF CONTENTS
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS
B.2 PROCEDURES FOR THE COLLECTION OF INFORMATION
B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE
B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN
B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA
The universe for the study will consist of public and private school students in grades 9, 10, 11, and 12 in the 50 states and the District of Columbia.
The sampling frame for schools has been obtained from MDR, Inc. The MDR data encompass both private and public schools and include the latest data from the Common Core of Data from the National Center for Education Statistics (NCES). MDR school-level files also include data on enrollments by grade and minority enrollments. For more than 40 years, MDR has provided information on K-12, higher education, library, early childhood, and related education organizations. The commercial sampling frame provided by MDR is nationally recognized as one of the most complete, current, and accurate education databases available in the industry.
Table B-1 displays the current distribution of schools nationally by urban status and type of school. The urban status variable has been updated to reflect the new classification based on the NCES Locales, a variable now also available in the MDR data files.
Table B-1
Distribution of Schools Nationally by Urban Status and School Type
Cell entries are frequency (percent of all schools).

| Metro Status   | Public          | Private        | Catholic    | Total            |
|----------------|-----------------|----------------|-------------|------------------|
| City-Large     | 2,212 (8.67%)   | 647 (2.54%)    | 248 (0.97%) | 3,107 (12.18%)   |
| City-Midsize   | 864 (3.39%)     | 292 (1.15%)    | 84 (0.33%)  | 1,240 (4.86%)    |
| City-Small     | 1,126 (4.42%)   | 326 (1.28%)    | 113 (0.44%) | 1,565 (6.14%)    |
| Suburb-Large   | 3,618 (14.19%)  | 1,070 (4.20%)  | 226 (0.89%) | 4,914 (19.27%)   |
| Suburb-Midsize | 489 (1.92%)     | 137 (0.54%)    | 17 (0.07%)  | 643 (2.52%)      |
| Suburb-Small   | 339 (1.33%)     | 89 (0.35%)     | 13 (0.05%)  | 441 (1.73%)      |
| Town-Fringe    | 401 (1.57%)     | 34 (0.13%)     | 6 (0.02%)   | 441 (1.73%)      |
| Town-Distant   | 1,590 (6.23%)   | 229 (0.90%)    | 59 (0.23%)  | 1,878 (7.36%)    |
| Town-Remote    | 1,113 (4.36%)   | 120 (0.47%)    | 38 (0.15%)  | 1,271 (4.98%)    |
| Rural-Fringe   | 3,055 (11.98%)  | 742 (2.91%)    | 51 (0.20%)  | 3,848 (15.09%)   |
| Rural-Distant  | 3,242 (12.71%)  | 258 (1.01%)    | 4 (0.02%)   | 3,504 (13.74%)   |
| Rural-Remote   | 2,579 (10.11%)  | 70 (0.27%)     | 1 (0.00%)   | 2,650 (10.39%)   |
| Total          | 20,628 (80.89%) | 4,014 (15.74%) | 860 (3.37%) | 25,502 (100.00%) |
Sampling or other respondent selection method used: Students will be selected using the procedures described in detail below. Briefly, for each YRBS cycle, a nationally representative sample of students will be selected using a three-stage stratified cluster design. Primary Sampling Units (counties) and Secondary Sampling Units (schools) within selected counties will be selected with probability proportional to size (PPS) methods. Schools will be sampled from a frame that includes all US public, private, and Catholic schools. In each selected school, one class will be selected in each grade to participate, except in high-minority schools, where two classes per grade will be selected. All students in selected classes are eligible to participate.
Expected response rates for the data collection: The average participation rates over the 13 prior cycles of YRBS are 77% for schools and 86% for students. We assume these historical average participation rates in preparing the sample design for the 2017 and 2019 YRBS.
Actual response rates achieved during the last collection period: During the most recent cycle of the YRBS, conducted in 2015, the participation rates were 69% for schools and 86% for students.
Statistical justification for all sample sizes: The expected student sample size is approximately 22,673 students before nonresponse and is necessary to meet study precision requirements. The sample size is calculated by inflating the sample size that would be required under the assumptions of simple random sampling by historical design effects (to account for the complex sampling design) and participation rates to account for nonresponse at both the student and school levels.
Statistical Methodology for Stratification and Sample Selection
For each YRBS cycle, a probability sample will be selected that will support national estimates among students in grades 9-12 overall, and by age or grade, sex, and race/ethnicity (white, black, Hispanic). The design also will support sex-specific estimates by grade and race/ethnicity and racial/ethnic-specific estimates by grade. A detailed description of the sampling design may be found in Appendix M.
Sampling Frame. The sampling frame will stratify the 50 states and the District of Columbia by urbanicity and minority composition. The frame is structured into geographically defined primary sampling units (PSUs), each a county or group of contiguous counties (except when they are unaffiliated cities). Stratification by minority composition will divide the PSUs into eight groups based on the percentages of black and Hispanic students in the PSU. This is accomplished in two steps. First, each PSU is assigned to either the Hispanic group or the black group according to whether Hispanic or black students make up the larger percentage of enrollment in the PSU. Each group is then subdivided into four strata according to the percentage of black or Hispanic enrolled students, as appropriate, in the PSU. The eight resulting racial/ethnic strata will each be further divided by urban status, defined as being inside one of the 54 largest Metropolitan Statistical Areas (MSAs) versus not. In addition, the first-stage PSU sample will be implicitly stratified by geography using 5-digit zip code areas.
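The two-step stratum assignment described above can be sketched as a small function. The density cut points and stratum labels below are invented for illustration; the actual YRBS stratum boundaries are specified in Appendix M.

```python
def first_stage_stratum(pct_black, pct_hispanic, in_large_msa):
    """Assign a PSU to one of 16 first-stage strata:
    8 minority-composition groups crossed with MSA status."""
    # Step 1: Hispanic group or black group, by the larger enrollment share.
    group = "Black" if pct_black >= pct_hispanic else "Hispanic"
    pct = max(pct_black, pct_hispanic)
    # Step 2: subdivide the group into four density strata.
    # Cut points here are illustrative only, not the YRBS values.
    bounds = [22, 44, 66]
    level = sum(pct >= b for b in bounds) + 1   # 1..4
    urban = "MSA" if in_large_msa else "non-MSA"
    return f"{group}-{level}-{urban}"

print(first_stage_stratum(30, 10, True))    # Black-2-MSA
print(first_stage_stratum(10, 50, False))   # Hispanic-3-non-MSA
```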
Selection of PSUs. Fifty-four PSUs will be selected with probability proportional to the student enrollment in the PSU within strata. The allocation of PSUs to the first-stage strata will be approximately in proportion to the total enrollment in the PSU. A proportional allocation tends to maximize the precision of overall survey estimates.
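PPS selection of this kind is commonly implemented as systematic sampling from an ordered list, which also yields implicit stratification. The sketch below is a generic illustration of the technique, not the contractor's actual selection program; the PSU names and enrollment counts are invented.

```python
import random

def systematic_pps(units, n):
    """Select n units with probability proportional to size,
    using systematic sampling over cumulative size totals.
    `units` is a list of (name, size) pairs."""
    total = sum(size for _, size in units)
    interval = total / n
    start = random.uniform(0, interval)
    points = iter(start + k * interval for k in range(n))
    selected, cum = [], 0.0
    point = next(points)
    for name, size in units:
        cum += size
        # A unit is hit each time a selection point falls in its range.
        while point is not None and point <= cum:
            selected.append(name)
            point = next(points, None)
    return selected

# Hypothetical PSUs with enrollment counts (illustrative only).
psus = [("PSU-A", 12000), ("PSU-B", 3000), ("PSU-C", 45000), ("PSU-D", 9000)]
print(systematic_pps(psus, 2))
```

Note that very large units can be hit by more than one selection point; designs handle this by taking such units with certainty, a refinement omitted here.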
Selection of Secondary Sampling Units (SSUs). Each SSU consists of either a single school (if the school includes each of grades 9-12) or multiple schools "linked" together. An SSU comprises multiple linked schools when the physical schools do not individually include all of grades 9-12; linking forms school-based SSUs that provide coverage of all four grades in each unit. SSUs will be classified by size as either large or small, depending on whether they have 28 or more students per grade. In each selected PSU, at least three large SSUs will be selected with probability proportional to an aggregate enrollment measure, resulting in 162 selected SSUs (54 PSUs * 3 SSUs). In addition, in a subsample of 15 PSUs, one small SSU will be randomly selected to represent the approximately 5.7% of students nationwide who attend small schools. A total of 177 SSUs will be selected (162 large and 15 small). These 177 SSUs will include approximately 200 physical schools.
Selection of Classes. Classes in each school are randomly selected subject to two sampling requirements that ensure a nationally representative sample. First, classes must be selected in such a way that every student in the school has a chance to participate. Second, the classes must be mutually exclusive so that no student can be selected more than once. In each school, once we have determined the type of class or time period from which classes will be selected, we randomly select the appropriate number of classes within each grade. To maintain acceptable school participation rates, it is essential that each school have input into the decision of which classes will be sampled. Examples of class sampling frames used in past surveys include all 2nd-period classes or a required physical education class. As long as the sampling requirements are met, we work with each school to identify the classroom sampling frame that works best for that school. One class will be selected in each eligible grade in all schools except those with the highest percentages of black and Hispanic students; in those schools, two classes per grade will be selected.
Selection of Students. All students in a selected classroom are eligible for the study. Based on historical averages, each selected class in a large SSU will include at least 28 students, and each small SSU will supply a total of 63 students. Of the 162 large schools, approximately 20% (n=32) will be "high minority," and 2 classes will be selected from each grade. In the remaining 130 selected large schools, 1 class per grade will be selected. Therefore, we expect to select approximately 21,728 students from large SSUs [(32 SSUs * 8 classes * 28 students = 7,168 students) + (130 SSUs * 4 classes * 28 students = 14,560 students) = 21,728 students] and approximately 945 students (15 SSUs * 63 students) from small SSUs.
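The expected student yield quoted above can be verified arithmetically:

```python
# Verifying the expected number of selected students from the figures above.
high_minority_ssus = 32     # 2 classes per grade x 4 grades = 8 classes each
other_large_ssus = 130      # 1 class per grade x 4 grades = 4 classes each
students_per_class = 28
small_ssus = 15
students_per_small_ssu = 63

large_ssu_students = (high_minority_ssus * 8 + other_large_ssus * 4) * students_per_class
small_ssu_students = small_ssus * students_per_small_ssu
total_selected = large_ssu_students + small_ssu_students
print(large_ssu_students, small_ssu_students, total_selected)  # 21728 945 22673
```

The total of 22,673 matches the expected selected-student count cited elsewhere in this section.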
Refusals. School districts, schools, and students who refuse to participate in the study, and students whose parents refuse to give permission, will not be replaced in the sample. We will record the characteristics of schools that refuse for analysis of potential study biases. Accounting for school and student nonresponse, we expect approximately 15,194 participating students.
Estimation and Justification of Sample Size
The YRBS is designed to produce estimates with error margins of ±5 percent:
95 percent confidence for domains defined by grade, sex, or race/ethnicity;
95 percent confidence for domains defined by crossing grade by sex, and race/ethnicity by sex; and
90 percent confidence for domains formed by crossing grade with race/ethnicity.
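These precision targets drive the per-domain sample sizes: the sample size required under simple random sampling for a given margin of error is inflated by a design effect to account for clustering. The design-effect value of 2 used below is purely illustrative, not a published YRBS parameter.

```python
import math

def required_n(z, p, e, deff):
    """Sample size for estimating a proportion p within +/- e
    at confidence level z, inflated by design effect deff."""
    n_srs = z**2 * p * (1 - p) / e**2   # simple random sampling
    return math.ceil(n_srs * deff)

# Worst-case proportion p = 0.5, +/-5 points at 95% confidence.
print(required_n(1.96, 0.5, 0.05, 1))   # 385 under SRS
print(required_n(1.96, 0.5, 0.05, 2))   # 769 with an illustrative deff of 2
```

Requirements of this form, applied to each reporting domain and combined with expected school and student participation rates, yield the overall selected sample size.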
The 2017 and 2019 YRBS sample size and design will be consistent with the sample size and design used in the 2015 YRBS cycle because these design parameters met the levels of precision required for CDC's purposes. Minor design refinements are made to account for the changing demographics of the in-school population of students. Specifically, the design is adjusted to account for the increasing percentage of Hispanic students.
For the 2013 cycle, we performed an extensive simulation study that assessed whether it would be possible to achieve target sample sizes for minority groups without the oversampling of black and Hispanic students that has been done in previous YRBS cycles. In previous cycles, oversampling was accomplished in three ways:
Disproportional allocation of PSUs to strata based on the density of minority enrollment
Weighting coefficients in the measure of size used in sampling PSUs with probabilities proportional to size (PPS) so that high-minority PSUs and schools had an increased probability of selection
In schools with a high percentage of black or Hispanic students, selecting two classes in each grade rather than one.
Specifically, the simulation study investigated whether required subgroup sample sizes could be achieved with a proportional allocation to strata and an unweighted measure of size (student enrollment in the eligible grades). Both of these simplifications would lead to a design that is more efficient statistically and would lead to more precise survey estimates overall. The simulation results indicated that the required sample sizes would be achieved with the modified, more efficient sampling design. Therefore, we began implementing this change starting in the 2013 cycle and carried it through to the 2015 cycle, as well. This approach, however, led to a shortfall of participating black students. To alleviate this, we will use disproportional allocation in 2017 and 2019 for black students only (see Appendix M).
Another change instituted starting in the 2015 cycle of the YRBS is that of a minimum enrollment size for school eligibility; specifically, only schools with at least 25 students will be included in the frame. In the 2017 and 2019 cycles of the YRBS, this minimum threshold will be raised to 40 students based on analyses showing that the cost of recruiting and collecting data from very small schools outweighed the benefit of adding a relatively small number of students who attend these schools (see Appendix M).
The proposed samples for the 2017 and 2019 YRBS will each consist of 54 PSUs. At each grade level, at least three different schools will contribute classes of approximately 28 students each and at least three schools will be selected within each PSU. The actual number of physical schools will be more than 3 times 54, however, for two reasons. Schools that only span part of the grades of interest are combined during sampling to form sampling units, such that 1 sampled school could be composed of 2 physical schools. One small school will be selected in 15 subsample PSUs separately from large schools. As a result, approximately 200 schools will be selected into the sample. We will select one class per school per eligible grade, except for schools with the largest concentrations of minority students where we will select two classes per grade. We expect that the final sample will include approximately 22,673 selected students which will yield approximately 15,194 participating students.
School and Student Non-response. The average participation rates over the 13 prior cycles of YRBS are 77% for schools and 86% for students. We assume these historical average participation rates in preparing the sample design for the 2017 and 2019 YRBS.
Estimation and Statistical Testing Procedures
Sample data will be weighted by the reciprocal of the probability of case selection and adjusted for non-response. The resulting weights will be trimmed to reduce mean-squared error. Next, the strata weights will be adjusted to reflect true relative enrollments. Finally, the data will be post-stratified to match national distributions of high school students by race/ethnicity and grade.
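The weighting sequence described above can be illustrated on a toy dataset. The selection probabilities, response statuses, and enrollment control totals below are invented, and the real procedure also includes weight trimming and race/ethnicity post-stratification, which are omitted here for brevity.

```python
# Illustrative weighting pipeline for a handful of hypothetical respondents.
records = [
    {"grade": 9,  "p_select": 0.004, "responded": True},
    {"grade": 9,  "p_select": 0.004, "responded": False},
    {"grade": 10, "p_select": 0.002, "responded": True},
    {"grade": 10, "p_select": 0.002, "responded": True},
]

# 1. Base weight: reciprocal of the probability of selection.
for r in records:
    r["w"] = 1.0 / r["p_select"]

# 2. Nonresponse adjustment within grade: respondents absorb the
#    weight of nonrespondents in the same adjustment cell.
for g in {r["grade"] for r in records}:
    cell = [r for r in records if r["grade"] == g]
    resp = [r for r in cell if r["responded"]]
    factor = sum(r["w"] for r in cell) / sum(r["w"] for r in resp)
    for r in resp:
        r["w"] *= factor

# 3. Post-stratify respondent weights to (invented) enrollment totals by grade.
controls = {9: 520, 10: 1040}
for g, total in controls.items():
    resp = [r for r in records if r["grade"] == g and r["responded"]]
    factor = total / sum(r["w"] for r in resp)
    for r in resp:
        r["w"] *= factor

print([round(r["w"], 1) for r in records if r["responded"]])
```

After post-stratification, the respondent weights in each grade sum exactly to that grade's control total.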
Variances will be computed using linearization methods. YRBS data are also used for trend analyses where data for successive cycles are compared with statistical testing techniques. Statistical testing methods are used also to compare subgroup prevalence rates (e.g., male versus female students) for each cross-sectional survey.
Confidence intervals vary from estimate to estimate depending on whether the estimate is for the full population or for a subset such as a particular grade or sex. Within a grouping, confidence intervals also vary depending on the level of the estimate and the design effect associated with the measure. Based on prior YRBS cycles with similar designs and sample sizes, we can expect the following:
Estimates among students overall or by grade or age, sex, and race/ethnicity (white, black, Hispanic) will be accurate at ±5 percent at 95 percent confidence.
For racial/ethnic estimates by grade (e.g., 11th grade Hispanics), about 70% will be accurate to within ±5 percent at 90 percent confidence.
The experience in using these data is that the levels of sampling errors involved are appropriate given the uses of the data for descriptive reporting and trend analysis.
Use of Less Frequent Than Annual Data Collection
As stated in A.6, the YRBS originally was planned and twice approved by OMB as an annual survey. Based on experience, it was determined in 1992 that it would be sufficient to address the programmatic needs of CDC and other Federal agencies to conduct the YRBS biennially. By shifting from an annual to biennial survey starting with the 1993 YRBS, burden has been reduced by half.
Survey Instrument
The YRBS questionnaire (Appendix J) contains 92 items, which can be roughly divided into seven categories. The first category includes four demographic questions. The remaining questions address health risk behaviors in six categories: unintentional injuries and violence; tobacco use; alcohol and other drug use; sexual behaviors that contribute to HIV infection, other sexually transmitted diseases, and unintended pregnancies; unhealthy dietary behaviors; and physical inactivity. Obesity (assessed by self-reported height and weight) and asthma also are assessed. The questions are all in a multiple-choice format and will be administered as a 12-page optically scannable questionnaire booklet.
Data Collection Procedures
Data will be collected by a small staff of professional data collectors, specially trained to conduct the YRBS. The time during the school day in which the survey is administered varies by school. This decision is made in coordination with each school to ensure that the type of class or period of the day selected for sampling 1) meets the scientific sampling parameters to ensure a nationally representative sample and 2) results in the least burden/highest possible acceptability for the school. The data collector will have direct responsibility for administering the survey to students. Data collectors will follow a survey administrator script (Appendix G).
Teachers will be asked to remain at the front or back of the classroom and not to walk around the room monitoring the aisles during survey administration because doing so could affect honest responses and compromise anonymity. Teachers also will be asked to identify students allowed to participate in the survey and to make sure non-participating students have appropriate alternative activities. The rationale for this is to increase the candor and comfort level of students. The only direct responsibility of teachers in data collection is to distribute and follow up on parental permission forms sent out prior to the scheduled date for data collection in the school. Teachers are provided with a parental permission form distribution script (Appendix H5) to follow when distributing permission forms to students.
The Data Collection Checklist (Appendix F) is completed by teachers to track which students have received parental permission to participate in the data collection. The teachers receive instructions on completing the Data Collection Checklist in the “Letter to Teachers in Participating Schools” (Appendix N1). The data collector will utilize the information on the Data Collection Checklist to identify students eligible for a make-up survey administration; this information will be recorded by the data collector on the “Make-up List and Instructions” document (Appendix N2).
In general, our data collection procedures have been designed to ensure that:
Protocol is followed in obtaining access to schools.
Everyday school activity schedules are disrupted minimally.
Administrative burden placed on teachers is minimal.
Parents give informed permission to participate in the survey.
Anonymity of student participation is maintained, with no punitive actions against nonparticipants.
Alternative activities are provided for nonparticipants.
Control over the quality of data is maintained.
Obtaining Access to and Support from Schools
All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Stephanie Zaza, MD, Director, Division of Adolescent and School Health, Centers for Disease Control and Prevention. The procedures for gaining access to schools will have three major steps:
Notify state education agencies (SEAs) in states with sampled schools and invite states to participate. Obtain written approval for participation at the SEA level. Verify existence and grade range of selected schools. Obtain names of school districts in which schools are located, school district addresses, names of district superintendents, names of supportive school district contacts, and general guidance on working with the selected school districts and schools in the state. Request that the state notify school districts that they may anticipate being contacted about the survey.
Once cleared at the state level, invite school districts in which selected schools are located to participate in the study. For Catholic schools and other private schools, invite the office comparable to the school district office (e.g., diocesan office of education). Obtain written approval for participation at the district level. Verify existence of school, grade range, and other information provided by the state. Request that the school district notify schools that they may anticipate being contacted about the survey. Request general guidance on working with the selected schools.
Once cleared at the school district level, invite selected schools to participate. Verify information previously obtained about the school. Present the burden and benefits of participation in the survey. After a school agrees to participate, develop a tailor-made plan for collection of data in the school (e.g., select classes; determine whether survey will be administered to selected class sections simultaneously or in serial). Obtain written approval for participation at the school level. Ensure that all materials reach the school well in advance of when they are needed. Maintain contact with schools until all data collection activities have been completed.
Prior experience suggests the process of working with each state education agency, school district, and school will have unique features. Discussions with each education agency will recognize the organizational constraints and prevailing practices of the agency. Scripts for use in guiding these discussions may be found in Appendices K1 (state-level), K2 (district-level), and K3 (school-level). Appendix O contains copies of letters of invitation to states (Appendix O1), school districts (Appendix O2), and school administrators (Appendix O3). Appendix O also contains the YRBS Fact Sheet for Schools (O3a). A copy of the letter to be sent to schools once they have agreed to participate is found in Appendix O3b.
Informed Consent
The parental permission form and fact sheet (Appendices H1 and H2) informs both the student and the parent about an important activity in which the student has the opportunity to participate. By providing adequate information about the activity, it ensures that permission will be informed. The permission form indicates that a copy of the questionnaire will be available for review by parents at their child’s school. The parental permission forms will be made available in both English and Spanish.
A waiver of written student assent was obtained for the participation of children because this research presents no more than minimal risk to subjects, parental permission is required for participation, the waiver will not adversely affect the rights and welfare of the students because they are free to decline to take part, and it is thought that some students may perceive their responses are not anonymous if they are required to provide stated assent and sign a consent/assent document. Students are told “Participating in this survey is voluntary and your grade in this class will not be affected, whether or not you answer the questions.” Completion of the survey implies student assent.
Quality Control
Table B-2 lists the major means of quality control. As shown, the task of collecting quality data begins with a clear and explicit study protocol and ends with procedures for the coding, entry, and verification of collected data. In between these activities, and subsequent to data collector training, measures must be taken to reinforce training, to assist field staff who run into trouble, and to check on data collection techniques. Because the ultimate aim is production of a high quality database and reports, various quality assurance activities will be applied during the data collection phase.
Table B-2
Major Means of Quality Control
| Survey Step | Quality Control Procedures |
|---|---|
| Mail Out | Check inner vs. outer label for correspondence (5% sample). Verify that any errors in packaging were not systematic (100%). |
| Previsit Logistics Verification | Review data collection procedures with school personnel in each school to ensure that all preparatory activities were performed properly (100%). |
| Receipt Control | Verify that a sample of forms received the prior day were logged in and are stored in the proper location (5%). Require entry of staff ID in receipt control and all other transactions (100%). |
| Telephone Contacts | Monitor an early sample of scheduling and follow-up telephone calls to ensure that the caller follows procedures, elicits proper information, and has proper demeanor (10%). |
| Manual Editing | Verify initial editing by all editors until standards are achieved (100%). Spot check editing by editor (5%). |
| Computer Scanning | Key enter questionnaires that are not scannable (100%). Remove any scannable forms that reflect intentional misuse by respondent (100%). |
Expected Response Rates
While we aim for an 80% school participation rate, the historical average for this study over the last 13 cycles of YRBS has ranged between 69% and 81%, with an average of 77%. We have conservatively assumed the average school participation rate for the purposes of sample design. The addition of a $500 token of appreciation for each school (as suggested by OMB in 1999) has helped maintain and perhaps slightly increase school participation rates. Even before the addition of this procedure, the YRBS set the standard for response rates among federally funded national, school-based, health related surveys of high school students. For example, the widely cited Monitoring the Future survey (formerly known as the High School Senior Survey), achieves substantially lower participation rates than the YRBS, even though Monitoring the Future contains less sensitive questions. The participation rates established by the YRBS are the product of the application of proven and tested procedures for maximizing school and student participation.
As indicated in A.16, it is highly desirable to complete data collection before the final two months of school. Schools are very busy then with testing, and attendance can be unstable, especially among twelfth-grade students.
Methods for Maximizing Responses and Handling Non-Response
We distinguish among six potential types of nonresponse problems: refusal to participate by a selected school district, school, teacher, parent, or student; and collection of incomplete information from a student.
To minimize refusals at all levels--from school district to student--we will use a variety of techniques, emphasizing the importance of the survey. Given the high visibility and subject matter of the survey, we expect that some school districts or schools will need to place the issue of survey participation before the school board. To increase the likelihood of an affirmative decision, we will: (1) work through the SEA to communicate its support of the survey to school districts and schools; (2) indicate that the survey is being sponsored by CDC and has the support of Federal and state agencies; (3) convey to school districts and schools that the survey has the endorsement of many key national educational and health associations, such as the American Academy of Pediatrics, American Association of School Administrators, Association of State and Territorial Health Officials, National Association of Secondary School Principals, National Catholic Educational Association, National PTA, National Association of State Boards of Education, Council of Chief State School Officers, the National Education Association, and the National School Boards Association; (4) maintain a toll-free hotline to answer questions from school district and school officials, teachers, parents, and students throughout the process of recruiting schools and obtaining parental permission for the student’s participation; (5) comply with all requirements from school districts in preparing written proposals for survey clearance; (6) convey a willingness to appear in person, if needed, to present the survey before a school board, research committee, or other local entity tasked with reviewing the survey; (7) offer schools a monetary token of appreciation of $500, consistent with recommendations OMB previously made, and implemented in the national YRBS since 2001.
The sampling plan does not allow for the replacement of schools that refuse to participate due to concern that replacing schools would introduce bias. All participating SEAs, school districts, and schools also will be notified when the survey results are available for download from CDC’s website.
Maximizing responses and dealing with refusals from parents, teachers, and students require different strategies. Parental permission form reminders (Appendices H3 and H4) will be sent to parents who have not returned parental permission forms within an agreed upon time period (e.g., 3 days); those who do not respond to the reminder will be sent a second and final reminder. The permission form will provide a telephone number at CDC that parents may call to have questions answered before agreeing to give permission for their child's participation. Permission forms will be available in English, Spanish, and other languages as required by dominant languages spoken by parents in selected schools. Field staff will be available on location to answer questions from parents who remain uncertain of permission. Bilingual field staff will be used in locations with high Hispanic concentrations (e.g., California, Florida, New York City, and Texas).
Teacher refusals to cooperate with the study are not expected to be a problem because schools already will have agreed to participate and burden to teachers is minimal. Refusals by students whose parents have consented also are expected to be minimal. No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.
To minimize the likelihood of missing values on the questionnaire, students will be reminded in writing in the questionnaire booklet and verbally by the survey administrator to review the optically scannable questionnaire before turning it in to verify that: (1) each question has been answered, (2) only one oval is filled in for each question, with the exception of the question on race/ethnicity, and (3) each response has been entered with a No. 2 pencil, fills the oval, and is dark. A No. 2 pencil will be provided to each survey participant to reduce the likelihood that responses will not scan properly, which would produce missing values. In addition, when completed questionnaires are visually reviewed later at project headquarters, any oval that is lightly filled in will be darkened (unless it appears to be an erasure), and stray marks will be erased before the forms are scanned. Missing values for an individual student on the survey will not be imputed.
YRBS questionnaire items were originally tested by the NCHS laboratories. The 1993 special issue of Public Health Reports on the development of the Youth Risk Behavior Surveillance System describes the development and testing process. A limited pretest of the questionnaire on nine respondents was conducted in November 1989 by the contractor in the Prince George's County, Maryland school system in accord with OMB guidelines. The pretest was conducted to:
Quantify respondent burden.
Test survey administrator instructions and procedures.
Verify the overall feasibility of the survey approach.
Identify needed changes in the instruments or instructions to control/reduce burden.
The pilot test sharpened the articulation of certain survey questions and produced an empirical estimate of the survey burden.
The YRBS questionnaire has been used extensively in 13 prior national school-based surveys approved by OMB, as well as at the state and local levels. Further pilot testing in accord with OMB guidelines has been performed on new questions.
Statistical aspects of the study have been reviewed by the individuals listed below.
Michael T. Errecart, PhD (deceased)
Macro International Inc.
Ronaldo Iachan, PhD
ICF International Inc. (formerly Macro International Inc.)
530 Gaither Road, Suite 500
Rockville, Maryland 20850
Phone: (301) 572-0538
E-mail: [email protected]
William Robb, MS, MBA (now retired)
ICF International Inc. (formerly Macro International Inc.)
126 College Street
Burlington, VT 05401
Phone: (802) 264-3713
E-mail: [email protected]
Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:
Nancy Brener, PhD
Team Leader, Survey Operations and
Dissemination Team
Division of Adolescent and School Health
Centers for Disease Control and Prevention
1600 Clifton Road, NE
Mailstop E-75
Atlanta, GA 30329
Phone:
404-718-8133
Email: [email protected]
The representative of the contractor responsible for conducting the planned data collection is:
Katherine H. Flint, MA
Vice President
ICF International Inc. (formerly Macro International Inc.)
530 Gaither Road, Suite 500
Rockville, Maryland 20850
Phone: (301) 572-0333
E-mail: [email protected]