SUPPORTING STATEMENT FOR THE

2009 and 2011 NATIONAL YOUTH RISK BEHAVIOR SURVEYS



PART B

Submitted by:

Danice K. Eaton, MPH, PhD, Project Officer

Division of Adolescent and School Health

National Center for Chronic Disease Prevention and Health Promotion

4770 Buford Hwy, NE, MS K-33
Atlanta, GA 30341
770-488-6143 (voice); 770-488-6156 (fax)

[email protected]

Centers for Disease Control and Prevention

Department of Health and Human Services

April 9, 2008

TABLE OF CONTENTS



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information


a. Statistical Methodology for Stratification and Sample Selection

b. Estimation and Justification of Sample Size

c. Estimation and Statistical Testing Procedures

d. Use of Less Frequent than Annual Data Collection

e. Survey Instrument

f. Data Collection Procedures

g. Obtaining Access to and Support from Schools

h. Informed Consent

i. Quality Control


3. Methods to Maximize Response Rates and Deal with Nonresponse

a. Expected Response Rates

b. Methods for Maximizing Response and Handling Non-Response


4. Tests of Procedures or Methods to be Undertaken


5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


a. Statistical Review

b. Agency Responsibility

c. Responsibility for Data Collection



LIST OF APPENDICES

A. Authorizing Legislation

B. 60-Day Federal Register Notice

C. Summary of Public Comments and CDC’s Responses

D. Rationale for Survey Questions


E. Youth Risk Behavior Survey Questionnaire


F. Youth Risk Behavior Survey Questionnaire Supplemental Documents

F1. Parental Permission Form Distribution Script

F2. Parental Permission Form and Fact Sheet (English Version)

F3. Parental Permission Form and Fact Sheet (Spanish Version)

F4. Parental Permission Form Reminder Notice (English Version)

F5. Parental Permission Form Reminder Notice (Spanish Version)

F6. Questionnaire Administration Guide

F7. Data Collector Confidentiality Agreement

G. Recruitment Scripts for the Youth Risk Behavior Survey

G1. State-level Recruitment Scripts for the Youth Risk Behavior Survey

G2. District-level Recruitment Scripts for the Youth Risk Behavior Survey

G3. School-level Recruitment Scripts for the Youth Risk Behavior Survey


H. Recruitment Scripts for the Youth Risk Behavior Survey Supplemental Documents

H1. State-level Recruitment Script for the Youth Risk Behavior Survey Supplemental Documents - State Letter of Invitation

H2. District-level Recruitment Script for the Youth Risk Behavior Survey Supplemental Documents - District Letter of Invitation

H3. School-level Recruitment Script for the Youth Risk Behavior Survey Supplemental Documents

H3a. School Letter of Invitation and YRBS Fact Sheet for Schools

H3b. Letter to Agreeing Schools


I. Data Collection Checklist for the Youth Risk Behavior Survey


J. Data Collection Checklist for the Youth Risk Behavior Survey Supplemental Documents

J1. Letter to Teachers in Participating Schools

J2. Make-up List and Instructions


K. Expert Reviewers for 1989 Consultations

L. IRB Approval Letter


M. Detailed Sampling and Weighting Plan


N. Sample Table Shells

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS


The universe for the study will consist of public and private school students in grades 9, 10, 11, and 12 in the 50 states and the District of Columbia.


The sampling frame for schools has been obtained from Quality Education Data (QED), Inc. QED data encompass both private and public schools and include the latest data from the Common Core of Data from the National Center for Education Statistics. QED school-level files also include data on enrollments by grade and minority enrollments. Table B-1 displays the current distribution of schools by urban status and type of school.


Table B-1

Distribution of Schools by Urban Status and School Type

Metro Status   Statistic          Catholic    Private     Public      Total

Unclassified   Frequency                 1         35          19         55
               Percent of Total       0.00       0.13        0.07       0.21
               Row Percent            1.82      63.64       34.55
               Column Percent         0.08       0.62        0.10

Urban          Frequency               556      1,670       3,770      5,996
               Percent of Total       2.11       6.35       14.33      22.79
               Row Percent            9.27      27.85       62.88
               Column Percent        45.39      29.46       19.42

Suburban       Frequency               592      2,859       7,573     11,024
               Percent of Total       2.25      10.87       28.79      41.91
               Row Percent            5.37      25.93       68.70
               Column Percent        48.33      50.44       39.01

Rural          Frequency                76      1,104       8,051      9,231
               Percent of Total       0.29       4.20       30.61      35.09
               Row Percent            0.82      11.96       87.22
               Column Percent         6.20      19.48       41.47

Total          Frequency             1,225      5,668      19,413     26,306
               Percent of Total       4.66      21.55       73.80     100.00






B.2 PROCEDURES FOR THE COLLECTION OF INFORMATION


B.2.a Statistical Methodology for Stratification and Sample Selection


A probability sample will be selected that will support national estimates by age or grade and gender for students in grades 9-12. The design also will support separate estimates, by grade, of the characteristics of white, Hispanic, and black students. A detailed description of the sampling design may be found in Appendix M.


Sampling Frame. The sampling frame will stratify the 50 states and the District of Columbia by region, urbanicity, and minority composition. The sample is structured into geographically defined primary sampling units (PSUs), each consisting of a county, a group of contiguous counties, or an independent city not affiliated with a county. The stratification by minority composition will divide the PSUs into eight groups based on the percentages of blacks and Hispanics in the PSU: "high Hispanic" strata will have higher percentages of Hispanics than blacks, "high black" strata the reverse, and each of these two groups will be subdivided into four strata according to the percentage of blacks or Hispanics, as appropriate, in the PSU. The racial/ethnic strata will be further divided by urban status into two strata, Metropolitan Statistical Area (MSA) versus non-MSA. In addition, the first-stage PSU sample will be implicitly stratified by geography using 5-digit zip code areas.
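To make the stratification concrete, the sketch below assigns a hypothetical PSU to one of the first-stage strata (minority orientation, minority-density group, and MSA status). The density cut points and labels are illustrative assumptions only; the actual stratum boundaries are computed from frame data, as described in Appendix M.

# Illustrative sketch only: assigns a PSU to a first-stage stratum defined by
# minority orientation (high Hispanic vs. high black), one of four density
# groups, and MSA status. The cut points below are hypothetical placeholders,
# not the actual boundaries used in the design.

def assign_stratum(pct_black, pct_hispanic, is_msa, cuts=(20.0, 40.0, 60.0)):
    """Return a stratum label such as 'HighHispanic-2-MSA'."""
    # "High Hispanic" strata have a higher Hispanic than black percentage.
    if pct_hispanic >= pct_black:
        orientation, density = "HighHispanic", pct_hispanic
    else:
        orientation, density = "HighBlack", pct_black
    # Subdivide by minority density into four groups using the cut points.
    group = sum(density >= c for c in cuts) + 1   # 1..4
    msa = "MSA" if is_msa else "nonMSA"
    return f"{orientation}-{group}-{msa}"

# Example: a PSU that is 35% Hispanic, 10% black, inside an MSA
print(assign_stratum(pct_black=10.0, pct_hispanic=35.0, is_msa=True))
# -> 'HighHispanic-2-MSA'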

Selection of PSUs. Fifty-seven PSUs will be selected within strata with probability proportional to a weighted measure of student enrollment that gives disproportionate weight to black and Hispanic enrollment. The PSUs will be allocated to the first-stage strata in proportion to the sum of the measures of size of the PSUs in each stratum. This procedure will over-allocate PSUs to the high-minority strata and will increase the chances of high-minority PSUs being selected.
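The sketch below illustrates how a probability-proportional-to-size (PPS) selection of PSUs within a single stratum could proceed using systematic sampling. The PSU names and measures of size are hypothetical, and the weighted measure of size that favors black and Hispanic enrollment is simplified here to a single number per PSU; the actual selection procedure is specified in Appendix M.

import random

# Minimal sketch of systematic PPS selection of PSUs within one stratum.
# "measure_of_size" stands in for the weighted enrollment measure.

def pps_systematic_sample(psus, n_select, size_key="measure_of_size"):
    """Select n_select PSUs with probability proportional to size."""
    total = sum(p[size_key] for p in psus)
    interval = total / n_select
    start = random.uniform(0, interval)
    hits = [start + k * interval for k in range(n_select)]
    selections, cumulative, i = [], 0.0, 0
    for psu in psus:
        cumulative += psu[size_key]
        while i < len(hits) and hits[i] < cumulative:
            selections.append(psu)
            i += 1
    return selections

# Hypothetical stratum with four PSUs; two selections requested.
stratum_psus = [
    {"name": "PSU-A", "measure_of_size": 120_000},
    {"name": "PSU-B", "measure_of_size": 45_000},
    {"name": "PSU-C", "measure_of_size": 80_000},
    {"name": "PSU-D", "measure_of_size": 30_000},
]
for psu in pps_systematic_sample(stratum_psus, n_select=2):
    print(psu["name"])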


Selection of Schools. Schools will be classified as large or small depending on whether they have 25 or more students per grade. Among large schools, at least three schools will be selected in each PSU with probability proportional to a weighted measure of enrollment by race/ethnicity. In addition to the sample of large schools (approximately 180 selections anticipated), a random sample of 15 small schools will be selected to represent the approximately 5.7% of students nationwide who attend small schools.

Selection of Classes. Classes will be selected randomly from a unit of organization in which each student appears exactly once; that is, the units are mutually exclusive and collectively exhaustive. Usually, the list of sections of a mandatory subject, such as English, will be used. One class will be selected in each eligible grade in every school except those with the highest percentages of Hispanic or black students; within these high-minority schools, two classes will be selected per eligible grade.
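As a simple illustration of this step, the sketch below draws one section of a mandatory subject at random for each eligible grade from a hypothetical school's section lists; in the highest-minority schools, two sections per grade would be drawn instead (not shown).

import random

# Hypothetical section lists for one school; each student belongs to exactly
# one English section per grade, so the sections are mutually exclusive and
# collectively exhaustive within a grade.
english_sections = {
    9:  ["ENG9-1", "ENG9-2", "ENG9-3"],
    10: ["ENG10-1", "ENG10-2"],
    11: ["ENG11-1", "ENG11-2", "ENG11-3"],
    12: ["ENG12-1", "ENG12-2"],
}

# One class selected at random per eligible grade.
selected = {grade: random.choice(sections)
            for grade, sections in english_sections.items()}
print(selected)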


Selection of Students. All students in a selected classroom will be selected for the study.


Refusals. School districts and schools that refuse to participate in the study, and students whose parents refuse to give permission, will not be replaced in the sample. We will record the characteristics of schools that refuse for analysis of potential study biases.


B.2.b Estimation and Justification of Sample Size

Overview. The YRBS is designed to produce most estimates accurate to within ±5 percent at 95 percent confidence. Overall estimates and estimates by grade, gender, or race/ethnicity meet this standard, as do certain finer-grained analyses such as grade by gender or race/ethnicity by gender. A looser design target of ±5 percent at 90 percent confidence was established for estimates by grade and race/ethnicity because of the practical difficulty of obtaining nationally representative samples of minority students at an affordable cost.

We propose to replicate in nearly all respects the sample size and design used in the 2005 and 2007 YRBS cycles because these design parameters met the levels of precision required for CDC's purposes. Minor design refinements may be expected in future surveys driven by the changing demographics of the in-school population. Current trends of increasing minority percentages, particularly for Hispanic students, will continue to influence the design in several areas:


  • The weighting function that over-samples minority students is being gradually adjusted downwards to give less weight to minority students. The sample sizes of minority students will stay about the same, but the statistical efficiency of the survey will improve.

  • The stratum boundaries based on the percentage of minority students are being re-computed to minimize variances using new frame data on minority composition.

  • The allocation of PSUs to high-minority strata is being changed to make the design more statistically efficient for whole-population estimates while preserving the ability to produce estimates by race/ethnicity.


The proposed sample will consist of 57 primary sampling units (PSUs). At each grade level, at least three different schools will contribute classes of approximately 25 students each, so a minimum of three schools will be selected within each PSU. The actual number of schools will exceed 3 × 57 = 171, however, because two additional types of schools enter the sample: (a) schools that span only part of the grades of interest and therefore are combined with other schools to form complete sampling units, and (b) small schools, which are selected separately from large schools. As a result, approximately 195 to 200 schools will be selected into the sample. We will select one class per school per eligible grade, except in schools with the largest concentrations of minority students, where we will select two classes per grade. We expect a final sample of approximately 12,000 respondents.


School and Student Non-response. The average participation rates over the ten prior cycles of YRBS are 77% for schools and 86% for students. In 2007, perhaps partly due to school incentives, the YRBS achieved a very high school participation rate (81%). We are assuming maintenance of historical average participation rates in preparing the sample design for the 2009 and 2011 YRBS.
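As a rough, assumption-laden check, the arithmetic below combines the design parameters and the historical participation rates cited above to reproduce the expected yield of roughly 12,000 respondents. The average number of classes per grade is an illustrative figure that folds in the double class selections in high-minority schools and the small-school selections; it is not part of the formal sample size calculation in Appendix M.

# Back-of-the-envelope yield check using the stated design parameters and
# historical participation rates; classes_per_grade is an assumed average.
n_psus = 57
schools_per_psu = 3           # minimum number of schools per PSU
grades_per_school = 4         # grades 9-12
classes_per_grade = 1.1       # assumed average, reflecting some double selections
students_per_class = 25
school_rate = 0.77            # historical average school participation
student_rate = 0.86           # historical average student participation

expected_respondents = (n_psus * schools_per_psu * grades_per_school *
                        classes_per_grade * students_per_class *
                        school_rate * student_rate)
print(round(expected_respondents))   # ~12,500, consistent with the expected ~12,000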

B.2.c Estimation and Statistical Testing Procedures


Sample data will be weighted by the reciprocal of the probability of case selection and adjusted for non-response. The resulting weights will be trimmed to reduce mean-squared error. Next, the strata weights will be adjusted to reflect true relative enrollments rather than relative weighted enrollment. Finally, the data will be post-stratified to match national distributions of high school students by race/ethnicity and grade. Variances will be computed using linearization methods. YRBS data are also used for trend analyses where data for successive cycles are compared with statistical testing techniques. Statistical testing methods are used also to compare subgroup prevalence rates (e.g., male versus female students) for each cross-sectional survey.
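The sketch below outlines, under simplifying assumptions, the sequence of weighting steps described above: base weights as reciprocals of selection probabilities, a nonresponse adjustment within weighting classes, weight trimming, and post-stratification to national enrollment totals by race/ethnicity and grade. The column names, weighting classes, and trimming rule are illustrative assumptions and do not represent the actual weighting specification (see Appendix M).

import pandas as pd

def weight_students(df, poststrat_totals, trim_factor=3.0):
    """Illustrative weighting sequence; see the accompanying text for caveats."""
    df = df.copy()

    # 1. Base weight: reciprocal of the overall probability of selection.
    df["weight"] = 1.0 / df["selection_prob"]

    # 2. Nonresponse adjustment within weighting classes (e.g., schools):
    #    divide by the weighting-class response rate, then keep respondents.
    resp_rate = df.groupby("weighting_class")["responded"].transform("mean")
    df["weight"] = df["weight"] / resp_rate
    df = df[df["responded"]].copy()

    # 3. Trim extreme weights to reduce mean-squared error
    #    (illustrative rule: cap at trim_factor times the median weight).
    cap = trim_factor * df["weight"].median()
    df["weight"] = df["weight"].clip(upper=cap)

    # 4. Post-stratify so that weighted counts match national enrollment
    #    totals by race/ethnicity and grade.
    for (race, grade), target in poststrat_totals.items():
        cell = (df["race"] == race) & (df["grade"] == grade)
        weighted_total = df.loc[cell, "weight"].sum()
        if weighted_total > 0:
            df.loc[cell, "weight"] *= target / weighted_total
    return df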


Confidence intervals vary from estimate to estimate depending on whether the estimate is for the full population or for a subset such as a particular grade or gender. Within a grouping, confidence intervals also vary with the level of the estimate and the design effect associated with the measure. Based on prior YRBS cycles with similar designs and sample sizes, we can expect the following:

  • Estimates by grade or by gender, or pooling grades/genders, will be more accurate than ±5 percent at 95 percent confidence.

  • For minority group estimates by grade (e.g., 11th grade Hispanics), about 70% will be accurate to within 5 percent at 90 percent confidence and about 85% will be accurate to within 7 percent at 90 percent confidence.

Experience with these data indicates that the levels of sampling error are appropriate given the uses of the data for descriptive reporting and trend analysis.
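For illustration, the calculation below shows how the half-width of a confidence interval for a prevalence estimate depends on the sample size and the design effect. The prevalence, subgroup sample size, and design effect values are assumed for illustration, not taken from YRBS data.

from math import sqrt

# Half-width of a confidence interval for a proportion under a complex
# design, approximated with a design effect (deff).
def ci_half_width(p, n, deff, z):
    return z * sqrt(deff * p * (1.0 - p) / n)

# Full-sample estimate: assumed p = 0.30, n = 12,000, deff = 2.0, 95% confidence
print(round(ci_half_width(0.30, 12_000, 2.0, z=1.96), 3))   # ~0.012, well within ±5 percent

# Subgroup estimate: assumed p = 0.30, n = 400 (e.g., one grade of one
# minority group), deff = 2.0, 90% confidence
print(round(ci_half_width(0.30, 400, 2.0, z=1.645), 3))     # ~0.053, near ±5 percent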


B.2.d Use of Less Frequent Than Annual Data Collection


As stated in A.6 above, the YRBS originally was planned and twice approved by OMB as an annual survey. Based on experience, it was determined in 1992 that conducting the YRBS biennially would be sufficient to address the programmatic needs of CDC and other Federal agencies. The shift from an annual to a biennial survey, beginning with the 1993 YRBS, has reduced burden by half.


It is important that data be collected biennially to detect changes in health risk behaviors among high school students that need to be addressed in school health programs, public education campaigns, demonstrations, and professional education/training, especially those sponsored by CDC. Because many of these problems, including the AIDS epidemic and tobacco use, are taking a rapidly increasing toll in human suffering and financial burden, much of which will be borne by the Federal government, it is imperative to conduct the survey biennially. School systems have the capacity to change their school health programs on an annual basis, if circumstances require, to help prevent health risk behaviors, such as tobacco use, that contribute to the leading causes of mortality and morbidity.


B.2.e Survey Instrument


The YRBS questionnaire (Appendix E) contains 98 items that can be roughly divided into seven groups. The first four questions are demographic items. Most of the remaining questions address health risk behaviors in six topic areas: unintentional injuries and violence; tobacco use; alcohol and other drug use; sexual behaviors that contribute to HIV infection, other sexually transmitted diseases, and unintended pregnancies; unhealthy dietary behaviors; and physical inactivity. In addition, self-reported height and weight are assessed to develop estimates of the prevalence of being overweight and at risk for overweight. All questions are in a multiple-choice format and will be administered in a 12-page optically scannable questionnaire booklet.

B.2.f Data Collection Procedures


Data will be collected by a small staff of professional data collectors, specially trained to conduct the YRBS. The data collector will have direct responsibility for administering the survey to students. Data collectors will follow a questionnaire administration guide (Appendix F6). Teachers will be asked to remain at the front or back of the classroom and not to walk around the room monitoring the aisles during survey administration because doing so could affect honest responses and compromise anonymity. Teachers also will be asked to identify students allowed to participate in the survey and to make sure non-participating students have appropriate alternative activities. The rationale for this is to increase the candor and comfort level of students. The only direct responsibility of teachers in data collection is to distribute and follow up on parental permission forms sent out prior to the scheduled date for data collection in the school. Teachers are provided with a parental permission form distribution script (Appendix F1) to follow when distributing permission forms to students. The Data Collection Checklist (Appendix I) is completed by teachers to track which students have received parental permission to participate in the data collection. The teachers receive instructions on completing the Data Collection Checklist in the “Letter to Teachers in Participating Schools” (Appendix J1). The data collector will utilize the information on the Data Collection Checklist to identify students eligible for a make-up survey administration; this information will be recorded by the data collector on the “Make-up List and Instructions” document (Appendix J2). In general, our data collection procedures have been designed to ensure that:


  • Protocol is followed in obtaining access to schools.

  • Everyday school activity schedules are disrupted minimally.

  • Administrative burden placed on teachers is minimal.

  • Parents give informed permission to participate in the survey.

  • Anonymity of student participation is maintained, with no punitive actions against nonparticipants.

  • Alternative activities are provided for nonparticipants.

  • Control over the quality of data is maintained.


B.2.g Obtaining Access to and Support From Schools


All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Howell Wechsler, Ed.D., M.P.H., Director, DASH, NCCDPHP, CDC. The procedures for gaining access to schools will have three major steps:


  • Notify state education agencies (SEAs) in states with sampled schools and invite states to participate. Obtain written approval for participation at the SEA level. Verify existence and grade range of selected schools. Obtain names of school districts in which schools are located, school district addresses, names of district superintendents, names of supportive school district contacts, and general guidance on working with the selected school districts and schools in the state. Request that the state notify school districts that they may anticipate being contacted about the survey.

  • Once cleared at the state level, invite school districts in which selected schools are located to participate in the study. For Catholic schools and other private schools, invite the office comparable to the school district office (e.g., diocesan office of education). Obtain written approval for participation at the district level. Verify existence of school, grade range, and other information provided by the state. Request that the school district notify schools that they may anticipate being contacted about the survey. Request general guidance on working with the selected schools.

  • Once cleared at the school district level, invite selected schools to participate. Verify information previously obtained about the school. Present the burden and benefits of participation in the survey. After a school agrees to participate, develop a tailor-made plan for collection of data in the school (e.g., select classes; determine whether survey will be administered to selected class sections simultaneously or in serial). Obtain written approval for participation at the school level. Ensure that all materials reach the school well in advance of when they are needed. Maintain contact with schools until all data collection activities have been completed.


Prior experience suggests the process of working with each state education agency, school district, and school will have unique features. Discussions with each education agency will recognize the organizational constraints and prevailing practices of the agency. Scripts for use in guiding these discussions may be found in Appendices G1 (state-level), G2 (district-level), and G3 (school-level). Appendix H contains copies of letters of invitation to states (Appendix H1), school districts (Appendix H2), and school administrators (Appendix H3a). Appendix H also contains the YRBS Fact Sheet for Schools (H3a). A copy of the letter to be sent to schools once they have agreed to participate is found in Appendix H3b.


B.2.h Informed Consent


The permission form (Appendices F2 & F3) informs both the student and the parent about an important activity in which the student has the opportunity to participate. By providing adequate information about the activity, it ensures that permission will be informed. In accord with the No Child Left Behind Act, the permission form indicates that a copy of the questionnaire will be available for review by parents at their child’s school. The parental permission forms will be made available in both English and Spanish.


A waiver of written student assent was obtained for the participation of children because (1) this research presents no more than minimal risk to subjects, (2) parental permission is required for participation, (3) the waiver will not adversely affect the rights and welfare of the students because they are free to decline to take part, and (4) some students may perceive that they are not anonymous if they are required to provide stated assent and sign a consent/assent document. Students are told, “Participating in this survey is voluntary and your grade in this class will not be affected, whether or not you answer the questions.” Completion of the survey implies student assent.


B.2.i Quality Control


Table B-2 lists the major means of quality control. As shown, the task of collecting quality data begins with a clear and explicit study protocol and ends with procedures for the coding, entry, and verification of collected data. In between these activities, and subsequent to data collector training, measures must be taken to reinforce training, to assist field staff who run into trouble, and to check on data collection techniques. Because the ultimate aim is production of a high quality database and reports, various quality assurance activities will be applied during the data collection phase.


B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE


B.3.a Expected Response Rates

While we aim for an 80% school participation rate (and achieved 81% on the 2007 YRBS), we have conservatively assumed school and student response rates of 77% and 86%, respectively, for the purposes of sample design. These rates represent the averages experienced over ten completed cycles of the YRBS. The addition of a $500 incentive for each school (as suggested by OMB in 1999) has helped maintain, and perhaps slightly increase, school participation rates. Even before the addition of the school incentive, the YRBS set the standard for response rates among federally funded, national, school-based, health-related surveys of high school students. For example, the widely cited Monitoring the Future survey (formerly known as the High School Senior Survey) achieves substantially lower participation rates than the YRBS, even though its questions are less sensitive. The participation rates established by the YRBS are the product of proven and tested procedures for maximizing school and student participation.
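For reference, the assumed school and student rates imply an overall response rate of roughly 66 percent, as the short calculation below shows.

# Overall response rate implied by the assumed participation rates.
school_rate = 0.77     # historical average school participation
student_rate = 0.86    # historical average student participation
print(f"{school_rate * student_rate:.0%}")   # ~66% overall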

As indicated in A.16.c, it is highly desirable to complete data collection before the final two months of the school year, when schools are very busy with testing and attendance can be very unstable, especially among twelfth-grade students.

Table B-2

Major Means of Quality Control



Survey Step / Quality Control Procedures

Mail Out:
  • Check inner vs. outer label for correspondence (5% sample).
  • Verify that any errors in packaging were not systematic (100%).

Previsit Logistics Verification:
  • Review data collection procedures with school personnel in each school to ensure that all preparatory activities were performed properly (100%).

Receipt Control:
  • Verify that a sample of forms received the prior day were logged in and are stored in the proper location (5%).
  • Require entry of staff ID in receipt control and all other transactions (100%).

Telephone Contacts:
  • Monitor early sample of scheduling and follow-up telephone calls to ensure that the caller follows procedures, elicits proper information, and has proper demeanor (10%).

Manual Editing:
  • Verify initial editing by all editors until standards are achieved (100%).
  • Spot check editing by editor (5%).

Computer Scanning:
  • Key enter questionnaires that are not scannable (100%).
  • Remove any scannable form that reflects intentional misuse by a respondent (100%).


B.3.b Methods for Maximizing Responses and Handling Non-Response


We distinguish among six potential types of nonresponse problems: refusal to participate by a selected school district, school, teacher, parent, or student; and collection of incomplete information from a student.


To minimize refusals at all levels--from school district to student--we will use a variety of techniques, emphasizing the importance of the survey. Given the high visibility and subject matter of the survey, we expect that some school districts or schools will need to place the issue of survey participation before the school board. To increase the likelihood of an affirmative decision, we will: (1) work through the SEA to communicate its support of the survey to school districts and schools; (2) indicate that the survey is being sponsored by CDC and has the support of Federal and state agencies; (3) convey to school districts and schools that the survey has the endorsement of many key national educational and health associations, such as the National PTA, American Medical Association, National Association of State Boards of Education, Council of Chief State School Officers, the National Education Association, and the National School Boards Association; (4) maintain a toll-free hotline to answer questions from school district and school officials, teachers, parents, and students throughout the process of recruiting schools and obtaining parental permission for the student’s participation; (5) comply with all requirements from school districts in preparing written proposals for survey clearance; (6) convey a willingness to appear in person, if needed, to present the survey before a school board, research committee, or other local entity tasked with reviewing the survey; (7) offer a package of educational products to each participating school, as recommended and approved by OMB in approving the 1998 YRBS in alternative schools (OMB No. 0920-0416, expiration 12/98), and continued ever since; and (8) offer schools a monetary incentive of $500, consistent with recommendations OMB has previously made, and implemented in the national YRBS since 2001.


The sampling plan does not allow for the replacement of schools that refuse to participate, because of concern that replacing schools would introduce bias. All participating SEAs, school districts, and schools also will be promised and sent a copy of the published survey results.


Maximizing responses and dealing with refusals from parents, teachers, and students require different strategies. Parental permission form reminders (Appendices F4 and F5) will be sent to parents who have not returned permission forms within an agreed-upon time period (e.g., 3 days); those who do not respond to the reminder will be sent a second and final reminder. The permission form will provide a telephone number at CDC that parents may call to have questions answered before agreeing to give permission for their child's participation. Permission forms will be available in English, Spanish, and other languages as dictated by the dominant languages spoken by parents in selected schools. Field staff will be available on location to answer questions from parents who remain uncertain about granting permission. Bilingual field staff will be used in locations with high Hispanic concentrations (e.g., California, Florida, New York City, and Texas).


Teacher refusals to cooperate with the study are not expected to be a problem because schools already will have agreed to participate and burden to teachers is minimal.


Refusals by students whose parents have consented also are expected to be minimal. No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.


To minimize the likelihood of missing values on the questionnaire, students will be reminded in writing in the questionnaire booklet and verbally by the survey administrator to review the optically scannable questionnaire before turning it in to verify that: (1) each question has been answered, (2) only one oval is filled in for each question, with the exception of the question on race/ethnicity, and (3) each response has been entered with a No. 2 pencil, fills the oval, and is dark. A No. 2 pencil will be provided to each survey participant to reduce the likelihood that responses will not scan properly, which would produce missing values. In addition, when completed questionnaires are visually reviewed later at project headquarters, any oval that is lightly filled in will be darkened (unless it appears to be an erasure) and stray marks will be erased before the forms are scanned. Missing values for an individual student will not be imputed.


B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN

YRBS questionnaire items were originally tested by the NCHS laboratories. The 1993 special issue of Public Health Reports on the development of the Youth Risk Behavior Surveillance System describes the development and testing process. A limited pretest of the questionnaire on nine respondents was conducted in November 1989 by the contractor in the Prince George's County, Maryland school system in accord with OMB guidelines. The pretest was conducted to:

  • Quantify respondent burden.

  • Test survey administrator instructions and procedures.

  • Verify the overall feasibility of the survey approach.

  • Identify needed changes in the instruments or instructions to control/reduce burden.

The pilot test sharpened the articulation of certain survey questions and produced an empirical estimate of the survey burden.

The YRBS questionnaire has been used extensively in ten prior national school-based surveys approved by OMB, as well as at the state and local levels. Further pilot testing in accord with OMB guidelines has been performed on all new questions.


B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

B.5.a Statistical Review

Statistical aspects of the study have been reviewed by the individuals listed below.

  • Michael T. Errecart, Ph.D. (deceased)

Macro International Inc.


  • Ronaldo Iachan, Ph.D.

Macro International Inc.

11785 Beltsville Drive, Suite 300

Beltsville, MD 20705

Phone: (301) 572-0538

Fax: (301) 572-0986

E-mail: [email protected]


B.5.b Agency Responsibility

Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:


  • Danice K. Eaton, MPH, Ph.D.

Lieutenant Commander
United States Public Health Service
SERB/DASH/NCCDPHP/CDC
4770 Buford Highway NE, MS K-33
Atlanta, GA  30341
Voice: 770-488-6143
Fax: 770-488-6156
E-mail: [email protected]


B.5.c Responsibility for Data Collection


The representative of the contractor responsible for conducting the planned data collection is:


  • Katherine H. Flint, M.A.

Senior Technical Director

Macro International Inc.

11785 Beltsville Drive, Suite 300

Beltsville, Maryland 20705

Phone: (301) 572-0333

Fax: (301) 572-0986

E-mail: [email protected]


