Study to Examine Web-based Administration of the Youth Risk Behavior Survey

OMB: 0920-0763



SUPPORTING STATEMENT FOR THE

STUDY TO EXAMINE WEB-BASED ADMINISTRATION OF THE

YOUTH RISK BEHAVIOR SURVEY



PART B

Submitted by:

Danice K. Eaton, MPH, PhD, Project Officer

Division of Adolescent and School Health

National Center for Chronic Disease Prevention and Health Promotion

4770 Buford Hwy, NE, MS K-33
Atlanta, GA 30341
770-488-6143 (voice); 770-488-6156 (fax)

[email protected]

Centers for Disease Control and Prevention

Department of Health and Human Services

September 2007

TABLE OF CONTENTS



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information


a. Statistical Methodology for Stratification and Sample Selection

b. Estimation and Justification of Sample Size

c. Estimation and Statistical Testing Procedures

d. Use of Less Frequent than Annual Data Collection

e. Survey Instrument

f. Data Collection Procedures

g. Obtaining Access to and Support from Schools

h. Informed Consent

i. Quality Control


3. Methods to Maximize Response Rates and Deal with Nonresponse

a. Expected Response Rates

b. Methods for Maximizing Response and Handling Non-Response


4. Tests of Procedures or Methods to be Undertaken


5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or

Analyzing Data


a. Statistical Review

b. Agency Responsibility

c. Responsibility for Data Collection




LIST OF APPENDICES

  A. Authorizing Legislation

  B. 60-Day Federal Register Notice

  C. 60-Day Federal Register Notice Comment


D. Data Collection Instrument for Principal – “Principal Survey of the Feasibility and Acceptability of Web-based Student Assessments and Surveys”


E. Data Collection Instrument for Students – “Student Health Survey”

E1. “Student Health Survey” Without Skip Patterns

E2. “Student Health Survey” With Skip Patterns


F. Data Collection Instrument for Principals Supplemental Documents

F1. Letter of Invitation

F2. Consent Form


G. Data Collection Instrument for Students (“Student Health Survey”) Supplemental Documents

G1. Parental Permission Form Distribution Script

G2. Parental Permission Form and Fact Sheet (English Version)

G3. Parental Permission Form and Fact Sheet (Spanish Version)

G4. Parental Permission Form Reminder Notice (English Version)

G5. Parental Permission Form Reminder Notice (Spanish Version)

G6. Questionnaire Administration Guides

G7. Data Collector Confidentiality Agreement

H. School Recruitment Script for the “Student Health Survey”


I. School Recruitment Script for the “Student Health Survey” Supplemental Documents

I1. School Letter of Invitation and Fact Sheet

I2. Letter to Agreeing Schools


J. Data Collection Checklist for the “Student Health Survey”


K. Data Collection Checklist for the “Student Health Survey” Supplemental Documents

K1. Student Questionnaire Letter to Teachers in Participating Schools, Paper-and-Pencil or Computer Lab Conditions

K2. Student Questionnaire Letter to Teachers in Participating Schools, “On-Their-Own” Condition

K3. Make-up List and Instructions


L. 2008 Methodological Study IRB Approval Letter


M. 2008 Methodological Study Table Shells


N. Detailed Sampling and Weighting Plan for the Principal Data Collection


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1      RESPONDENT UNIVERSE AND SAMPLING METHODS

 

The proposed study includes a principal data collection with a nationally representative sample of principals and a student data collection with a convenience sample of students. The respondent universe for the 2008 methodological study is the universe of all private and public high schools nationwide (i.e., in the 50 states and the District of Columbia), their principals, and their students. The sampling frame for schools has been obtained from Quality Education Data (QED), Inc. The QED data encompass both private and public schools and incorporate the latest data from the National Center for Education Statistics' Common Core of Data. Data on enrollments by grade and minority enrollments at the school level are available in this dataset. Table B.1 displays the current national distribution of high schools by metropolitan status and school type, using the three school types in the QED database: Catholic schools, non-Catholic private schools, and public schools.


Table B.1. Description of High Schools by Metropolitan Status and School Type for Sample Selection
(Each cell shows: Frequency / Percent of Total / Row Percent / Column Percent)

Metropolitan Status | Catholic | Private | Public | Total
Unclassified | 1 / 0.00 / 1.72 / 0.08 | 38 / 0.14 / 65.52 / 0.63 | 19 / 0.07 / 32.76 / 0.09 | 58 / 0.21
Urban | 577 / 2.11 / 9.11 / 45.87 | 1,805 / 6.59 / 28.48 / 29.92 | 3,955 / 14.45 / 62.41 / 19.69 | 6,337 / 23.15
Suburban | 601 / 2.20 / 5.23 / 47.77 | 3,032 / 11.08 / 26.37 / 50.27 | 7,866 / 28.74 / 68.41 / 39.17 | 11,499 / 42.01
Rural | 79 / 0.29 / 0.83 / 6.28 | 1,157 / 4.23 / 12.21 / 19.18 | 8,242 / 30.11 / 86.96 / 41.04 | 9,478 / 34.63
Total | 1,258 / 4.60 | 6,032 / 22.04 | 20,082 / 73.37 | 27,372 / 100.00

(For the Total row and column, only the frequency and percent of total are shown.)


For the principal data collection, a questionnaire will be administered to a national probability sample of approximately 750 high school principals. The sampling plan to be used for the principal data collection is detailed in Appendix N (Detailed Sampling and Weighting Plan for Principal Data Collection).


For the student data collection, a convenience sample of approximately 80 high schools that is balanced in terms of geographic dispersion, race/ethnicity of students served, and metropolitan status will be asked to participate.

 

B.2   PROCEDURES FOR THE COLLECTION OF INFORMATION


B.2.a Statistical Methodology for Stratification and Sample Selection

The sample for the principal data collection will consist of approximately 750 principals representing a combination of public and private schools. Principals will be selected following a sampling plan designed to produce a sample that is representative of principals of schools containing grades 9 through 12 nationally. The sampling plan for the principal data collection is included in Appendix N. This design includes the key features summarized in Table B.2.a.


Table B.2.a. Summary Features of Principal Data Collection Sampling Design

Sampling Stage | Sampling Units | Sample Size (Approximate) | Stratification | Measure of Size
1 | Counties or groups of counties | 225 primary sampling units (PSUs) | Urban vs. non-urban (2 strata); minority concentration (8 strata) | Aggregate school size in target grades
2 | Schools | 750 school selections (at least 3 per PSU) | Small vs. other | Weighted enrollment (increased for minority groups)
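To illustrate the first-stage selection of PSUs with probability proportional to the aggregate measure of size shown in Table B.2.a, a minimal systematic PPS routine is sketched below. This is an illustrative sketch only, not the procedure specified in Appendix N; the county names, enrollment figures, and the use of a single stratum are assumptions made for the example.

    import random

    def systematic_pps(units, sizes, n_select, seed=None):
        """Select n_select units with probability proportional to size,
        using systematic sampling over the cumulated measure of size.
        Units larger than the sampling interval may be selected more than once."""
        rng = random.Random(seed)
        total = float(sum(sizes))
        interval = total / n_select              # sampling interval on the size scale
        start = rng.uniform(0, interval)         # random start within the first interval
        targets = (start + k * interval for k in range(n_select))

        selected, cum = [], 0.0
        target = next(targets, None)
        for unit, size in zip(units, sizes):
            cum += size
            while target is not None and target <= cum:
                selected.append(unit)
                target = next(targets, None)
        return selected

    # Hypothetical example: select 5 PSUs from 12 counties with unequal enrollments.
    counties = ["county_%d" % i for i in range(1, 13)]
    enrollments = [5200, 800, 12400, 3100, 950, 7600, 400, 2200, 9800, 1500, 6100, 2700]
    print(systematic_pps(counties, enrollments, n_select=5, seed=7))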


The sample for the student data collection will consist of approximately 8,000 students enrolled in approximately 80 schools. In each of 80 schools, four classes will be selected randomly from a list of all available sections of a course required at grades 9 or 10; then, the four classes will be assigned randomly to one of the four conditions: 1) paper-and-pencil questionnaire in regular classroom, 2) web-based questionnaire in computer lab without programmed skip patterns, 3) web-based questionnaire in computer lab with programmed skip patterns, and 4) web-based questionnaire completed at any computer of the student’s choosing (i.e. “on your own” condition) without programmed skip patterns. A Latin Square design will be used to ensure a balance in random assignment to condition across all 80 schools.
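A cyclic 4 x 4 Latin square provides one straightforward way to balance the assignment of the four selected classes to the four conditions across the 80 schools, so that each condition appears equally often in each class position. The sketch below is illustrative only; the condition ordering, the rotation rule, and the assumption that the four classes within a school are already in random order are not taken from the study protocol.

    CONDITIONS = [
        "1: paper-and-pencil, regular classroom",
        "2: web-based, computer lab, no skip patterns",
        "3: web-based, computer lab, skip patterns",
        "4: web-based, on your own, no skip patterns",
    ]

    def latin_square_assignment(n_schools=80, n_conditions=4):
        """Assign each school's four classes (positions 0-3) to conditions by
        cycling through the rows of a cyclic Latin square across schools."""
        assignments = {}
        for school in range(n_schools):
            row = school % n_conditions          # rotate rows across schools
            order = [(row + col) % n_conditions for col in range(n_conditions)]
            assignments["school_%d" % (school + 1)] = [CONDITIONS[c] for c in order]
        return assignments

    plan = latin_square_assignment()
    for name in ("school_1", "school_2", "school_3", "school_4"):
        print(name, [c.split(":")[0] for c in plan[name]])

Because 80 is a multiple of 4, each of the four rows is used in exactly 20 schools, so every condition is assigned to every class position the same number of times.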


B.2.b Estimation and Justification of Sample Size


For the principal data collection, we anticipate an 80% participation rate, resulting in participation by approximately 600 principals out of 750 selected principals. The sample size of 600 participating principals will be sufficiently large to support estimates with a precision level of 0.05 or better at the 95% confidence level. This section describes the derivation of these sample sizes to achieve the target precision levels.


Design effects (DEFF), defined as the variance under the actual sampling design divided by the variance that would be attained under a simple random sample of the same size, are expected to be greater than 1.0 for the principal sample because of clustering at sampling stage 1 and unequal weighting effects tied to minority group enrollment. The anticipated DEFF is between 1.5 and 2.0, because these variance-inflating effects are offset to some extent by the variance-reducing benefits of stratification.


Table B.2.b-1 presents the standard errors and 95% confidence-interval half-widths for estimated proportions based on sample sizes of n=400 and n=600 participants, using DEFF=1.6. The table shows that the standard error is at most 2.6 percentage points for n=600 principals and that the desired precision (i.e., confidence intervals within +/- 5 percentage points) is achieved at the anticipated sample size of 600.


Table B.2.b-1. Precision Expected for Estimated Percentages Based on Different Sample Size Scenarios: Standard Errors and 95% Confidence Intervals (School Principal Sample, DEFF=1.6)

Estimated proportion | N=400 standard error | N=400 95% CI half-width | N=600 standard error | N=600 95% CI half-width
5% | 1.4% | 2.7% | 1.1% | 2.2%
10% | 1.9% | 3.7% | 1.5% | 3.0%
15% | 2.3% | 4.4% | 1.8% | 3.6%
20% | 2.5% | 5.0% | 2.1% | 4.0%
50% | 3.2% | 6.2% | 2.6% | 5.0%
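The entries in Table B.2.b-1 follow from the usual design-effect adjustment: SE = sqrt(DEFF x p(1-p)/n) and half-width = 1.96 x SE. The short sketch below, using the DEFF of 1.6 stated above, should closely reproduce the tabled values (small differences reflect rounding).

    import math

    def precision(p, n, deff=1.6, z=1.96):
        """Design-adjusted standard error and 95% confidence-interval half-width
        for an estimated proportion."""
        se = math.sqrt(deff * p * (1.0 - p) / n)
        return se, z * se

    for p in (0.05, 0.10, 0.15, 0.20, 0.50):
        for n in (400, 600):
            se, half = precision(p, n)
            print("p=%.0f%%  n=%d: SE=%.1f%%, CI half-width=%.1f%%"
                  % (100 * p, n, 100 * se, 100 * half))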


For the student data collection, we anticipate a 75% mean participation rate across conditions, resulting in participation by approximately 6,000 students out of 8,000 selected students. The anticipated sample size for the student data collection will be sufficient to support all required comparisons of prevalence rates across conditions at precision levels of 0.05 at the 95% confidence level. Similarly, sample sizes will also support comparisons across experimental groups of the other outcomes of interest, primarily participation rates.


Table B.2.b-2 shows the precision expected for subgroup differences in prevalence rates. In this context, the subgroups are the sets of students assigned to each of the four different conditions, each with n=1,500 students. This table presents differences between two subgroup percentages, P(1) and P(2), assuming a design effect (DEFF) equal to 2.0 to take into account the expected clustering effects of students within classes.


Table B.2.b-2. Precision Expected for Estimated Differences between Percentages (DEFF=2)

P(1) \ P(2) | 10% | 20% | 30% | 40% | 50%
10% | 1.0% | 1.2% | 1.3% | 1.4% | 1.4%
20% | 1.2% | 1.3% | 1.4% | 1.5% | 1.5%
30% | 1.3% | 1.4% | 1.5% | 1.6% | 1.6%
40% | 1.4% | 1.5% | 1.6% | 1.6% | 1.6%
50% | 1.4% | 1.5% | 1.6% | 1.6% | 1.7%
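For a difference between prevalence estimates from two experimental groups, a design-adjusted standard error can be computed as sqrt(DEFF x [p1(1-p1)/n1 + p2(1-p2)/n2]). The sketch below illustrates this calculation only; the per-group sample sizes are assumptions made for the example, and the exact inputs behind Table B.2.b-2 (for instance, whether several conditions are pooled for a given comparison) are not restated here.

    import math

    def se_difference(p1, p2, n1, n2, deff=2.0):
        """Design-adjusted standard error of the difference between two
        independently estimated proportions."""
        return math.sqrt(deff * (p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2))

    # Hypothetical comparison of two single conditions with roughly 1,500
    # participating students each.
    print(round(se_difference(0.20, 0.30, 1500, 1500), 4))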


B.2.c Estimation and Statistical Testing Procedures


The principal data collection will generate national weighted estimates for school principals. Weighting procedures are detailed in Appendix N and summarized below. The base weight for each sampled principal will be equal to the inverse of his/her probability of selection. Final weights will reflect the probability of selection and non-response adjustments; these weights will be appropriate for national estimates and estimates within strata.
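A minimal sketch of the weighting logic described above is given below: the base weight is the inverse of each principal's selection probability, and a simple weighting-class adjustment inflates respondents' weights within each stratum to account for non-response. The record layout and the choice of strata as adjustment cells are illustrative assumptions; the full procedure is specified in Appendix N.

    from collections import defaultdict

    def final_weights(sample):
        """sample: list of dicts with keys 'id', 'stratum', 'selection_prob', 'responded'.
        Returns non-response-adjusted weights for responding principals."""
        base = [(rec, 1.0 / rec["selection_prob"]) for rec in sample]

        # Adjustment factor per stratum: sum of all base weights divided by
        # the sum of respondents' base weights.
        total_w, resp_w = defaultdict(float), defaultdict(float)
        for rec, w in base:
            total_w[rec["stratum"]] += w
            if rec["responded"]:
                resp_w[rec["stratum"]] += w

        return {
            rec["id"]: w * total_w[rec["stratum"]] / resp_w[rec["stratum"]]
            for rec, w in base if rec["responded"]
        }

    # Tiny hypothetical example with two strata.
    sample = [
        {"id": "p1", "stratum": "urban", "selection_prob": 0.02, "responded": True},
        {"id": "p2", "stratum": "urban", "selection_prob": 0.02, "responded": False},
        {"id": "p3", "stratum": "rural", "selection_prob": 0.01, "responded": True},
    ]
    print(final_weights(sample))   # {'p1': 100.0, 'p3': 100.0}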


The analytic focus of the student data collection is the comparison of the effects across the four study conditions. The comparisons to be performed include:


  1. risk behavior prevalence rates by condition;

  2. student participation rates by condition;

  3. risk behavior prevalence rates by group administration (conditions 1, 2, and 3) versus individual administration (condition 4);

  4. risk behavior prevalence rates by use of skip patterns (condition 3) versus no skip patterns (conditions 1, 2, and 4);

  5. questionnaire completion rates by web-based conditions (conditions 2, 3, and 4) versus the paper-and-pencil condition (condition 1).


The estimation process for both the principal and student data collections will use statistical software developed for analyses of survey data arising from complex sampling designs (e.g., SUDAAN). These estimation procedures will appropriately account for the effects of non-response, unequal probability sampling, stratification, and clustering.
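Design-based packages such as SUDAAN typically use Taylor-series linearization with a with-replacement approximation at the first stage of sampling. The sketch below shows the core of that calculation for a weighted prevalence estimate and its standard error; the record layout, the single-stage treatment of PSUs, and the example data are assumptions made for illustration, not the study's actual analysis code.

    from collections import defaultdict
    import math

    def weighted_prevalence_and_se(records):
        """records: list of dicts with keys 'stratum', 'psu', 'weight', 'y' (0/1).
        Returns the weighted prevalence and its linearized standard error,
        treating PSUs as sampled with replacement within strata."""
        W = sum(r["weight"] for r in records)
        p_hat = sum(r["weight"] * r["y"] for r in records) / W

        # PSU totals of the linearized values z_i = w_i * (y_i - p_hat) / W
        psu_totals = defaultdict(float)
        for r in records:
            psu_totals[(r["stratum"], r["psu"])] += r["weight"] * (r["y"] - p_hat) / W

        # Between-PSU variation within each stratum
        by_stratum = defaultdict(list)
        for (stratum, _), z in psu_totals.items():
            by_stratum[stratum].append(z)

        var = 0.0
        for z_list in by_stratum.values():
            n_h = len(z_list)
            if n_h < 2:
                continue  # single-PSU strata need special handling; omitted here
            z_bar = sum(z_list) / n_h
            var += n_h / (n_h - 1) * sum((z - z_bar) ** 2 for z in z_list)
        return p_hat, math.sqrt(var)

    # Hypothetical records: two strata, two PSUs each.
    data = [
        {"stratum": "A", "psu": 1, "weight": 10, "y": 1},
        {"stratum": "A", "psu": 1, "weight": 12, "y": 0},
        {"stratum": "A", "psu": 2, "weight": 11, "y": 1},
        {"stratum": "B", "psu": 3, "weight": 9,  "y": 0},
        {"stratum": "B", "psu": 4, "weight": 10, "y": 1},
    ]
    print(weighted_prevalence_and_se(data))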


B.2.d Use of Less Frequent than Annual Data Collection


This study will be conducted once. Respondents will be asked to respond only once.


B.2.e Survey Instrument

The principal data collection questionnaire, the "Principal Survey of the Feasibility and Acceptability of Web-based Student Assessments and Surveys" (Appendix D), contains 22 items and was developed specifically to complement and extend the findings from the student data collection. The questionnaire can be divided into four sections. The first nine questions assess the principal's years of experience and the school's computer resources and internet capability. Two questions assess the preferred mode of data collection for student assessments and surveys. Seven questions assess perceived benefits of and barriers to online data collection compared with paper-and-pencil data collection. Four questions assess whether online data collection methods have ever been used at the school and the extent to which problems with online data collection occurred. All questions are in either multiple-choice or fill-in-the-blank format. Principals will be offered the option of responding either on optically scannable questionnaire booklets or via a web-based questionnaire.


The student data collection questionnaire, the "Student Health Survey" (Appendices E1 and E2), contains 92 questions. The version of the instrument with skip patterns is located in Appendix E2. The instrument can be divided into nine sections. The first five questions are demographic items. Most of the remaining questions address health-risk behaviors in six topic areas: unintentional injuries and violence; tobacco use; alcohol and other drug use; sexual behaviors that contribute to HIV infection, other sexually transmitted diseases, and unintended pregnancies; unhealthy dietary behaviors; and physical inactivity. Two questions assess student absenteeism. The final section consists of thirteen questions assessing physical aspects of privacy, perceived privacy and anonymity, and experience with computers, and four questions assessing the setting in which the questionnaire was completed. All questions are in a multiple-choice format, and the questionnaire will be administered either as an 8-page optically scannable booklet or as a web-based questionnaire completed on an internet-connected computer.


B.2.f Data Collection Procedures


The principal data collection questionnaire, the "Principal Survey of the Feasibility and Acceptability of Web-based Student Assessments and Surveys" (Appendix D), will be administered using a mixed paper/web-based mode of data collection. A mixed mode is designed to accommodate respondents' preferences for either a web-based or paper format. An invitation letter (Appendix F1) will be mailed to principals. The invitation letter offers principals the option of responding via the web or using a paper questionnaire enclosed with the invitation, and its body provides a unique identifying number for accessing the web-based questionnaire. The mailing also includes the paper questionnaire (Appendix D), a consent form (Appendix F2), and a business reply envelope for returning the completed questionnaire. Approximately one week after the mailed questionnaire arrives, we will contact via e-mail all principals whose e-mail addresses we are able to obtain. The e-mail will thank those who have responded, transmit the unique identifying number in the body of the e-mail, and again convey that principals may respond using the mode of their choice, either via the web or on paper. A toll-free number will be made available to principals who have questions about the study, need additional information, or encounter technical problems in completing the web-based questionnaire.


We will track who has and has not responded to the principal data collection by monitoring the unique identifying numbers, which appear on returned paper questionnaires and are used to access the web-based questionnaire. Principals who break off before completing the web-based questionnaire may re-access it using their original unique identifying number; when they do, they will be returned to the point at which they broke off. By monitoring returns of both paper and web-based questionnaires, we will identify principals who need a reminder.


The student data collection questionnaire, the "Student Health Survey" (Appendices E1 and E2), will be administered by a small staff of professional data collectors specially trained for this study. The data collectors will have direct responsibility for administering the questionnaire to students in three of the four conditions; in the fourth condition (the "on your own" condition), they will meet students in the classroom, provide participating students with their unique identifying numbers, explain how to access the questionnaire, encourage participation, and return to notify students of their class's participation rate, but will not administer the questionnaire. Data collectors will have at their disposal a questionnaire administration guide (Appendix G6) to be followed for each of the four study conditions: 1) paper-and-pencil questionnaire in regular classroom, 2) web-based questionnaire in computer lab without programmed skip patterns, 3) web-based questionnaire in computer lab with programmed skip patterns, and 4) web-based questionnaire completed "on their own" at any computer of the student's choosing. The questionnaire administration guide details for the data collector the steps to be followed in administering the questionnaire for each condition, including the instructions to be read to students before they begin the questionnaire. For group administrations, except when required by law or prevailing local practice, the teacher will not be in the room during questionnaire administration but will remain nearby (e.g., in the hall) in case discipline problems arise. The only direct responsibility of teachers in data collection is to distribute and follow up on parental permission forms prior to the scheduled date for data collection in the school. Teachers will be asked to remain outside the room where data collection takes place to increase the candor and comfort level of students. However, for administrations in the computer lab, whoever would normally be present to facilitate web-based testing or assessment and to address technology problems will be asked to remain present. In general, our data collection procedures have been designed to ensure that:


  • Everyday school activity schedules are disrupted minimally.

  • Administrative burden placed on teachers is minimal.

  • Parents give informed permission for their child to participate in the student data collection.

  • Anonymity of student participation and a continued sense of privacy are maintained, with no punitive actions against nonparticipants.

  • Alternative activities are provided for school-based nonparticipants.

  • Control over the quality of data is maintained.


The Data Collection Checklist (Appendix J) is completed by teachers to track which students have received parental permission to participate in the data collection. The Data Collection Checklist is given to the study data collector on the day of questionnaire administration. Following data collection, the Data Collection Checklist is destroyed.


B.2.g Obtaining Access to and Support from Schools


All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Howell Wechsler, Ed.D., MPH, Director, DASH, NCCDPHP at CDC. The procedures for gaining access to schools for the student data collection will have three major steps:


  • Contact representatives of selected state education agencies (SEAs) to identify prospective schools. Obtain names of school districts in which schools are located and names of supportive school district contacts. Request that the state notify the school districts that they may anticipate being contacted about the study.


  • Once identified by the SEAs, invite school districts in which selected schools are located to participate in the study. Obtain written approval for participation at the district level. Request that the school district notify schools that they may anticipate being contacted about the study. Request general guidance on working with the selected schools.


  • Once cleared at the school district level, invite selected schools to participate. Work with a school administrator (e.g., principal or secretary) who serves as the school contact for the data collection to verify information previously obtained about the school. Present the burden and benefits of participation in the study. Confirm that interested schools have the technology needed to support the two group-administered, web-based conditions intended for computer labs. After a school meeting the technology requirements agrees to participate, come to agreement on a required subject from which to select classes, obtain lists of class sections, select classes, and agree upon an approximate timeframe for initiating data collection at the school. Obtain written approval for participation at the school level. Ensure that parental permission forms reach the school and selected classes well in advance of when needed. Maintain contact with schools until all data collection activities have been completed.


Scripts to guide discussions with school-level contacts (Student Questionnaire School Recruitment Script, Appendix H) are provided. Within each school, the school administrator (e.g., principal or secretary) who serves as the school contact for the study will receive a letter of invitation to participate in the student questionnaire (School Letter of Invitation and Fact Sheet, Appendix I1). Once a school agrees to allow their students to participate, a letter to agreeing schools (Appendix I2) will be sent to thank them and to provide more information about the study. Teachers of selected classrooms will receive a letter (Letter to Teachers in Participating Schools, Appendices K1 & K2) thanking them for allowing their class to participate, providing information about the study, and giving instructions on distributing and tracking the return of parental permission forms.


B.2.h Informed Consent


For the principal data collection, a consent form (Appendix F2) will be included with the invitation letter (Appendix F1) that will be mailed to selected principals. The consent form informs the principal of the voluntary nature of the data collection and, like most consent forms in low-risk studies involving adults, indicates that the principal’s act of completing the questionnaire connotes his or her consent. The principal is asked to keep the consent form for future use in case he or she has any follow-up questions for CDC or the contractor.


For the student data collection, parental permission will be obtained using the parental permission form (Appendices G2 and G3). The parental permission form informs both the student and the parent about an important activity in which the student has the opportunity to participate; by providing adequate information about the activity, it ensures that permission will be informed. In compliance with requirements of the No Child Left Behind Act, the parental permission form specifies that questionnaires will be available for review by parents at all schools. The parental permission forms will be made available in both English and Spanish.


B.2.i Quality Control


The task of collecting quality data begins with a clear and explicit study protocol and ends with procedures for the coding, entry, and verification of collected data. In between these activities, and subsequent to data collector training, measures must be taken to reinforce training, to assist data collectors who run into trouble, and to check on data collection techniques. Because the ultimate aim is production of a high quality database and reports, various quality assurance activities will be applied during the data collection phase. Table B.2.I lists the major means of quality control.

Table B.2.I. Major Means of Quality Control


Study Step

Quality Control Procedures

Questionnaire Programming

  • Conduct internal programming review of web-based questionnaire to ensure accuracy (100 percent)

  • Conduct pretest of web-based questionnaire to ensure appropriate capture of data (100 percent)


Study Preparation Procedures

  • Conduct practice questions with each participant prior to web-based questionnaire administration to ensure each participant understands how to use a computer and enter information into the questionnaire (100 percent)

  • Provide each participant with a “cheat sheet” for navigating through the web-based questionnaire (100 percent)


Protocol Validation

  • Interview school contact to ensure data collectors had proper demeanor (10 percent)

  • Monitor data collectors to ensure study is conducted according to protocol. Conduct refresher training for data collectors who experience difficulties (5 percent)


Receipt Control

  • Verify that collected data are submitted in a timely fashion and stored in a secure, non-network location (100 percent)


Data Control

  • Examine and flag cases that reflect extensive refusal and/or misinformation supplied by the respondent (100 percent)



B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE


B.3.a Expected Response Rates


For the principal data collection, we anticipate a participation rate of 80% based on experience with similar studies. For the student data collection, we anticipate a student participation rate of 75%. The mean student participation rate for the previous National YRBSs (OMB number 0920-0493; expiration 11/07) conducted over the past decade and for the methodological studies conducted in 2000 and 2002 was 87%. For the student data collection, we expect a slightly lower mean student response rate because of the inclusion of the "on your own" condition, which likely will depress levels of student participation. Because this is a feasibility study, equal effort will be made across the four conditions to induce parental permission and student participation. Therefore, if one or more conditions result in response rates that fall short of expectations and are significantly lower than those experienced in other conditions, this will not represent a failure in execution of the study design. To the contrary, it will provide important data for assessing the feasibility of administering web-based surveys of risk behaviors among high school students across a range of conditions.

B.3.b Methods for Maximizing Response and Handling Non-Response


Several methods will be used to maximize responses to the principal and student questionnaires.


To minimize refusals in the principal data collection, we will emphasize the importance of the study. All participating principals will be promised and sent a copy of the published study results. In addition, we will work with constituency groups such as the National Association of Secondary School Principals to convey the importance of the study and recommend participation. The questionnaire will be made available both on paper and online, with the online questionnaire accessible using a unique identifying number. We will convey the invitation to principals initially on paper and then via e-mail. Reminders will be sent to all principals both by mail and e-mail, without regard to the status of questionnaire return. Returns will be monitored by recording receipt of responses by mail and online. A second round of reminders will be sent to those who have not responded to previous invitations. Those not responding to the second reminder will be called by phone to confirm that the invitation has been received and that there are no barriers to response. We anticipate that in most cases such principals will have delegated the questionnaire to another administrator and will not have realized that the questionnaire had not been submitted. The letter of invitation (Appendix F1) and consent form (Appendix F2) will provide telephone numbers at CDC and at the contractor's offices that principals may call to have questions answered or obtain assistance in responding. A toll-free line will be available to provide support with online responses. In addition, a $50 bookstore gift certificate will be offered as an incentive to all principals who complete the questionnaire. No punitive action will be taken against nonconsenting principals. Nonconsenting principals will not be replaced. Data will be analyzed to determine if principal nonresponse introduces any biases.

For the student data collection, all participating schools will receive a $500 incentive for participation and will be promised and sent a copy of the published study results. Student nonresponse may occur if a parent refuses permission for their child to participate or if the student refuses to participate. Refusals by students or parents are expected to be minimal because they have historically been minimal in similar studies. However, procedures to minimize refusals will be recommended to schools, including advertising the study through the principal's newsletter, PTA meetings, and other established means of communication. Parental permission forms will be provided in English (Appendix G2) and Spanish (Appendix G3). Parental permission form reminder notices (Appendices G4 & G5), with a second copy of the permission form, will be sent to parents who have not returned parental permission forms within an agreed upon time period (e.g., 3 days). The permission form will provide telephone numbers at CDC and at the contractor's offices that parents may call to have questions answered before agreeing to give permission for their child's participation. Data collectors will be available on location to answer questions from parents who remain uncertain about granting permission. With the cooperation of schools, data collectors will make telephone calls to parents who have expressed concerns or have questions. A toll-free hotline will be available to students who have questions about the study or who encounter technical difficulties completing the web-based questionnaire on their own. Make-up sessions will be held for eligible students in the three group administration conditions who were absent at the original administration, had not yet obtained parental permission, or who for other reasons were unable to complete the questionnaire at the original administration (Make-up List and Instructions, Appendix K3). No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.


B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN


The principal data collection questionnaire was developed specifically for the 2008 methodological study. The contractor conducted a limited pretest of a paper version of the principal questionnaire in Prince George's County, Maryland, in spring 2007. The pretest, conducted face-to-face and by telephone, involved nine principals regarded as diverse in terms of metropolitan status and community socioeconomic characteristics. The pretest resulted in improvements in both the clarity and user relevance of questions and a slight reduction in burden. All nine principals completed the questionnaire in less than the estimated 25 minutes.


The questionnaire to be used for the student data collection (the "Student Health Survey," Appendices E1 and E2) is similar to the National YRBS questionnaire (OMB No.: 6834, expiration 11/07), which has been used extensively in ten prior national school-based surveys approved by OMB, and to the questionnaire from the 2004 methodological study (OMB No.: 0920-0611, expiration 12/04). Based on previous experience, students will be able to complete the questionnaire in less than the estimated 45 minutes.

B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA


B.5.a Statistical Review


Statistical aspects of the study have been reviewed by the individuals listed below:


  • Maxine Denniston, MSPH

Division of Adolescent and School Health

Centers for Disease Control and Prevention

4770 Buford Hwy., NE

Atlanta, Georgia 30341

(770) 488-6212

[email protected]


  • Ronaldo Iachan, PhD

Macro International Inc.

11785 Beltsville Drive, Suite 300

Calverton, MD 20705

(301) 572-0538

[email protected]


B.5.b Agency Responsibility


Within the agency, the following individuals will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:

Nancy Brener, PhD

Division of Adolescent and School Health

Centers for Disease Control and Prevention

4770 Buford Hwy., NE

Atlanta, Georgia 30341

(770) 488-6184

[email protected]


Danice Eaton, PhD

Division of Adolescent and School Health

Centers for Disease Control and Prevention

4770 Buford Hwy., NE

Atlanta, Georgia 30341

(770) 488-6143

[email protected]


B.5.c Responsibility for Data Collection


The representative of the contractor responsible for conducting the planned data collection is:


James G. Ross

Macro International Inc.

11785 Beltsville Drive

Calverton, Maryland 20705

(301) 572-0208

[email protected]






