B. Statistical Methods
B.1 Respondent Universe and Sampling Methods
The sample universe includes all 2- and 4-year colleges in the United States and its territories. The survey will be administered to a sample of 1,400 2- and 4-year schools, allocated in proportion to their numbers in the population. With this sample, we expect to obtain a response rate of at least 75 percent. Similar surveys administered in 1998, 2000, and 2005 achieved response rates of 76 percent, 74 percent, and 62 percent, respectively. The 2005 response rate was lower for several reasons. It was the first survey in the series conducted with a Web-based survey tool, and some administrators sought answers from multiple personnel on campus. Once respondents began the survey, they had difficulty re-entering it, which likely caused some to abandon it. (Note: Macro International will simplify re-entry procedures for the 2007 administration; see section B.3 below.)
A 75 percent response rate is an achievable goal and would yield approximately 1,050 respondents. As an example of a dataset that could serve as a sampling frame, a recent count from the Integrated Postsecondary Education Data System (IPEDS) identified 2,703 4-year schools and 2,239 2-year schools, for a total of 4,942 schools. Proportional allocation would yield an initial sample of approximately 766 4-year schools and 634 2-year schools. Assuming a 75 percent response rate, that amounts to 574 4-year schools and 476 2-year schools.
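The allocation arithmetic above can be reproduced directly from the IPEDS counts. The following minimal sketch (Python) splits the sample of 1,400 proportionally and applies the 75 percent response rate; the largest-remainder rounding rule is an assumption, since the text does not name a specific rounding procedure.

```python
# Proportional allocation of a 1,400-school sample across the two levels,
# based on the IPEDS frame counts cited above. The largest-remainder rounding
# rule is an assumption; the supporting statement does not specify one.
frame = {"4-year": 2703, "2-year": 2239}     # IPEDS counts
total_frame = sum(frame.values())            # 4,942 schools
sample_size = 1400
response_rate = 0.75

# Exact proportional shares, then round while preserving the total.
exact = {k: sample_size * n / total_frame for k, n in frame.items()}
alloc = {k: int(v) for k, v in exact.items()}
shortfall = sample_size - sum(alloc.values())
for k in sorted(exact, key=lambda k: exact[k] - int(exact[k]), reverse=True)[:shortfall]:
    alloc[k] += 1

for level, n in alloc.items():
    print(f"{level}: sample {n}, expected respondents {round(n * response_rate)}")
# Yields approximately 766 4-year and 634 2-year schools sampled,
# with about 574 and 476 expected respondents, matching the figures above.
```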
It is desirable to achieve geographical diversity in the sample, as well as proportional representation of 2- and 4-year schools. To achieve this, the sampling frame will be sorted by the following variables before the sample is selected:
1) Levels (2-year schools versus 4-year schools)
2) Region within level, reversing the order within level
3) States within region (states sorted in random order, reversing the order in every subsequent region)
4) Enrollment size, reversing the order every subsequent state
The practice of reversing the order (known as serpentine sorting) ensures that schools adjacent in the sorted list are similar on these variables, so that systematic selection spreads the sample across levels, regions, states, and enrollment sizes. To select the sample, survey administrators draw a random starting number f0 between 0 and 1 and number the schools in sorted order from 1 to 4,942. For k = 1, ..., 4,942, compute fk = f0 + k(1,400/4,942) and let sk = Int(fk) - Int(fk-1), where Int(x) denotes the largest integer not exceeding x. Each sk is 0 or 1, the sk sum to exactly 1,400, and the k-th school in the sorted list is selected whenever sk = 1.
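The selection rule above is fractional-interval systematic sampling applied to the serpentine-sorted frame. The sketch below (Python; the function name and seed parameter are illustrative, not part of the survey's documented procedure) applies the fk/sk rule and confirms that exactly 1,400 positions are selected.

```python
import math
import random

def systematic_sample(n_frame, n_sample, seed=None):
    """Fractional-interval systematic selection.

    Returns the 1-based positions (in the serpentine-sorted frame) of the
    selected schools, using the rule sk = Int(fk) - Int(fk-1) described above.
    """
    rng = random.Random(seed)
    f0 = rng.random()                        # random start between 0 and 1
    step = n_sample / n_frame                # 1,400 / 4,942
    selected = []
    prev = math.floor(f0)
    for k in range(1, n_frame + 1):
        fk = f0 + k * step
        if math.floor(fk) - prev == 1:       # sk = 1 -> school k is sampled
            selected.append(k)
        prev = math.floor(fk)
    return selected

# Applied to the sorted frame of 4,942 schools:
positions = systematic_sample(n_frame=4942, n_sample=1400, seed=0)
assert len(positions) == 1400                # the sk sum to exactly 1,400
```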
B.2 Information Collection Procedures
With an expected 75 percent response rate, the percentage of schools with any given characteristic can be estimated with a 95 percent confidence interval of plus or minus 5 percentage points, both for the entire population of schools and for 2-year or 4-year schools separately. In addition, percentages for the two groups combined can be estimated with a 90 percent confidence interval of plus or minus 3 percentage points.
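These interval widths follow from the standard normal-approximation margin of error for a proportion, evaluated at its most conservative value, p = 0.5. The sketch below (Python with SciPy) is a rough check using the respondent counts from section B.1 (about 1,050 combined and about 476 in the smaller level-specific group); it is illustrative, not the project's formal precision analysis.

```python
from math import sqrt
from scipy.stats import norm

def margin_of_error(n, p=0.5, confidence=0.95):
    """Half-width of a normal-approximation confidence interval for a proportion."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return z * sqrt(p * (1 - p) / n)

# Combined sample of ~1,050 respondents at 90 percent confidence.
print(round(100 * margin_of_error(1050, confidence=0.90), 1))  # about 2.5 points

# Smaller level-specific group (~476 2-year respondents) at 95 percent confidence.
print(round(100 * margin_of_error(476, confidence=0.95), 1))   # about 4.5 points
```

Both computed half-widths fall within the plus or minus 3 and plus or minus 5 percentage point bounds stated above.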
The sample size will be sufficient to detect differences in AODV prevention practices between 2-year and 4-year schools. For example, a two-sample arcsine proportion test will detect a difference of 0.10 in proportions with power of approximately .89 at alpha = .05. If one group has a proportion of 45 percent and the other 55 percent, a two-sided arcsine test will detect a statistically significant difference at the .05 level about 90 percent of the time.
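The stated power can be checked with Cohen's effect size h for two proportions (h = 2 arcsin sqrt(p1) - 2 arcsin sqrt(p2)) and the normal approximation. The sketch below (Python with SciPy; the group sizes of 574 and 476 are the expected respondent counts from section B.1) is a rough reconstruction, not the project's actual power analysis.

```python
from math import asin, sqrt
from scipy.stats import norm

def arcsine_power(p1, p2, n1, n2, alpha=0.05):
    """Approximate power of a two-sided, two-sample arcsine (Cohen's h) test."""
    h = 2 * asin(sqrt(p1)) - 2 * asin(sqrt(p2))      # Cohen's effect size h
    se_factor = sqrt(1 / n1 + 1 / n2)                # unequal-n standard error term
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(abs(h) / se_factor - z_crit)

# Proportions of 0.55 vs 0.45 with the expected 4-year and 2-year respondent counts.
print(round(arcsine_power(0.55, 0.45, n1=574, n2=476), 2))   # roughly 0.90
```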
Periodic data collection cycles will not be used because the data need to be collected within the same time period.
B.3 Methods to Maximize Response Rates
The use of a Web-based survey instrument will help maximize response rates and address issues of non-response. Given the professional level of the potential respondents, easy access to a survey that does not require the additional steps of completing a paper form, sealing it in an envelope, and mailing it will help increase response rates.
Follow-up activities that have been successful in maximizing the response rate will be used again, and e-mail notification will be added as an approach to reaching non-respondents. At the start, senior administrators at the sampled institutions will receive a letter notifying them of the survey; the e-mail address on file will be merged into the letter, with a request to provide a correction if that address is incorrect. A toll-free telephone number and an e-mail address will be made available for these corrections. The survey will then be launched via e-mail, and participants will be provided with a hyperlink, an Institution ID, and a password.
A reminder notice to non-respondents will be e-mailed approximately two weeks after the survey is launched, and a second reminder will be e-mailed 10-14 days later. For institutions where the e-mail address bounces back, we will send postcards via ground mail that again request correct e-mail addresses. Non-respondents will be contacted via telephone approximately 4 weeks after the survey is launched. The follow-up calls will be continued until an adequate response rate has been achieved. Attachment E includes the text that will be used for the letters and e-mails.
Although the response rate in 2005 was lower than expected (62 percent), Macro International is using a new Web-based software system and anticipates that some of the administration problems encountered in 2005 will no longer occur. For the 2007 administration, Macro International will use the SPSS Dimensions software system, which has the following advantages:
More accessible: respondents will be able to return to responses entered during an earlier session.
More customizable: survey administrators can tailor the appearance of the instrument to appeal to the target audience.
More flexible: response categories can easily be filtered based on previous responses, which lowers respondent burden, and previous responses can be included in question text.
More powerful: the programming environment allows faster development and easier debugging.
More integrated: full integration with other SPSS products streamlines data processing and improves real-time reporting on progress.
While past response rates for earlier versions of this survey have been respectable (76 percent in 1998, 74 percent in 2000, and 62 percent in 2005), if the expected response rate is not achieved despite diligent efforts to maximize it, the analysis will include a weighting adjustment for non-response.
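A weighting-class adjustment is one common way to implement such a correction. The sketch below (Python) is illustrative only: the use of school level as the adjustment cell, the record layout, and the example figures are assumptions, not the project's specified adjustment cells.

```python
# Weighting-class non-response adjustment: within each adjustment cell
# (here, school level, an illustrative choice), respondents' base weights are
# inflated so they also represent the non-respondents in their cell.
from collections import defaultdict

def adjust_for_nonresponse(sampled):
    """sampled: list of dicts with 'level', 'base_weight', and 'responded' keys."""
    weight_sums = defaultdict(lambda: {"all": 0.0, "resp": 0.0})
    for s in sampled:
        weight_sums[s["level"]]["all"] += s["base_weight"]
        if s["responded"]:
            weight_sums[s["level"]]["resp"] += s["base_weight"]

    adjusted = []
    for s in sampled:
        if s["responded"]:
            cell = weight_sums[s["level"]]
            factor = cell["all"] / cell["resp"]   # inverse of the cell's response rate
            adjusted.append({**s, "final_weight": s["base_weight"] * factor})
    return adjusted

# Example (illustrative): with a base weight of about 4,942/1,400 = 3.53 and a
# 75 percent weighted response rate in its cell, a responding school would
# receive a final weight of about 4.71.
```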
B.4 Tests of Procedures
This item does not apply because this survey does not include any tests of procedures or methods.
B.5 Statistical Consultants
Consultants for Statistical Aspects of the Design
Pedro Saavedra, Ph.D.
Statistician
Macro International
11785 Beltsville Drive
Calverton, MD 20740
301/572-0273
Data Collection and Analysis
Kate Rohrbaugh, M.P.S.
Senior Project Manager
Macro International
11785 Beltsville Drive
Calverton, MD 20740
301/572-0288
Gloria DiFulvio, Ph.D.
Senior Evaluator
Higher Education Center for Alcohol and Other Drug Abuse and Violence Prevention
c/o Education Development Center, Inc.
55 Chapel Street
Newton, MA 02458
413/545-2523
Virginia Mackay-Smith, S.M.
Director
Higher Education Center for Alcohol and Other Drug Abuse and Violence Prevention
c/o Education Development Center, Inc.
55 Chapel Street
Newton, MA 02458
617/618-2816
Government Program Officer
Richard Lucey, Jr.
Education Program Specialist
U.S. Department of Education
Office of Safe and Drug-Free Schools
400 Maryland Avenue, SW - 3E335
Washington, DC 20202-6450
202/205-5471