Sexually Transmitted Infection Services at U.S. Colleges and Universities
OMB # 0920-xxxx
New Information Collection Request
Supporting Statement
Part B
March 21, 2014
Submitted By:
Melissa Habel, MPH, Health Scientist
Division of STD Prevention, Centers for Disease Control and Prevention
National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention
1600 Clifton Road Mailstop E44, Atlanta, GA 30333
Phone: 404-639-6462  Fax: 404-639-8622
TABLE OF CONTENTS
Section B
B. Collections of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Non-response
4. Test of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Sexually Transmitted Infection Services at U.S. Colleges and Universities
The following describes the data collection procedures.
B.1 Respondent Universe and Sampling Methods
The respondent universe is drawn from the Integrated Postsecondary Education Data System (IPEDS), a comprehensive census of institutions whose primary purpose is to provide postsecondary education. We created our sampling frame from the IPEDS survey of all academic, vocational, and continuing professional education institutions/programs in the U.S., the District of Columbia, and outlying areas. We used the 2011 Institutional Characteristics component, one of the nine IPEDS components, to create a master list of schools. The institutional characteristics include name and address, levels of degrees and awards, control or funding (i.e., public vs. private), type (i.e., 2- vs. 4-year), room and board, etc. In addition to the Institutional Characteristics component, data for this project were also obtained from the IPEDS Enrollment component, which provided the number of students enrolled, level of study, gender, and race/ethnicity.
We included only active, accredited, degree-granting, 2- or 4-year public or private schools located in the 50 states or the District of Columbia that enrolled at least 500 undergraduate and/or graduate students. The resulting respondent universe totaled 2,753 schools, from which we drew a stratified random sample of 885 colleges and universities to survey.
The sampling frame of 2,753 schools was stratified by institution size, defined by enrollment category: 501–1,000; 1,001–2,000; 2,001–8,000; 8,001–16,000; and ≥16,001 students. We oversampled larger schools because they are more likely to have a health center. We also oversampled smaller schools because one purpose of this survey is to describe alternative arrangements for providing health care services and referral systems at schools unlikely to have a student health center. The proportions sampled were therefore not equal across strata; however, within each stratum, schools were selected randomly and with equal probability. Sampling rates across strata ranged from 0.24 to 0.50. This non-proportional stratified sampling was performed with the complex sampling module of SPSS version 20. The design produces sampling weights that can subsequently be used in analysis to generate unbiased estimates for all U.S. colleges and universities meeting our inclusion criteria (active, accredited, degree-granting, 2- or 4-year public or private schools in the 50 states or the District of Columbia enrolling at least 500 undergraduate and/or graduate students).
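To illustrate the design, the sketch below shows how a non-proportional stratified sample and its base weights (the inverse of each stratum's sampling rate) could be drawn in Python with pandas. This is an illustration only: the file name, column names, and per-stratum rates shown are placeholders, and the actual selection was performed with the SPSS version 20 complex sampling module as described above.

```python
import pandas as pd

# Hypothetical frame: one row per eligible school with an enrollment-based
# stratum label. File and column names are placeholders, not actual IPEDS fields.
frame = pd.read_csv("ipeds_frame_2011.csv")

# Illustrative per-stratum sampling rates; the actual rates ranged from 0.24 to 0.50.
rates = {"501-1000": 0.50, "1001-2000": 0.40, "2001-8000": 0.24,
         "8001-16000": 0.30, "16001+": 0.45}

samples = []
for stratum, rate in rates.items():
    schools = frame[frame["stratum"] == stratum]
    n = round(len(schools) * rate)
    # Equal-probability simple random sample without replacement within the stratum.
    drawn = schools.sample(n=n, random_state=2014)
    drawn = drawn.assign(base_weight=len(schools) / n)  # inverse of the sampling rate
    samples.append(drawn)

sample = pd.concat(samples, ignore_index=True)
print(sample.groupby("stratum")["base_weight"].first())
```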
B.2 Procedures for the Collection of Information
B.2.1 Recruitment
The investigators will email an introductory letter inviting the contact person at each school to participate in the survey, noting that the questionnaire should be completed by the person with the most knowledge of, and access to information about, health services on campus (Attachment 3). After reading and agreeing to the terms outlined in the letter, the participant will click the included link to the self-administered electronic questionnaire (hosted on SurveyMonkey). Schools will have 3 weeks to respond to the survey; investigators will send reminders at 1.5 weeks, 3 days before closeout, and on the day of closeout. This period may be extended if necessary to achieve adequate power for analyses. Once all surveys are returned, two researchers will review the responses, contact schools about inconsistent or invalid responses, and make corrections as needed. Basic characteristics of each school (e.g., institution type, funding type, enrollment size, region) will be gathered from the IPEDS database.
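For illustration only, the response window and reminder schedule described above (a 3-week window, with reminders at 1.5 weeks, 3 days before closeout, and on the closeout date) could be computed as in the short Python sketch below; the launch date shown is an arbitrary example, not an actual study date.

```python
from datetime import date, timedelta

launch = date(2014, 4, 1)            # example launch date only
closeout = launch + timedelta(weeks=3)

reminders = [
    launch + timedelta(days=10),     # roughly 1.5 weeks after launch
    closeout - timedelta(days=3),    # 3 days before closeout
    closeout,                        # day of closeout
]

print("Survey closes:", closeout)
for reminder in reminders:
    print("Send reminder:", reminder)
```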
B.2.2 Screening and scheduling procedures
Once an individual opts in and clicks the link in the invitation email, the potential participant will be taken to the beginning of the survey.
B.2.3 Data collection methods
Individuals who agree to participate on the survey landing page will be able to access the survey. Data from completed surveys will then be compiled into a separate SPSS dataset for analysis.
The survey will be self-administered and accessible at any time of day during a designated period, so participants can complete it on their own time and in private. Upon initial log-in, potential participants who indicate willingness to participate will be given general information about the survey, the topics it covers, and the potential risks of participation. Once participants indicate their consent to participate, they will proceed directly to the online survey (Attachment 4).
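As a minimal sketch of the compilation step noted in this subsection, the example below reads a hypothetical CSV export of completed SurveyMonkey responses with pandas and writes it to an SPSS (.sav) dataset using the pyreadstat package. The file names, the completion flag, and the use of pyreadstat are assumptions for illustration; the document does not specify how the SPSS dataset will be produced.

```python
import pandas as pd
import pyreadstat

# Hypothetical CSV export of survey responses; file and column names are placeholders.
responses = pd.read_csv("surveymonkey_export.csv")

# Keep only completed surveys (hypothetical "status" flag), then write the
# compiled dataset to an SPSS .sav file for weighted analysis.
completed = responses[responses["status"] == "complete"]
pyreadstat.write_sav(completed, "sti_services_survey.sav")
```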
B.3 Methods to Maximize Response Rates and Deal with Non-response
A list of contacts at each school will be compiled from school websites and from pre-existing contact lists for colleges and universities. The persons on this list are the most appropriate individuals to contact first and invite to participate in the survey. Participants will have 3 weeks to respond to the survey, and investigators will send reminders at 1.5 weeks, 3 days before closeout, and on the closeout date. Investigators will review responses, contact schools about inconsistent or invalid responses, and make corrections as needed. Analysis of survey data will be weighted to obtain representative estimates for U.S. colleges and universities, and non-response will be reflected in the weights used for the final analysis.
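One common way to reflect non-response in the final weights, consistent with the approach described above, is to divide each respondent's base design weight by the response rate observed in its stratum. The sketch below illustrates that adjustment under stated assumptions; the file name, column names, and the specific adjustment method are not taken from this document.

```python
import pandas as pd

# Hypothetical sampled-school file with base design weights and a 0/1 flag
# marking schools that completed the survey; all names are placeholders.
sample = pd.read_csv("sampled_schools.csv")

# Response rate within each enrollment stratum.
resp_rate = sample.groupby("stratum")["responded"].mean()

# Non-response-adjusted weight: base weight divided by the stratum response rate.
sample["final_weight"] = sample["base_weight"] / sample["stratum"].map(resp_rate)

# Respondents carry the full weight of their stratum; non-respondents are dropped.
respondents = sample[sample["responded"] == 1]
print(respondents.groupby("stratum")["final_weight"].sum())
```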
B.4 Test of Procedures or Methods to be Undertaken
This submission is a request for authorization to collect data using methodologies typical of project evaluation. Once the survey is programmed into SurveyMonkey, a mock invitation will be sent to the project team members listed in section B.5. The email invitation letter (Attachment 3) will be sent to these members as if they were the targeted participants, and they will provide feedback on the usability and content of the invitation letter (i.e., hyperlinks and pertinent information about the study). One team member will be instructed to opt out of taking the survey to simulate a potential respondent opting out; this will also verify SurveyMonkey's tracking of email invitations, as an opt-out receipt should be reflected in the survey system. The remaining team members will complete the survey as a normal participant would, while remaining mindful of informational errors (i.e., grammar and spelling) and, more importantly, design errors in the survey (i.e., skip patterns).
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Melissa Habel, MPH
1600 Clifton Rd. NE
Mailstop E-44
Atlanta, GA 30329
Telephone: (404) 639-6462
E-mail: [email protected]
Jeffrey Becasen, MPH, CPH
1600 Clifton Rd. NE
Mailstop E-44
Atlanta, GA 30329
Telephone: (404) 639-6460
E-mail: [email protected]
Patricia Dittus, PhD
1600 Clifton Rd. NE
Mailstop E-44
Atlanta, GA 30329
Telephone: (404) 639-8299
E-mail: [email protected]