Epidemiologic Study of Health Effects Associated With Low Pressure Events in Drinking Water Distribution Systems
OMB No. 0920-0960
Exp. Date: August 31, 2018
Request for OMB Approval of a Reinstatement Information Collection
Supporting Statement B
9/28/18
Contact:
Kathy Benedict, DVM, PhD
Epidemiologist, Waterborne Disease Prevention Branch
National Center for Emerging and Zoonotic Infectious Diseases
Centers for Disease Control and Prevention
Mailstop H24-9
1600 Clifton Rd. NE
Atlanta, GA 30333
E-mail: [email protected]
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Non-response
4. Tests of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
List of Reference Attachments*
Attachment A: Authorizing Legislation
Attachment B: 60-day Federal Register Notice
Attachment C: Pilot Evaluation Logical Framework
Attachment D: Advance Letter
Attachment E: Cover Letter – Paper
Attachment G: Consent Brochure
Attachment H: Household Survey – Paper
Attachment I: Household Survey – Web – Screen Shots
Attachment J: Thank You/Reminder Letter
Attachment K: Replacement Survey Cover Letter – Paper
Attachment M: Reminder Phone Script
Attachment N: Final Appeal Letter
Attachment O: IRB Approval Continuation
Attachment P: Low Pressure Event Form
Attachment Q: Utility Customer Information
*Attachments F and L are no longer used
Epidemiologic Study of Health Effects Associated With Low Pressure Events in Drinking Water Distribution Systems
B. Collections of Information Employing Statistical Methods
B.1. Respondent Universe and Sampling Methods
We are conducting a prospective cohort study among households that receive water from five participating water utilities, and we plan to recruit one to two additional utilities to provide data for this study. For each low pressure event (LPE), the water utility provides the study team with contact information for utility customers in an exposed and an unexposed area. From these utility lists, CDC randomly selects exposed and unexposed households to receive the survey, recruiting unexposed and exposed households in a 2:1 ratio per event. Sample size calculations are based on a design matched at the LPE level; the calculations therefore depend on the number of LPEs and the number of exposed and unexposed households per LPE. Based on the calculations outlined below, we aim to include 79 LPEs in the multi-site study and to obtain information from 3,160 households (1,053 exposed and 2,107 unexposed). With an anticipated 40% response rate, we will send surveys to approximately 7,900 households (3,160/0.40) during the study period. We do not contact persons younger than 18 years of age; household respondents provide information on exposures and illnesses for all household members. Assuming an average of two individuals per household, we expect data on 6,320 persons from the multi-site study.
The potential universe of respondents for this study includes all households that receive water from the six to seven participating water utilities. Potential respondents have two options for completing the survey: a paper survey returned via postal mail to CDC study staff, or a web-based survey completed via a password-protected website housed at CDC.
We anticipate a continued response rate of 40%. One recent drinking water study conducted in the U.S. had a lower response rate (33%; see Ailes et al. in Table B.1.1). To increase the anticipated response rate for this study, we incorporated aspects of the Dillman Tailored Design Method, with special attention to the presentation and wording of the survey and consent materials. To encourage participation by fostering a social exchange relationship, we include a token gift of a refrigerator magnet calendar that highlights the two-week period of interest for the survey, and we include a handwritten Post-it note in the second survey mailing. Based on results from our pilot study, we estimate that 60% of respondents will use the postal version of the survey and 40% will respond via the web-based version.
Table B.1.1 – Survey Response Rates
First author (country) | Year of study | Survey methods | Overall response rate | Responded via web | Responded via mail | Responded via telephone
Ailes (U.S.) | 2009 | Mail | 33% | -- | 100% | --
Griffiths (U.K.) | 2008 | Web | 53% | 100% | -- | --
Ghosh (U.S.) | 2006 | Web, telephone | >78%* (telephone), >74%* (web) | -- | -- | --
Jones (U.S.) | 2004 | Telephone | 33% | -- | -- | 100%
Smith (U.S.) | 2001-2003 | Web, mail | --† | 55% | 45% | --
* There were multiple survey groups per survey method.
† Longitudinal cohort study follow-up; survey respondents had already agreed to participate in the study, but in this round they were given the option to complete questionnaires via the internet.
We performed power calculations to assess whether this study will have sufficient power to provide statistically useful information. In addition to analyzing data from all participants overall, we plan to stratify our analyses by type of water treatment (i.e., chlorine versus monochloramine as a secondary disinfectant); therefore, each of these study arms individually requires sufficient statistical power. The following assumptions were made for these power calculations:
• The population incidence of acute gastrointestinal illness (AGI) among unexposed persons is estimated to be 5%, based on the prevalence of AGI during a month-long period in U.S. survey data (Jones, McMillian et al. 2007).
• Event sizes have varied. Of the 53 events observed so far, 32 were small (affecting at least 8 households), 11 were medium (affecting at least 16 households), and 10 were large (affecting at least 39 households). This differs from the original expectation in that smaller events have been more common; we have therefore increased the total number of events needed from 65 to 79.
In an unstratified analysis, with a 40% response rate and Type I error = 0.05, the study will have over 85% power to detect an odds ratio of 1.6 or higher (Table B.1.2). In each arm of a stratified analysis (e.g., stratifying on disinfectant type), the study will have approximately 80% power to detect an odds ratio for AGI of 1.8 or larger.
Under these assumptions, we estimate that we will have enough respondents to detect a statistically significant odds ratio of 1.6 or greater in an unstratified analysis (an increased risk similar to that found in the Nygard study), and a statistically significant odds ratio of 1.8 or greater for AGI within each disinfectant-type subgroup. Finding non-significantly elevated risks would suggest that additional research with greater precision and statistical power is needed.
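The unstratified power figures in Table B.1.2 can be approximated with a standard two-proportion calculation that treats the household as the unit of analysis. The Python sketch below (using statsmodels) is a simplified, unmatched approximation: it ignores the LPE-level matching and any within-household clustering in the study's actual calculations, so it recovers the table's power values only approximately.

```python
# Simplified (unmatched) approximation of the Table B.1.2 power figures.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p0 = 0.05                            # AGI incidence among unexposed (stated assumption)
n_exposed, n_unexposed = 1225, 1828  # responding households, from Table B.1.2

for odds_ratio in (1.6, 1.8, 2.0):
    odds_exposed = odds_ratio * p0 / (1 - p0)
    p1 = odds_exposed / (1 + odds_exposed)  # exposed-group risk implied by the OR
    effect = proportion_effectsize(p1, p0)  # Cohen's h (arcsine-transformed)
    power = NormalIndPower().power(effect_size=effect, nobs1=n_exposed,
                                   alpha=0.05, ratio=n_unexposed / n_exposed)
    print(f"OR = {odds_ratio}: power ~ {power:.0%}")
```

Run as written, this yields roughly 87%, 98%, and 100% for odds ratios of 1.6, 1.8, and 2.0, close to the 85%, 97%, and 100% reported in Table B.1.2.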
Table B.1.2 – Power Calculations for Overall Study*
Power | AGI incidence among unexposed | Odds ratio | Exposed HHs | Unexposed HHs | Total HHs | Exposed HHs (inflated*) | Unexposed HHs (inflated*) | Total HHs (inflated*)
85% | 5% | 1.6 | 1225 | 1828 | 3053 | 3063 | 4570 | 7633
97% | 5% | 1.8 | 1225 | 1828 | 3053 | 3063 | 4570 | 7633
100% | 5% | 2.0 | 1225 | 1828 | 3053 | 3063 | 4570 | 7633
*Nonresponse-inflated values assume a response rate of 40%
If we recruit six or seven utilities for the multi-site study, we would need, on average, approximately one low pressure event per month per utility. However, many low pressure events affect small numbers of households, so the number of households invited to participate will vary with the number of utility connections affected (Table B.1.3). Using the estimated event size distribution, informed by our data as of July 2018, we estimate an approximate sample size of 7,633 households in the multi-site study (Table B.1.3).
Table B.1.3 – Distribution of low-pressure event size and number of expected enrolled households (HHs) for overall study1
Event size | No. of events | Exposed HHs per event | Unexposed HHs per event | Exposed HHs | Unexposed HHs | Total HHs | Exposed HHs (inflated*) | Unexposed HHs (inflated*) | Total HHs (inflated*)
Small | 48 | 8 | 15 | 384 | 720 | 1104 | 960 | 1800 | 2760
Medium | 16 | 16 | 28 | 256 | 448 | 704 | 640 | 1120 | 1760
Large | 15 | 39 | 44 | 585 | 660 | 1245 | 1463 | 1650 | 3113
Total | 79 | 63 | 87 | 1225 | 1828 | 3053 | 3063 | 4570 | 7633
1 Assumes a 5% proportion of AGI among unexposed persons, an odds ratio for AGI of 1.6, 85% power, 5% type I error, and an event size distribution of 48 small, 16 medium, and 15 large events (79 events in total).
*Nonresponse-inflated values assume a 40% response rate (e.g., 960 = 384/0.4).
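The household totals in Table B.1.3 follow directly from the event size distribution and the 40% response rate. The short sketch below reproduces the table's arithmetic; the dictionary structure is illustrative only.

```python
import math

# (no. of events, exposed HHs/event, unexposed HHs/event), from Table B.1.3
events = {"Small": (48, 8, 15), "Medium": (16, 16, 28), "Large": (15, 39, 44)}
response_rate = 0.40

total_exposed = total_unexposed = 0
for size, (n, exp_hh, unexp_hh) in events.items():
    exposed, unexposed = n * exp_hh, n * unexp_hh
    total_exposed += exposed
    total_unexposed += unexposed
    mailed = math.ceil(exposed / response_rate) + math.ceil(unexposed / response_rate)
    print(f"{size}: {exposed} exposed + {unexposed} unexposed responding HHs; {mailed} mailed")

mailed = (math.ceil(total_exposed / response_rate)
          + math.ceil(total_unexposed / response_rate))
print(f"Total: {total_exposed + total_unexposed} responding HHs; {mailed} mailed")
# Total: 3053 responding HHs; 7633 mailed
```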
B.2. Procedures for the Collection of Information
For a detailed discussion of sample selection, sample size, and statistical power for this study, see Section B.1.
Identification of Potential Study Participants
Identification of the exposed and unexposed areas associated with each low pressure event is undertaken collaboratively by the utility and the CDC study team, including individuals with distribution system engineering expertise. Water utilities provide information on as many LPEs as possible each month. Exposed households are those at addresses that experience a loss or lack of water pressure due to an LPE in the water distribution system. The utility identifies the households affected by the LPE with the assistance of its water system information systems and conveys this information to CDC using street names and, if available, a map. We do not ask residents to participate in the study more than once; therefore, events that occur in areas previously sampled as either exposed or unexposed areas are not eligible.
Using the address of the event and the map of affected households, the CDC study team determines the census block(s) affected, as well as the census block group and census tract. A map showing these areas is provided to the utility to narrow down the area in which to seek a group of homes comparable to the homes affected by the LPE. Because census tracts are designed to be homogeneous with respect to population characteristics, identifying unaffected areas within the same block group (or census tract for large events) helps minimize bias by ensuring that affected and unaffected areas are reasonably well matched in terms of demographic characteristics. Using its own water system information systems, in collaboration with the CDC study team, the utility then identifies similarly sized or larger areas within the same census block group (or census tract if necessary) that are unaffected by the LPE because they are in a different pressure zone, are upstream of the event, involve a separate loop, or are in an area of the distribution system separate from the area of the LPE. From the potential areas identified, the utility identifies an area that matches the affected (event) area in terms of predominant housing type, pipe size and material, and origin of water, as detailed on the Low Pressure Event form. If multiple potential matching areas are found, CDC randomly selects the area for inclusion.
The utility provides the billing addresses of the exposed and unexposed households, or the mailing addresses from public assessor information, to CDC research staff within one week of an LPE. From this address list, a random sample of exposed and unexposed households is generated by CDC study staff (see Section B.1 for details on sampling). Households are surveyed only once (either as a household exposed to an LPE or as an unexposed household).
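As a minimal sketch of this sampling step, assuming hypothetical address lists supplied by the utility (the function and field names are illustrative, not the study's actual code):

```python
import random

def sample_households(exposed_addresses, unexposed_addresses,
                      previously_sampled, n_exposed, seed=None):
    """Draw exposed and unexposed households for one LPE, recruiting
    unexposed and exposed households in a 2:1 ratio."""
    rng = random.Random(seed)
    # Households sampled for any earlier event are ineligible.
    exposed_pool = [a for a in exposed_addresses if a not in previously_sampled]
    unexposed_pool = [a for a in unexposed_addresses if a not in previously_sampled]
    exposed = rng.sample(exposed_pool, min(n_exposed, len(exposed_pool)))
    unexposed = rng.sample(unexposed_pool, min(2 * n_exposed, len(unexposed_pool)))
    previously_sampled.update(exposed + unexposed)  # record for future events
    return exposed, unexposed
```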
Eligibility
Household respondents must be English-speaking and ≥18 years of age to be eligible to participate in the study. Children are not enrolled in the study directly, as we do not survey anyone <18 years of age; however, a parent or guardian may respond to the questionnaire regarding recent illnesses and exposures of children in his or her household. Households that were invited to participate in response to a previous event are not included in the sampling frame for subsequent events.
Water connection addresses linked to businesses are excluded. Residential facilities such as nursing homes or long-term care facilities are also excluded because populations in these facilities generally have a higher rate of underlying medical conditions that predispose them to a higher baseline incidence of diarrheal illness, which could bias results. To exclude potentially ineligible residential properties (e.g., rental properties or seasonal homes), we exclude addresses for which the mailing and premise addresses do not match.
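These exclusion rules could be applied to a utility address list along the following lines; the record fields are hypothetical stand-ins for whatever the utility's billing extract actually provides.

```python
def is_eligible(record: dict) -> bool:
    """Apply the exclusion rules described above to one address record."""
    if record.get("account_type") != "residential":  # exclude businesses
        return False
    if record.get("facility_type") in {"nursing home", "long-term care"}:
        return False                                 # exclude residential facilities
    # Exclude likely rentals/seasonal homes: mailing and premise must match.
    return record.get("mailing_address") == record.get("premise_address")
```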
Advance letter
All potential study participants receive advance notice of the survey. An advance letter with information about the study (Attachment D) is mailed to each selected household approximately 1-1.5 weeks after the low pressure event. The letter is personalized with the utility customer’s name and home address, printed on CDC stationery, and signed by the principal investigator. This letter introduces the importance of the study and informs residents that they will receive a survey packet in a few days, with options to complete the paper survey and mail it in or to complete it online using a unique code. The advance letter also informs study participants that they will receive a small gift (the refrigerator magnet) in their study packet as a token of our appreciation for their participation.
Enrollment
A survey packet, including an introductory letter, a study brochure with consent information (Attachments E and G), and the household survey (Attachments H and I), is mailed from CDC to each randomly selected exposed and unexposed household approximately two weeks after the LPE (i.e., 0.5-1 week after the advance letter). Mailed study materials also direct participants to a study website (http://www.cdc.gov/healthywater/study.html) where they can find more information about the study or link to the secure study website to take the survey online. In addition, a refrigerator magnet depicting a calendar is included, with the designated two-week period of interest highlighted, as a reminder that the household should answer only about those days and as a visual aid to improve recall. Two to three weeks is generally the maximum incubation period for pathogens known to cause waterborne AGI and acute respiratory illness (ARI); therefore, persons who were exposed to these pathogens during the LPE will likely have developed symptoms by the time they receive the questionnaire. The letter includes information about the study and invites the recipient to participate. To maximize response rates, we give survey participants the choice of two response modes: over the internet using a web-based survey tool, or via postal mail. The introductory letter provides a study identification number that the respondent uses as a passcode on the survey website, should the respondent prefer to complete the survey online; this information is duplicated on a label on the front of the survey booklet. We include a postage-paid return envelope (addressed to CDC) for return of paper surveys. By offering multiple survey methods, we hope to achieve equitable recruitment of subjects.
We attempt to obtain survey information on all members of each household (limited to six in the paper version of the survey due to printing constraints).
Consent Process
Survey participants indicate their consent to participate by completing the survey (in both the postal and online versions). Instructions clearly state that only individuals ≥18 years of age are eligible to fill out the questionnaire. No names are collected at any time during the study.
Survey Questionnaire
Both versions of the study questionnaire ask the same standardized questions about each household member’s recent gastrointestinal or respiratory illnesses, water service and exposures, and other activities. The questionnaire focuses on the two-week time period after the LPE (referred to in the survey as the “two week period”). For households where children <18 years of age are present, we ask a parent or guardian to answer questions and provide information on behalf of the child. Data from the paper-based surveys are manually entered into the electronic database at CDC that also houses the data from the web-based surveys; data from the web-based surveys are saved automatically in the database after the participant answers each question.
Thank You/Reminder Notecard
One week after mailing the survey packet, we send a personalized thank you/reminder notecard to the selected exposed and unexposed households (Attachment J). The notecard thanks participants who have already responded and requests that those who have not yet responded do so. Participants are reminded that they are important to the study, that their responses are useful, and that their participation is voluntary. The participant’s individual login and passcode for the web survey is included on the card.
Replacement Questionnaire Mailing
One week after mailing the thank you/reminder card, we mail a replacement questionnaire packet to households that have not yet responded. The packet contains essentially the same information as the first questionnaire mailing, but the letter is altered slightly to indicate that we have not yet received their response and to remind them that participation is easy and quick and provides important information for public health planning (Attachment K). The packet also includes a handwritten Post-it note encouraging them to respond.
Reminder Telephone Call
One week after mailing the replacement questionnaire, the study team places an outbound telephone call as a reminder, before the final appeal letter is sent, in an attempt to further increase the response rate (Attachment M). This call is currently feasible at only one utility because of several factors, including state laws restricting the sharing of phone numbers and utility data systems that may not be able to provide customer phone numbers in the file format needed.
Final Appeal Letter
One week after placing the reminder telephone call, we send a final mailing to non-responders (Attachment N). This final letter is sent in a different color envelope, contains a personalized letter on CDC letterhead with the signature of the principal investigator, and includes a reminder of the participant’s individual passcode for the web survey. The letter conveys a sense of urgency that time is running out for participation and informs participants that all responses must be received within two weeks. Responses received by mail or internet more than 8 weeks after the low pressure event are still accepted, and a sensitivity analysis will be used to determine whether these late responses introduce recall bias.
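For the planned sensitivity analysis, late responses can be flagged against the event date. A minimal sketch, assuming survey receipt dates are recorded:

```python
from datetime import date, timedelta

def is_late_response(event_date: date, received_date: date) -> bool:
    """Flag responses received more than 8 weeks after the LPE, for the
    sensitivity analysis of potential recall bias."""
    return received_date - event_date > timedelta(weeks=8)

# e.g., is_late_response(date(2018, 7, 2), date(2018, 9, 10)) -> True
```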
Low Pressure Event Information
In addition to the information collected from the study participants, data are collected from the water utility about the LPE using the LPE form (Attachment P). The utility repair crew completes one LPE form per event, and a utility manager reviews and approves the form before sending it to CDC. Prior to the final selection of exposed and unexposed areas, the CDC study team reviews the information and requests follow-up from utility staff for missing or unclear information. The water utility provides information on the condition of the distribution system infrastructure (e.g., age and type of pipe material), the duration of the LPE, the extent of the area and population affected (e.g., the number of households), the location of wastewater and storm water collection systems in relation to the impacted water line, and the procedures used to restore the distribution system after the LPE. Hydraulic model outputs and monitoring data, including total coliforms, disinfectant residuals, turbidity, and other information related to the microbiological status of the distribution system, may also be collected.
We also conduct supplemental environmental sampling. For each LPE, utility company personnel dispatched to perform repairs or maintenance collect water samples from the distribution system, in both the exposed and unexposed areas, from hydrants, sampling points, or commercial and residential hose bibs within 24-48 hours of the LPE. Water sample sites (selected by the water utilities) are not linked to household survey addresses; rather, they are assumed to represent water in the distribution system in the affected and unaffected areas. The following is a potential sampling schedule:
• Three ~100-L drinking water samples from the exposed area within 24 hours of the LPE
• Three ~100-L drinking water samples from the unexposed area within 48 hours of the LPE
Water samples are collected using a published ultrafiltration technique (Smith and Hill 2009). Environmental sample collection supplies and pre-paid shipping labels are provided to participating utilities. Depending on laboratory capacity, some of the environmental analyses are conducted at the utility’s own laboratory, but most samples are shipped in coolers to the CDC Environmental Microbiology Laboratory for testing within 36 hours of collection. The CDC laboratory regularly responds to waterborne disease outbreaks across the U.S. and has extensive experience training partners. The team has monitoring systems, protocols, and training materials in place to ensure that the CDC team can monitor water sample collection and mailing. Compliance with chain of custody procedures is monitored and enforced. A utility manager notifies CDC when an event occurs, and the CDC team follows up in the event of a delay in receiving samples.
The drinking water samples are analyzed for a suite of microbial indicators of fecal contamination, including total coliforms/E. coli, somatic coliphages, Bacteroidales spp., total aerobic spores, heterotrophic plate count (HPC) bacteria, adenosine triphosphate (ATP), and Pseudomonas aeruginosa.
Information on weather and climate variables (especially rainfall immediately preceding, during, and following the low pressure episode) will be collected and analyzed in context of the study by CDC staff using the National Oceanic and Atmospheric Administration’s (NOAA) national climate database (http://www.climate.gov/).
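One possible route to these data is NOAA’s Climate Data Online web service; the sketch below shows what such a pull might look like under that assumption (the station ID, dates, and token are placeholders, and this is not necessarily the study’s actual data pipeline).

```python
import requests

# Hypothetical pull of daily precipitation (PRCP) around an LPE from NOAA's
# Climate Data Online v2 API; the station ID, dates, and TOKEN are placeholders.
BASE = "https://www.ncdc.noaa.gov/cdo-web/api/v2/data"
params = {
    "datasetid": "GHCND",               # daily climate summaries
    "datatypeid": "PRCP",               # precipitation
    "stationid": "GHCND:USW00013874",   # placeholder station near the event
    "startdate": "2018-06-25",          # week before a hypothetical LPE
    "enddate": "2018-07-16",            # two weeks after
    "units": "metric",
    "limit": 1000,
}
resp = requests.get(BASE, params=params, headers={"token": "YOUR_NOAA_TOKEN"})
rainfall = {r["date"]: r["value"] for r in resp.json().get("results", [])}
```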
Quality Control Procedures
Several quality control procedures are implemented as part of this study:
The web-based questionnaire has built-in skip patterns and internal logic controls for efficiently routing the respondent to the relevant questions based on their prior responses. The paper-based questionnaire has the same questions and the same skip patterns as the web-based version. However, participants completing the paper-based questionnaire must manually follow these skip patterns, which may increase the risk for data entry errors. Additionally, the web-based questionnaire will employ a variety of prompts to encourage survey completion, whereas the paper-based questionnaire has no such prompts.
The web-based questionnaire has data entry validation to limit data entry errors and reduce data cleaning efforts (see the sketch after this list). Furthermore, data entry into the database is automatic, eliminating the need for manual data entry and further limiting potential data entry errors.
The CDC study team will manually clean the database at the end of the data collection period.
The Low Pressure Event form is reviewed by the CDC study team following each event, before participants are selected, and utility staff are contacted to complete any missing information.
Utility staff are trained in following the CDC lab team’s standard chain of custody procedures, and the CDC team follows up with the utility if samples are not received in the allotted time following LPEs.
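To illustrate the kind of field-level validation the web instrument performs, a sketch with hypothetical field names (not the actual survey items) follows:

```python
def validate_entry(field: str, value: str) -> bool:
    """Illustrative per-field validation rules; field names are hypothetical."""
    rules = {
        "household_size": lambda v: v.isdigit() and 1 <= int(v) <= 20,
        "days_with_symptoms": lambda v: v.isdigit() and 0 <= int(v) <= 14,  # 2-week recall
        "age": lambda v: v.isdigit() and 0 <= int(v) <= 120,
    }
    check = rules.get(field)
    return check(value) if check else True  # fields without rules pass through

assert validate_entry("days_with_symptoms", "3")
assert not validate_entry("days_with_symptoms", "21")  # outside the recall window
```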
B.3. Methods to Maximize Response Rates and Deal with Non-response
Participation in our study is completely voluntary and no financial incentives are offered. Still, we have attempted to maximize response rates through the following:
Utilizing both mailed and web-based survey methods and giving potential participants an option of how they would prefer to respond to the survey;
Keeping the survey length to a minimum, and using a respondent-friendly survey booklet. We estimate that it takes approximately 12 minutes for each participant to complete either version (web or paper) of the questionnaire.
Incorporating additional aspects of the Dillman Tailored Design Method, which have been shown to improve response rates (Dillman 2007): sending an advance letter; sending a personalized cover letter with the mailed questionnaire; including a token gift (refrigerator magnet calendar); sending a thank you/reminder postcard one week after mailing the survey packet; sending a follow-up survey mailing, including a personalized Post-it note, to households that have not responded to the survey after two weeks; and sending a final mailing in a contrasting envelope.
To avoid making persons feel pressured or coerced into consenting to participate in the study, all communications, including the advance letter, cover letter, follow-up mailings, and the information provided at the beginning of the web-based questionnaire, advise readers that participation in the study is voluntary.
In the pilot study, the overall survey response rate was 37%; response rates for individual events ranged from 32% to 43% and did not appear to vary by event type or size. Response rates were also similar in LPE (38%) and non-LPE (36%) areas. Item non-response was <10% for the main survey items. Non-response was higher for the illness detail follow-up questions on onset date and number of days with symptoms; other illness detail questions, such as the number of days of school or work missed because of the illness or healthcare-seeking behavior, were nearly complete. Skip patterns were built into the web survey interface; for the paper surveys, the data indicated that the majority of respondents correctly skipped or answered the appropriate questions. System data showed that the web-based survey took customers a median of 11 minutes to complete, suggesting that the time burden for participation was low and in line with the anticipated 12 minutes. Most (94%) web surveys were submitted successfully, demonstrating limited survey break-off.
To improve response rates in the full study, the study team has modified the survey procedures and will increase efforts to promote the study in the participating communities. Nearly half of the page views for the study website (46%) occurred in the 30 days following the press release, before any survey materials were mailed out, suggesting that additional ongoing publicity in the study communities has potential to motivate participants. The study team will implement additional community outreach throughout the study period to improve community acceptance of the study and to boost response rates. Possible methods include implementing local public service announcements or periodic advertisements in local print media.
The multiple-mailing contact strategy encouraged participation, as evidenced by boosts in survey response following the mail prompts. The majority of respondents (70%) chose to return the survey by postal mail, using the provided return envelope. The data quality of the web surveys was higher than that of the paper surveys because data verification rules and question skip patterns were built into the survey interface. Because the web survey instructions and access information were printed on the survey materials rather than sent electronically, it may have been inconvenient for respondents to access a computer, type the link to the website, and log in to take the survey. For utilities where it is feasible, the study team has therefore added an outbound phone call as a reminder before the final appeal letter is sent.
Several recent studies indicate that a response rate of 80% is not feasible for this study. A postal survey of households in a community affected by a drinking water-associated outbreak of salmonellosis, which also asked about acute gastrointestinal illness, had a response rate of 33% (Ailes, Budge et al. 2010). Some large continuing health surveys that provide key national health statistics (such as the FoodNet population surveys) have had response rates of 33% in past survey cycles (Jones, McMillian et al. 2007). In spite of these low response rates, such surveys still provide valuable data to inform public health recommendations, policies, and programs, and we believe that our study will have similar value. Compared with these prior studies, we implement several procedures, outlined above, to encourage study participation and maximize the response rate. Notably, the household survey for this study is substantially shorter than the other surveys mentioned in this section and includes a token gift.
Our primary concern regarding a low response rate is the potential for bias engendered by differential response based on exposure and outcomes. If response is low, we are also concerned that the study population may not be representative of the target population (i.e., non-response bias). However, recent research indicates that the magnitude of non-response bias is generally small (Galea and Tracy 2007).
Following the pilot, we assessed item non-response, and minor changes were made to the survey instrument to improve the quality of the data and facilitate ease of participation, which were approved in a nonsubstantive change request. The pilot survey helped the study team identify items that could be eliminated, information gaps that could be filled, and response options that should be added to existing questions. The survey recall time period was shortened from 3 weeks to 2 weeks, and the illness questions and water use questions were simplified. The survey supplemental materials were modified slightly to make them easier and faster to read. For example, the advance letter was edited to improve clarity and readability, and the web survey has fewer questions per screen to make the survey easier to read. The study team does not have the resources to provide financial incentives. Although 94% of the web survey respondents successfully completed the survey, indicating survey break-off was not a major concern, additional measures were taken to encourage respondents to finish the web survey, including reformatting the survey and simplifying instructions to make the survey appear shorter and to reduce reading time.
To assess differential response following the multi-site study, we will compare responders and non-responders with respect to their a priori exposure status. If response rates in exposed areas differ from response rates in unexposed areas by >20% (contrary to findings from the pilot study), we will implement sensitivity analyses to estimate how greatly differential response might have affected results regarding the primary research question. We will also compare demographic characteristics of respondents to census data at the Public Use Microdata Area (PUMA) level. While this level of data is not granular enough to determine whether data from each LPE cluster are representative of its respective census block group, it should identify gross deviations from the expected distributions of demographics across all events.
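A sketch of the differential-response check, assuming a mailing log with hypothetical `exposed` and `responded` columns; treating the >20% criterion as a relative difference is our assumption:

```python
import pandas as pd

def response_rate_gap(mailing_log: pd.DataFrame) -> float:
    """Relative difference in response rates between exposed and unexposed
    areas; values above 0.20 would trigger the planned sensitivity analyses."""
    rates = mailing_log.groupby("exposed")["responded"].mean()
    return abs(rates[True] - rates[False]) / rates[False]
```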
Next, we will evaluate item non-response and incomplete surveys (i.e., break-offs). Items that are missing for more than 10% of respondents will be evaluated for wording, structure, and placement in the survey. Incomplete surveys will be evaluated to determine where the break-off occurred and whether break-offs differ between the paper and web versions of the survey. We will evaluate whether missing data are differential by exposure status or demographic characteristics, including household size and composition. We anticipate that outcome data may sometimes be missing for adults who share a household but are not relatives (i.e., roommates), but expect that outcome data for related household members (such as the respondent’s children) will be available and valuable to the study.
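The item non-response check could be implemented along these lines, assuming returned surveys are assembled in a pandas DataFrame with one row per survey and one column per item:

```python
import pandas as pd

def items_to_review(surveys: pd.DataFrame, threshold: float = 0.10) -> pd.Series:
    """Fraction missing per item, restricted to items above the 10% threshold,
    sorted so the most problematic items are reviewed first."""
    missing = surveys.isna().mean()
    return missing[missing > threshold].sort_values(ascending=False)
```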
B.4. Tests of Procedures or Methods to be Undertaken
The data collection instruments and the participant information materials were piloted at one water utility site and evaluated by CDC. Overall, the pilot demonstrated that the study design and procedures allow the study team to collect the data needed to meet the study goal and aims.
Prior to the pilot, the data collection instruments and the participant information materials (e.g., letters, websites) were reviewed (1) for content by three senior epidemiologists at CDC; (2) for question design, flow, clarity, and timing by five CDC staff members; and (3) for content by experts in the field. The questionnaire and participant information materials were also pilot-tested by six CDC staff members, and three members of the general public.
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Statistical consultants for study design:
Sarah Collier, CDC – ph. 404-718-4875, email: [email protected]
Anna Blackstock, CDC*
Gordana Derado, CDC*
Tracy Ayers, CDC*
Xin (Lucy) Lu, CDC/ORISE (Tracy Ayers is supervisor)*
Persons who designed the data collection:
Sarah Collier, CDC – ph. 404-718-4875, email: [email protected]
Vincent Hill, CDC – ph. 404-718-4151, email: [email protected]
Bonnie Mull, CDC*
Elizabeth Adam, CDC/ORISE*
Elizabeth Ailes, formerly contractor for CDC*
Joan Brunkard, CDC*
Julia Gargano, CDC*
Joseph Carpenter, CDC*
Mark Lamias, contractor for CDC*
Persons who collect/collected the data:
Alexis Roundtree, contractor for CDC – ph. 404-718-5816, email: [email protected]
Elyse Phillips, CDC/ORISE – ph. 770-488-5922, email: [email protected]
Kathy Benedict, CDC – ph. 404-718-4388, email: [email protected]
Katie Fullerton, CDC – ph. 404-671-0751, email: [email protected]
Marissa Vigar, CDC/ORISE – ph. 404-518-5700, email: [email protected]
Mia Mattioli, CDC – ph. 404-718-5643, email: [email protected]
Sarah Collier, CDC – ph. 404-718-4875, email: [email protected]
Tuyen Do, CDC – ph. 404-718-1375, email: [email protected]
Vincent Hill, CDC – ph. 404-718-4151, email: [email protected]
Alison Stargel, CDC*
Chandra Schneeberger, contractor for CDC*
Elizabeth Adam, contractor for CDC*
Julia Gargano, CDC*
Mark Lamias, contractor for CDC*
Persons who will analyze the data:
Elyse Phillips, CDC/ORISE – ph. 770-488-5922, email: [email protected]
Kathy Benedict, CDC – ph. 404-718-4388, email: [email protected]
Katie Fullerton, CDC – ph. 404-671-0751, email: [email protected]
Marissa Vigar, CDC/ORISE – ph. 404-518-5700, email: [email protected]
Mia Mattioli, CDC – ph. 404-718-5643, email: [email protected]
Sarah Collier, CDC – ph. 404-718-4875, email: [email protected]
Sunkyung Kim, CDC – ph. 404-718-1478, email: [email protected]
Vincent Hill, CDC – ph. 404-718-4151, email: [email protected]
*No longer active on the project
References
Ailes, E., P. Budge, et al. (2010). Economic and Health Impacts Associated with a Salmonella Serotype Typhimurium Drinking Water Outbreak -- Alamosa, CO, 2008. International Conference on Emerging Infectious Diseases, Atlanta, GA.
Dillman, D. A. (2007). Mail and Internet Surveys: The Tailored Design Method. New Jersey, John Wiley & Sons, Inc.
Galea, S. and M. Tracy (2007). "Participation Rates in Epidemiologic Studies." Annals of Epidemiology 17(9): 643-653.
Ghosh, T. S., J. L. Patnaik, et al. (2008). "Internet- versus telephone-based local outbreak investigations." Emerg Infect Dis 14(6): 975-977.
Griffiths, S. L., R. L. Salmon, et al. (2010). "Using the internet for rapid investigation of an outbreak of diarrhoeal illness in mountain bikers." Epidemiol Infect 138(12): 1704-1711.
Jones, T. F., M. B. McMillian, et al. (2007). "A population-based estimate of the substantial burden of diarrhoeal disease in the United States; FoodNet, 1996-2003." Epidemiol Infect 135(2): 293-301.
Smith, B., T. C. Smith, et al. (2007). "When epidemiology meets the Internet: Web-based surveys in the Millennium Cohort Study." Am J Epidemiol 166(11): 1345-1354.
Smith, C. M. and V. R. Hill (2009). "Dead-end hollow-fiber ultrafiltration for recovery of diverse microbes from water." Appl Environ Microbiol 75(16): 5284-5289.