
Oral Health Basic Screening Survey for Children

OMB: 0920-1346





Information Collection Request

Existing Without an OMB Control Number



Oral Health Basic Screening Survey for Children



Supporting Statement B





Program Official/Contact

Mei Lin, MD, MPH, MSc.

Epidemiologist

Division of Oral Health

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

Atlanta, Georgia

Phone: (770) 488-5109

Fax: (770) 488-6080

E-mail: [email protected]


October 15, 2020






ATTACHMENTS

1. Public Health Service Act [42 U.S.C. 241]; Oral health promotion and disease prevention [Section 247b-14]

2a. Instruction manual

2b. Supplemental guidance on sampling design

2c. Supplemental guidance on data analysis

2d. Invitation to schools to participate

2e. Consent forms

2f. Screening fields form

2g. Notice of screening results

2h. Request and reminder email to state respondents for data

2i. Data form submitted by state respondents

3a. 60-day Federal Register Notice

3b. 60-day Federal Register Notice public comments and agency response

4. Institutional Review Board non-research determination

5. ASTDD sample screening budget







B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

The oral health Basic Screening Survey (BSS) of school and Head Start children is a state-tailored survey administered and conducted independently by individual states. Since 2001, CDC has supported some of the 50 states in building and maintaining their oral health surveillance systems, and has supported the Association of State and Territorial Dental Directors (ASTDD) in providing technical assistance to states, through state and partner cooperative agreements. Conducting the BSS for third graders is a key component of that support. To ensure standardized, high-quality data collection and analysis, ASTDD, in consultation with CDC, provides the survey protocol in an instruction manual as well as step-by-step guidelines on BSS sampling design, sample size determination, weighting, and data analysis (www.astdd.org/basic-screening-survey-tool/) (Attachments 2a–2c). In addition, ASTDD provides customized technical assistance to states, either funded by CDC through the cooperative agreement or paid for by the states themselves. States collect and analyze the individual-level BSS data and calculate state-aggregated prevalence estimates for dental caries and sealants. ASTDD emails states to request only the state-aggregated data (Attachments 2h and 2i), then reviews the data to ensure that the survey design and data meet specific criteria before sending the data set to CDC for posting on CDC’s Oral Health Data (OHD) website (www.cdc.gov/oralhealthdata). Though states tailor the BSS to suit their needs and available resources, data quality and standardization are ensured through training and technical assistance provided by ASTDD with CDC support.

B1. Respondent Universe and Sampling Methods

Respondent universe

The target populations for the children’s BSS include elementary school children in grades K–3 and children enrolled in the Head Start program in the 50 states and Washington, DC. Each state decides for itself which grade or grades to target and whether to conduct a Head Start BSS. ASTDD and CDC recommend that at least third grade children be screened. To date, 47 states have conducted a BSS for children, and all 47 have conducted a Third Grade BSS; 32 states have also conducted a BSS for children in other grades or in Head Start.

Sampling methods

A probability sample of schools is drawn for each state by the state oral health program. To ensure the standardization and representativeness of the sampling, ASTDD provides states guidelines on BSS sampling, sample size determination, weighting, and analysis (https://www.astdd.org/basic-screening-survey-tool/) (Attachments 2b and 2c). ASTDD also provides customized technical assistance to states.

BSS data should come from a probability sample representative of a state’s elementary school children in the targeted grade(s) or of that state’s Head Start enrollees. The sampling employs appropriate stratification and cluster sample selection; schools or Head Start sites represent natural clusters. Specific sampling schemes are recommended, with the aim of ensuring a representative sample while allowing some flexibility to accommodate state-specific program needs and resources, such as the choice of stratification factors and the use of probability-proportional-to-size (PPS) or non-PPS selection of sample schools or sites.

  • Prepare a sampling frame. The state obtains a statewide list of schools for the target grade from the state Department of Education (or a list of Head Start sites from the state Head Start Office for Head Start BSS). At minimum, the list should include school name, school ID code, district name, district ID code, enrollment by grade, total enrollment, county or region, and number or percentage of children participating in the National School Lunch Program (NSLP).

  • Stratify the sampling frame. Stratification factors are determined by individual states. The common factors include 1) geographic factors such as county, region or district; 2) urban or rural area; and 3) percentage of students participating in NSLP. ASTDD recommends that all school BSS, at minimum, use stratification (preferably implicit) by percentage of students participating in NSLP. Most states use multiple levels of stratification. Stratification can be either explicit, with sampling within each stratum, or implicit, with systematic sampling from a list sorted by stratification variables.

  • Select the sample of schools. States can choose either PPS sampling of schools or probability sampling of schools without regard to school size (non-PPS). PPS sampling with a consistent number of children screened at each school can result in efficient scheduling of screeners while ensuring proportionate representation of children from clusters of different sizes. With non-PPS sampling of schools, each school, regardless of size, has an equal probability of being selected; for a self-weighting analysis, all children in the target grade should be screened. (A minimal sketch of systematic PPS selection follows this list.)

  • Replace refusing schools. Refusals should be replaced with the same probability methods as the original selections (i.e., systematic PPS or non-PPS sampling). The replacement should be selected from the same sampling interval as the refusing school, so the sampling interval is represented.
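
To illustrate the systematic PPS selection described above, the following minimal R sketch draws 25 schools from a hypothetical frame, using implicit stratification by NSLP percentage. All names and values are illustrative assumptions; actual selection should follow the ASTDD sampling guidance (Attachment 2b).

```r
# Minimal sketch of systematic PPS selection of schools with implicit
# stratification by NSLP percentage. The frame and all values are hypothetical;
# actual selection should follow the ASTDD sampling guidance (Attachment 2b).

set.seed(20201015)

frame <- data.frame(
  school_id  = sprintf("S%03d", 1:120),
  enrollment = sample(20:150, 120, replace = TRUE),   # target-grade enrollment
  pct_nslp   = round(runif(120, 5, 95), 1)            # % of students in NSLP
)

n_schools <- 25   # number of schools to select

# Implicit stratification: sort by % NSLP before systematic selection
frame <- frame[order(frame$pct_nslp), ]

# Systematic PPS: lay out cumulative enrollment and take every k-th position
frame$cum_enroll <- cumsum(frame$enrollment)
total_enroll     <- sum(frame$enrollment)
interval         <- total_enroll / n_schools
start            <- runif(1, 0, interval)             # random start
hits             <- start + interval * (0:(n_schools - 1))

# A school is selected when a hit falls in its cumulative-enrollment range
selected <- frame[findInterval(hits, c(0, frame$cum_enroll)), ]
selected$prob_school <- n_schools * selected$enrollment / total_enroll

head(selected[, c("school_id", "enrollment", "pct_nslp", "prob_school")])
```

Schools whose enrollment exceeds the sampling interval would be selected with certainty and handled separately; that case is omitted here for brevity.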

Expected response rate

State-level response rate. ASTDD and CDC recommend that states conduct a BSS for at least third graders at least once every 5 years. Individual states determine how often to conduct the BSS and which grade or grades to target based on their surveillance needs and available resources. Twenty states currently funded by CDC through a cooperative agreement (State Actions to Improve Oral Health Outcomes, CDC-RFA-DP18-1810) must, as a condition of their award, implement at least one Third Grade BSS within the five-year project period. Based on analysis of current BSS data and consultations with the states, CDC estimates that approximately 34 states (67% of the 50 states and Washington, DC) will conduct at least one Third Grade BSS and respond to this data collection during the period for which this approval is being sought. About 19 of those 34 states are also projected to survey children enrolled in the Head Start program or in one or more additional grades.

Child-level response rate. Across the 34 states, the average child-level response rate for the most recent Third Grade BSS and K–2 or Head Start BSS conducted within the past 5 years is 59%. The response rate is anticipated to remain at approximately this level for the foreseeable future.

Sample size

The total number of children screened during the most recent BSS for the 34 states was 150,370; the sample size varies by state, with the number of children screened in the most recent Third Grade BSS ranging from 360 to 10,404. We assume that the number of participants will remain at approximately 150,370. The sample size is determined by individual states based on statistical considerations and can be adjusted to accord with a state’s resources and surveillance needs. To ensure an appropriate sample size, ASTDD provides both step-by-step guidelines on how to calculate and determine the BSS sample size (https://www.astdd.org/basic-screening-survey-tool/) (Attachment 2b) and customized technical assistance. The sample size determination considers the level of statistical precision, the projected proportion of the BSS indicator, a design effect to reflect the complex sample design, an adjustment appropriate for the subpopulation level of interest (e.g., state, region, county), and the anticipated response rate. Sample size can be further adjusted to accommodate resource factors such as available funds, time, and number of trained screeners.
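
As a rough illustration of this calculation, the R sketch below applies the standard proportion-based sample size formula with a design effect and response rate adjustment. The numeric inputs are illustrative assumptions, not values from the ASTDD guidance.

```r
# Rough sample-size sketch for a single prevalence indicator, with illustrative
# (assumed) inputs rather than values from the ASTDD guidance (Attachment 2b).

p    <- 0.50   # projected prevalence of the BSS indicator (conservative choice)
moe  <- 0.05   # desired margin of error (statistical precision)
z    <- 1.96   # z value for a 95% confidence interval
deff <- 1.5    # assumed design effect for the clustered, stratified design
resp <- 0.59   # anticipated child-level response rate

n_srs     <- (z^2 * p * (1 - p)) / moe^2   # simple random sample size
n_design  <- n_srs * deff                  # inflate for the complex design
n_to_draw <- ceiling(n_design / resp)      # inflate for expected non-response

c(srs = ceiling(n_srs), design_adjusted = ceiling(n_design), children_to_sample = n_to_draw)
```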

B2. Procedures for the Collection of Information

The BSS is administered and conducted independently by individual states, with technical assistance and support from ASTDD and CDC.

Roles and responsibilities of states, ASTDD and CDC

  • State health departments. The state health department administers the survey by drawing a probability sample representative of its target grade(s) (or of children enrolled in a Head Start program), arranging logistics with the selected schools or Head Start sites, gaining consent, obtaining demographic data, training screeners, conducting the non-invasive oral observation, verifying and analyzing the data, and submitting the de-identified, state-aggregated data to ASTDD. The 20 states currently funded by CDC through the five-year cooperative agreement State Actions to Improve Oral Health Outcomes (CDC-RFA-DP18-1810) are required to administer the BSS and to submit and report state-aggregated data for third grade children at least once during the project period.

  • ASTDD. To ensure data collection standardization, representativeness, and quality, ASTDD, in consultation with CDC, provides the survey protocol in an instruction manual and guidelines on BSS sampling design, sample size determination, weighting, and data analysis (Attachments 2a–2c). In addition, ASTDD provides customized technical assistance to states, either funded by CDC through the cooperative agreement or paid for by the state. ASTDD is responsible for requesting state-aggregated BSS data from states and verifying that the data meet the criteria for inclusion in the National Oral Health Surveillance System (NOHSS) before sending them to CDC for publication on CDC’s OHD website (www.cdc.gov/oralhealthdata). Current CDC support to ASTDD for the BSS is provided through a 5-year cooperative agreement, Partner Actions to Improve Oral Health Outcomes (CDC-RFA-DP18-1811), which began in September 2018.

  • CDC. Since 2001, CDC has, through a series of cooperative agreements, supported state efforts to build and maintain state oral health surveillance systems with the BSS as a key component. Funded states are required to conduct a BSS at least for third grade children. ASTDD, through a series of CDC partner cooperative agreements, provides training, technical guidance, and assistance to states conducting the BSS. CDC provides technical consultation during periodic revisions of the BSS protocol and relevant materials. CDC collects state-aggregated BSS data through ASTDD and posts the data on the OHD website on an ongoing basis. CDC maintains the OHD website and its functionality to enable CDC, states, and the public to monitor state oral health surveillance data, compare data across states, and use data for actionable public health practice. CDC also uses the state-aggregated BSS data to evaluate the performance of states funded through the cooperative agreement.



Procedures of data collection

The BSS protocol and supplemental guidance documents (Attachments 2a–2c) provide step-by-step procedures for BSS data collection, which are summarized below.

Planning phase procedures:

  • Form a survey advisory committee that includes key stakeholder organizations to guide the development of a survey plan.

  • Determine the purpose of the survey and the target population in accordance with state surveillance and program needs.

  • Determine the level of available funding, the sampling plan as described earlier, whether an optional parent questionnaire is needed, and the method of data recording.

  • Contact the state Department of Education (DOE), school districts, and Head Start office to gain their support and determine the protocols for consent, approvals, and privacy as well as the methods for collecting demographic information.

  • Identify and train screeners. The state is responsible for training screeners at least once before each BSS cycle by providing didactic training and clinical training that includes verifying each screener’s assessment of 20 children from the target grade level (Attachment 2a). Trainers are also required to review the ASTDD training materials.

Implementation phase procedures:

  • States invite selected schools to participate and provide a point of contact for logistics (Attachment 2d) and parent or guardian consent.

  • States determine whether to utilize a passive or positive consent process as instructed by schools, districts or the state DOE. The BSS manual (Attachment 2a) recommends using passive consent to improve the response rate.

  • Some states accompany the consent process with an optional parent questionnaire, which asks about family use of and access to dental care. The questions align with those on national surveys such as the National Health and Nutrition Examination Survey (NHANES) and the National Survey of Children’s Health (NSCH).

  • Screeners generally spend one day per school and 1–2 minutes per student surveying for four data points in the non-invasive oral observation: 1) presence of treated caries, 2) presence of untreated tooth decay, 3) urgency of need for treatment, and 4) presence of dental sealants on at least one permanent molar tooth (Attachment 2f).

  • Screeners record their findings either electronically using data entry tools (e.g., Epi Info, MS Access) or on a paper form (see Attachments 2a and 2f). All parents or caretakers receive the screening results (Attachment 2g). If screeners find an urgent need for dental care, they inform the school contact for follow-up. Screeners transmit the data to the state health department designated contact through a secure electronic format or through the USPS or other delivery service that protects privacy.

  • The minimal demographic data collected are grade and the school-level percentage of children eligible for NSLP, which is typically publicly available on the state DOE website. The BSS manual recommends that states collaborate with the DOE at the state, district, or school level to collect certain child-level demographic information as well, including sex, age or date of birth (DOB), race, ethnicity, and NSLP eligibility, by drawing on DOE’s existing data. States that obtain child-level demographics from DOE include the state student ID (SSID) on the screening form; DOE then uses the SSID to link the oral health screening data with its demographic data. Once the linkage is complete, DOE removes the SSID from the dataset before sending it to the state oral health program (a minimal sketch of this linkage step follows this list). For states that are not able to obtain the child-level demographics from official government data, the BSS manual provides a sample parent questionnaire to collect the data (Attachment 2a).
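
The following minimal R sketch illustrates the SSID linkage step described in the last bullet, with hypothetical field names. In practice the merge is performed at the DOE, which removes the SSID before returning the file to the state oral health program.

```r
# Minimal sketch of the SSID-based linkage (hypothetical column names).
# In practice the merge happens at the DOE, which drops the SSID before
# returning the linked file to the state oral health program.

screening <- data.frame(
  ssid            = c("A1", "A2", "A3"),
  untreated_decay = c(0, 1, 0),
  sealant_present = c(1, 0, 1)
)

doe_demographics <- data.frame(
  ssid          = c("A1", "A2", "A3"),
  sex           = c("F", "M", "F"),
  nslp_eligible = c(1, 1, 0)
)

linked <- merge(screening, doe_demographics, by = "ssid")
linked$ssid <- NULL   # remove the identifier once the linkage is complete
linked                # de-identified file ready for the oral health program
```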

Post-screening phase procedures:

  • State programs enter, clean, and analyze the data; de-identify the data; and respond to ASTDD’s annual email request for state-aggregated BSS data (Attachments 2h and 2i). To accommodate states and ensure that data are published in a timely manner, ASTDD accepts data throughout the year.

  • As data are received, and at least quarterly, ASTDD verifies the BSS data to ensure that the survey design and data meet the criteria specified in the BSS manual (Attachment 2a) before emailing the data set to CDC for publication on its OHD website.

  • CDC hosts and maintains the OHD website as a centralized, public-facing platform. CDC currently displays the following children’s BSS data: state prevalence estimates of caries experience (treated or untreated caries), untreated tooth decay, and dental sealants, along with response rates, NSLP data, and a brief description of each state’s BSS methods.

Estimation

The sample data are weighted by the reciprocal of the probabilities of school and child selection and adjusted for non-response. Variance estimation is based on Taylor series linearization. Estimation of variances and confidence intervals accounts for stratification and cluster sampling effects.

BSS analysis guidelines (Attachment 2c), available on the ASTDD website (www.astdd.org/basic-screening-survey-tool/), provide step-by-step instructions on how to enter and clean data, prepare data for analysis, calculate weights, and generate statistical estimates that account for the BSS complex sampling design. To further facilitate state adherence to the analysis methods, the guidelines also provide examples of weight calculation, program code for software packages capable of analyzing complex survey data (e.g., SUDAAN, SAS, Stata, SPSS, and R), and tables for presenting the estimated prevalence.
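
As one illustrative example of this workflow, the R sketch below (R being one of the software packages named in the guidelines) uses the survey package to compute weighted prevalence estimates with Taylor series linearization, which is the package default. The data frame and variable names are hypothetical; states should follow Attachment 2c for the actual analysis.

```r
# Minimal sketch of the estimation step using the R 'survey' package.
# All data and variable names are hypothetical; see Attachment 2c for the
# authoritative analysis guidance.
library(survey)

# Toy child-level analysis file: wt is the reciprocal of the school-by-child
# selection probability, adjusted for non-response.
bss <- data.frame(
  stratum   = rep(c("high_NSLP", "low_NSLP"), each = 6),
  school_id = rep(c("S01", "S02", "S03", "S04"), each = 3),
  wt        = c(rep(40, 6), rep(55, 6)),
  untreated = c(0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 1, 0),   # untreated tooth decay
  sealant   = c(1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1)    # sealant on a permanent molar
)

# Declare the stratified, clustered design (schools are the clusters)
design <- svydesign(ids = ~school_id, strata = ~stratum, weights = ~wt,
                    data = bss, nest = TRUE)

# Weighted prevalence with Taylor-series-linearized standard errors (the
# package default), accounting for stratification and clustering
svymean(~untreated + sealant, design)
svyciprop(~untreated, design, method = "logit")   # 95% CI for one proportion
```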

Quality control

To ensure standardization and quality of the data, the BSS incorporates key quality control procedures across several dimensions.

  • Step-by-step protocol. ASTDD in consultation with CDC provides states explicit and step-by-step guidelines (Attachments 2a–2c) that cover all phases of BSS from planning, sampling design and implementation through data analysis, reporting and submission. Since the BSS protocol was released in 1999, ASTDD in consultation with its Data and Oral Health Surveillance Committee and CDC has conducted periodic reviews to update and improve the screening process and data collection quality (Attachment 2a).

  • Technical assistance. States are encouraged to seek customized technical assistance from ASTDD. Supported by a CDC partner cooperative agreement, ASTDD provides training and technical assistance on BSS to states. ASTDD provides full-spectrum technical assistance, which ranges from planning, screener training, and sampling design to data entry, weighting, analysis, and reporting.

  • Didactic and clinical training. To ensure standardization and consistency in data collection and definition, the BSS protocol suggests that states train screeners at least once before each BSS cycle through a combination of didactic and clinical training, which includes verifying each screener’s assessment of 20 children from the target grade level (Attachment 2a).

  • Data entry and cleaning. The BSS protocol and analysis guidelines (Attachments 2a and 2c) instruct states to use double data entry and electronic data entry systems, such as Epi Info and Microsoft Access, with embedded data validation functions that allow each field to be checked automatically for valid values, inconsistencies, skip patterns, and the like (a minimal sketch of such checks follows this list). Examples of data entry systems using Epi Info and Access are available from ASTDD.

  • Data analysis guidelines. To ensure that states use correct complex survey data analyses, BSS analysis guidelines (Attachment 2c) on the ASTDD website accompany step-by-step instructions with examples showing how to enter and clean data, prepare analysis data, calculate weights, and generate estimates to account for the complex sampling design.

  • Review and validation of submitted data. ASTDD reviews state-submitted BSS data to ensure they meet the criteria specified in the BSS manual (Attachment 2a): data from a statewide representative probability sample of children in the target grade or enrolled in Head Start, data weighted for the sampling scheme and non-response, calculation of variance estimates and confidence intervals to account for stratification and cluster sampling effects, and use of diagnostic criteria outlined in the BSS manual. Only data meeting the criteria are submitted to CDC for posting on the OHD website.
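
The base-R sketch below illustrates the kind of automated range and consistency checks described in the data entry bullet above. Field names, codes, and rules are hypothetical; the authoritative validation rules are those in Attachments 2a and 2c.

```r
# Minimal base-R sketch of automated range and consistency checks of the kind
# recommended for data entry systems. Field names, valid codes, and rules are
# hypothetical; the authoritative rules are in Attachments 2a and 2c.

valid_codes <- c(0, 1)   # assumed coding: 0 = absent, 1 = present

check_bss <- function(df) {
  problems <- list(
    invalid_untreated = which(!df$untreated %in% valid_codes),
    invalid_sealant   = which(!df$sealant   %in% valid_codes),
    # Consistency rule: untreated decay implies caries experience
    inconsistent_rows = which(df$untreated == 1 & df$caries_experience == 0)
  )
  problems[lengths(problems) > 0]   # keep only checks that flagged rows
}

example <- data.frame(
  untreated         = c(0, 1, 2),   # row 3 has an invalid code
  sealant           = c(1, 0, 1),
  caries_experience = c(0, 0, 1)    # row 2 is inconsistent with untreated = 1
)
check_bss(example)
```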

Data collection frequency

Conducting the BSS less frequently than every 5 years would diminish both the quality of trend data and the likelihood that the resulting data will be analyzed and published. Less frequent data collection would also reduce the timeliness of the data used to inform program and funding decisions.



B3. Methods to Maximize Response Rates and Deal with Nonresponse

CDC estimates that approximately 34 states (67% of 50 states and Washington, DC), including the 20 states currently funded by CDC, will conduct at least one BSS for children and respond to this data collection during the period for which this approval is being sought.

Prior to 1993, no states had BSS data. CDC’s cooperative agreements with some of the 50 states have played a pivotal role in encouraging and facilitating states to conduct BSS. In 2000, only six states had ever conducted a BSS for children. By 2010 that number had increased to 43, and it now stands at 47.

Across the 34 states, the average child-level response rate for the latest BSS is 59%; response rates vary by state, ranging from 28% to 89% for K–3rd grade BSS. Low response rates associated with positive consent are a primary challenge to BSS implementation. Using passive consent typically results in response rates of 75–90%, whereas using positive consent generally reduces response to less than 50% (Attachment 2a). To address this challenge, ASTDD and CDC updated the BSS manual (Attachment 2a) in 2019 by clarifying that passive consent is preferred and that verbal consent is acceptable when positive consent is required, modifying the positive consent form to include verbal consent verification (Attachments 2a and 2e), decoupling the optional questionnaire from the positive consent form to make clear that parents are not required to fill out the questionnaire to grant consent, and emphasizing in the sample letters to schools (Attachments 2a and 2d) that the survey is supported by CDC (in funded states) and endorsed by key stakeholders.

BSS sampling allows for replacement of refusing schools. The sampling guidelines (Attachment 2b) instruct states on how to select replacement schools using the same probability methods employed in the original selection process.

In addition, CDC’s evaluation of the last cooperative agreement (State oral disease prevention program, CDC-RFA-DP13-1307) identified several strategies states used to improve BSS response rates: communicating the importance of the BSS and its findings to key stakeholders such as schools and parents to enhance buy-in and participation, integrating the BSS with state-required hearing and vision screenings, and working with the state DOE and relevant agencies to gain approval for passive consent. Moving to passive consent is particularly effective. For example, Maryland’s 2011–2012 Third Grade BSS used positive consent and had a response rate of only 15%, but that rate increased to 71% during the 2015–2016 survey, when passive consent was used.1

To minimize the likelihood of missing values, the BSS manual and analysis guidelines (Attachments 2a and 2c) instruct states to review screening forms at the end of each screening day to ensure that all data boxes contain an appropriate entry and to use double data entry and data validation functions embedded in the data entry systems to confirm that data are accurate and complete.

B4. Tests of Procedures or Methods to be Undertaken

The BSS protocol (Attachment 2a) was released in 1999 by ASTDD in collaboration with the Ohio Department of Health and with technical assistance from CDC. BSS is a visual-only screening tool for reporting person-level oral health status. It was proposed to address the lack of state and local oral health surveillance data resulting from limited resources at those levels. The BSS protocol was designed to be consistent but easy to perform, take little time to conduct, and require no sophisticated equipment. It requires minimal screener instruction and allows health care providers other than dentists (e.g., hygienists, school nurses) to perform the screening, thereby addressing the challenge posed by the scarcity of dental professionals in some states.

The BSS protocol was built upon a foundation of rigorous validity testing, state field experience, and consensus of the BSS Advisory Committees.2, 3 In 1995, CDC tested the validity of the visual-only screening performed by hygienists and nurses among elementary school children in Georgia against a visual-tactile screening at the tooth or surface level performed by dentists.2 The visual-only screening method also was fielded among school- or preschool-aged children in Oregon, Washington, Louisiana, Maine, and Ohio during the 1990s to ensure that the survey process could be operationalized and was feasible and efficient.3

Since the BSS protocol was released in 1999, ASTDD, in consultation with state programs, ASTDD’s Data and Oral Health Surveillance Committee, and CDC, has reviewed and updated the protocol five times: in 2003, 2008, 2015, 2017, and 2019 (Attachment 2a). These revisions aimed to improve the survey process as well as data quality and standardization, reduce the data collection burden on the public, and incorporate past state experience and feedback to meet state surveillance needs and improve data utility. For example, the 2019 revision clarified that existing official data should be the primary source of child demographic data and that a parent or guardian questionnaire should be used to collect such data only when the state is unable to obtain them from an official source such as the state DOE.

CDC consulted with seven state oral health programs in January 2019 to learn about their experience and to identify challenges and benefits associated with BSS implementation. In addition, CDC’s evaluation of the last cooperative agreement (State oral disease prevention program, CDC-RFA-DP13-1307) with selected funded states helped elucidate state experience and identify areas for improvement.



B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



Name: Kathy Phipps, Data and Surveillance Coordinator
Organization: ASTDD
Contact: (805) 776-3393, [email protected]
Role (consulted or collected/analyzed): Provide technical consultation and assistance in data collection and statistical analysis; collect and verify data from states

Name: Michael Manz, Data Consultant
Organization: ASTDD
Contact: (734) 615-8268, [email protected]
Role (consulted or collected/analyzed): Provide technical consultation and assistance in data collection and statistical analysis; collect and verify data from states

Name: Mei Lin, Epidemiologist
Organization: Division of Oral Health, CDC
Contact: (770) 488-5109, [email protected]
Role (consulted or collected/analyzed): Provide technical consultation in data collection and statistical analysis



References

1. Maryland Department of Health. Oral Health Survey of Maryland School Children, 2015-2016. 2017. Available: https://phpa.health.maryland.gov/oralhealth/Documents/SchoolSurvey2015.pdf. Accessed: October 15, 2019.

2. Beltran ED, Malvitz DM, Eklund SA. Validity of two methods for assessing oral health status of populations. J Public Health Dent. 1997;57(4):206-214.

3. Beltran-Aguilar ED, Malvitz DM, Lockwood SA, Rozier RG, Tomar SL. Oral health surveillance: past, present, and future challenges. J Public Health Dent. 2003;63(3):141-149.



