The SunWise Program
ICR # 1904.03
July 16, 2009
U. S. Environmental Protection Agency
Office of Air and Radiation
Part A of the Supporting Statement
1. Identification of the Information Collection
1(a) Title of Information Request
The title of this Information Collection Request (ICR) is The SunWise Program (ICR# 1904.03).
1(b) Short Characterization/Abstract
The SunWise Program was initiated in 1998 through a statutory mandate under Title IV of the Clean Air Act. The long-term objective of the SunWise Program is to reduce the incidence of, and morbidity and mortality from, skin cancer, cataracts, and other UV-related health effects in the United States. Short-term objectives include: 1) reducing the risk of childhood overexposure to the sun by changing the knowledge, attitudes, and behaviors of elementary school children and their caregivers; and 2) improving the availability of accurate, timely, and useful UV data directly to schools and communities across the United States.
The SunWise Program builds on traditional health education practices through the use of existing curricula, learning standards, scientific strategies, and evaluation mechanisms. The Program is a collaborative effort of schools, communities, health professionals, educators, environmental organizations, meteorologists, local governments, federal agencies, and others. Participating schools sponsor classroom and school activities to raise children's awareness of stratospheric ozone depletion, UV radiation, and the largely preventable health risks from overexposure to the sun, as well as simple sun safety practices. All educators interested in participating in this partnership program are asked to register using the online form (www.epa.gov/sunwise/becoming.html and www.epa.gov/sunwise/becoming_partner.html) or a hard-copy version distributed by EPA. EPA will use the information provided through this registration to maintain a database of participating schools and organizations and a mailing list for information distribution purposes. Participating schools and organizations receive a variety of materials, including a classroom "Tool Kit" of games, songs, puzzles, storybooks, and videos, as well as access to internet-based UV intensity mapping/graphing tools, and more. The Tool Kit also includes sample sun safety policies and guidelines to help expand the sun safety message beyond the classroom.
Teachers who sign up for SunWise are asked to complete a survey at the end of program implementation. Results of these surveys are used to fine-tune existing SunWise materials and develop new ones that better meet participants' needs. Teachers are also asked whether they are interested in administering a brief survey to their students before and after program implementation. The surveys will be mailed to all teachers who express interest in the survey process. Survey responses are voluntary and anonymous. The results from the student surveys are used to evaluate program effectiveness and also help guide materials development.
Additionally, SunWise has developed an online, interactive SunWise Sun Safety Certification Program. Once it is up and running, the certification program will allow students, adults, organizations, and employers to develop credentials in sun safety awareness and behaviors. To gauge the certification program's effectiveness, EPA will collect information on the demographics, knowledge, attitudes, intended behavior, and behavior of certification program users. User types include outdoor recreation staff at camps, parks, recreation programs, and sports organizations; lifeguards; and others. The on-line certification program is currently staged at http://sunwise.staging.informationexperts.com. To log on, please use the following username and password (in Internet Explorer):
User name: iedomain\sunwise
Password: @th3d00r
The final product, when approved, will be available at http://www.epa.gov/sunwise/tutorial.html.
TERMS OF CLEARANCE:
On November 2, 2001, OMB approved ICR #1904.01 with Terms of Clearance. EPA has addressed OMB’s terms in the following manner.
This collection is approved in part and disapproved in part. EPA is approved to collect registration information and to conduct the requested student survey and teacher survey. These portions of the collection are approved for two years, until November 2003. OMB understands that EPA no longer intends to collect information under the Parent Survey and the School Administrator Survey. These two surveys are not approved. OMB has adjusted the burden of the collection accordingly from EPA's request.
EPA Response: We will not be surveying parents or school administrators under this ICR renewal.
As discussed in Part B(i) of the revised supporting statement, EPA plans to assess two sampling issues in Fall 2002 based on results from the first year of the student survey: (1) whether it is necessary to include in the sampling protocol an approach to stratify between warm and cool states, and (2) whether it is necessary to include an additional sampling stage to sample classrooms within a school that has been selected for participation. EPA should report to OMB its findings with regard to these issues before beginning the second year of sampling, and advise OMB whether it intends in the second year of the collection to (1) stratify between warm and cool states in the first stage of sampling,
EPA Response (sent to OMB on 8/29/02): We have conducted this analysis and found no differences between the students in the 37 cooler states versus the 13 warm states, as categorized by UV Index values. We measured change in scores from pretest to posttest on children's knowledge of sun protection, attitudes toward the sun, and current sun protection practices, and found minimal differences in the change. Therefore, with no differences between the two groups, there is no reason to stratify the sample.
and/or (2) add a sampling stage to sample classrooms within a selected school.
EPA Response (sent to OMB on 8/29/02): We analyzed the composition of registrants in the SunWise database by region of the country and the number of schools that contained more than one classroom. Approximately 75 percent of schools in both warm and cool climates have no more than one participating classroom per school, with negligible difference by region. There is little evidence of clustering, or of differences in clustering between regions of the country. Had such clustering been present, we would have accounted for this effect by performing tests of intraclass correlation.
OMB also notes that EPA must include the OMB number, expiration date, and Paperwork Reduction Act notice on the teacher survey before using that instrument to collect information.
EPA Response: All information is now included on the teacher survey. See attachment #1a and 1b for verification.
ICR 1904.01 expired on November 30, 2003. The first Federal Register Notice to renew this ICR as 1904.02 was published on October 21, 2003, and was approved by OMB on July 30, 2004 for a period of three years.
ICR 1904.02 expired on July 30, 2007. The first Federal Register Notice to renew this ICR as 1904.03 was published on February 12, 2007, and was submitted to OMB on September 25, 2007. ICR 1904.03 was approved in part on February 6, 2008. ICR 1904.03 will expire on February 28, 2011. The terms of clearance read:
This ICR is approved in part. The registration of educators component is approved at this time. Other components of this ICR are not approved at this time, but the agency may resubmit these components with no significant or material change, pending the results of pre-tests that will examine the readability of the survey instrument.
EPA Response: The student survey was pretested with a group of 8-12 year old students. Minor adjustments were made and the rigor in the Supporting Statement Part B was improved.
As a result of these actions, OMB approved more components of the ICR on April 15, 2008:
This ICR is approved in part. The student and the teacher/school nurse components are approved at this time. When the agency resubmits the ICR for renewal, the agency must provide a more detailed discussion on the methods (e.g., how the pre-post comparison is a proxy for a control group) and the analytic plan (e.g., model) in the Support Statement Part B. OMB notes that EPA must include the OMB number, expiration date, and the Paperwork Reduction Act notice on all survey instruments before they are fielded.
EPA Response: Part B of the Supporting Statement has been improved to address OMB’s concerns, and an additional Part B was developed for the online certification program. Additionally, the terms of clearance section of this Supporting Statement was updated to reflect the current status of the ICR.
2. Need for and Use of the Collection
2(a) Need/Authority For The Collection
This collection will be used for program material distribution and to determine program effectiveness and participant satisfaction. Educators will be asked to fill out a simple registration form, which we use to mail out the program materials and keep track of the Program’s:
geographic reach (Which states/regions have SunWise schools?);
grade-level and subject-matter distribution (How many 1st grade teachers are using SunWise? How many math teachers are using SunWise? etc.); and
student participation level (How many students is SunWise potentially reaching?).
Surveys to be administered include:
Student survey to identify sun safety knowledge, attitudes, and behaviors among students before and after participation in the Program; and
Teacher questionnaire to measure receptivity to the educational component of the Program.
Embedded questions within the SunWise Sun Safety Certification Program measure receptivity to sun protection, demographic information, and current practices, attitudes and knowledge.
The data will be analyzed and results will indicate the Program’s effect on participants’ sun-protection knowledge, attitudes, and behaviors. Responses to the collection of information are voluntary. All responses to the collection of information remain anonymous and confidential.
The SunWise Sun Safety Certification Program is voluntary and will remain anonymous and confidential. No personal information will be stored. EPA will ask the user to enter their first and last names, but only for the certificate. The information will not be stored on the EPA server. Built into the certification program are a series of questions to determine outdoor recreation staff’s current sun protection knowledge, attitudes, behavioral intentions and behaviors. Additionally, there are questions that try to determine the environmental conditions (i.e., policies) already in place to minimize UV damage to the staff and visitors, and the perceived cultural norms of camp staff. Specifically, the broad questions include:
Who is using the certification program?
How long does the certification program take?
How well are users learning from the certification program?
What are the current knowledge, attitudes and behaviors of outdoor recreation staff taking this certification program?
Does the certification program use a tone, style and examples that resonate with the audience?
What resonates with different age groups?
Does the audience perceive sun protection as important?
What are outdoor recreation facilities doing currently to prevent overexposure to UV?
What are their peers doing to protect themselves from the sun? Are peers concerned about the sun’s damage?
What drives their behavioral intentions?
The knowledge gained through this information collection will inform program decisions. Specifically, the decisions include:
Are additional resources needed for outdoor recreation facilities?
Does the certification program need to be revised to better resonate with the audience?
SunWise is considering developing materials for pre-service teachers, who are usually in their twenties and may be similar in age to many young outdoor recreation staff. This information will help inform framing of sun safety messages for this group.
This is one of the first on-line products offered by EPA to promote sun protection. Is the on-line environment an effective way to disseminate information to target audiences? Is the current certification program format an effective way to disseminate information to this target audience?
The SunWise Program recognizes the challenge of measuring the progress and evaluating the effectiveness of an environmental and public health education program where the ultimate goal is to reduce risk and improve public health. Therefore, the continual and careful evaluation of program effectiveness through a variety of means, including data from pre- and post-intervention surveys, tracking and monitoring of classroom activities and school policies, and advisory board meetings, is necessary to monitor progress and refine the program.
2(b) Practical Utility/Users of the Data
EPA/SunWise will use the survey results to evaluate program effectiveness—including cost effectiveness—and adapt as appropriate its messages, approaches, and materials. Survey results will enable EPA/SunWise to better meet the needs of its educator and student participants, with the long range goal of reducing the incidence and effects of skin cancer and other UV-related health problems among children and adults.
3. Nonduplication, Consultations, and Other Collection Criteria
3(a) Nonduplication
The information required to complete the survey for The SunWise Program is not duplicative of information otherwise available to EPA. In the early stages of the SunWise Program's development in 1997, several searches for information were completed in consultation with external stakeholders, including representatives from the following organizations:
American Academy of Dermatology
American Cancer Society
Boston University Medical Center - Skin Oncology, Cancer Prevention & Control Center
Centers for Disease Control and Prevention
National Association of Physicians for the Environment
National Safety Council
The Skin Cancer Foundation
Results from these consultations indicated that no other formal, student-focused, sun safety programs were being implemented in the United States, nor were surveys being conducted on attitudes and practices of children relating to sun exposure.
To EPA/SunWise's knowledge, there is no other sun safety certification available to outdoor recreation staff in the United States; it is therefore essential that accurate information on the users and use of the certification program be collected for program refinement.
3(b) Public Notice Required Prior to ICR Submission to OMB
Official notice of this proposed collection appeared in the Federal Register on May 21, 2003 (68 FR 27796). No comments were received.
3(c) Consultations
The following professionals were consulted during the development of the three survey instruments:
Alan Geller, Boston Medical Center, Skin Oncology Cancer Prevention & Control Center, (617) 638-7126
Dave Buller, PhD, AMC Cancer Research Center, (303) 239-3511
Dr. Barbara Gilchrest, Chair, Department of Dermatology, Boston University School of Medicine, (617) 638-5538
Dr. Donald Miller, Assistant Professor of Epidemiology and Health Policy, Boston University School of Medicine, (781) 687-2865
Dr. Amy Mack, Psy.D., ICF International, (703) 219-4311
3(d) Effects of Less Frequent Collection
SunWise depends on registration information to
maintain an accurate list of participants; and
ensure timely distribution of program materials and program updates to participants.
SunWise depends on survey responses to
help guide program development;
measure participant satisfaction with the program; and
elicit basic information on attitudes and practices of children and their caregivers relating to sun exposure.
SunWise depends on certification program information to
determine the current knowledge, attitudes, behavioral intentions and behaviors of individuals taking the certification program;
determine which certification program to provide to the user;
measure how many and what type of users are becoming certified;
ensure the certification program does not take too long to complete;
ensure rigor and enthusiasm for the certification program; and
determine whether or not the certification program is delivering the information in an easy-to-understand manner.
Conducting the surveys and information collection less frequently may slow down the Program’s ability to institute participants’ desired changes.
3(e) General Guidelines
The EPA SunWise Program will adhere to all OMB guidelines.
3(f) Confidentiality
Names of participating schools and organizations may be made public. All names of registered educators and other participating individuals will remain confidential. All surveys are completed on an anonymous basis (no identifying information is included on the survey forms), and responses to the collection of survey information will remain anonymous and confidential. A contractor will analyze survey results and thereafter return all completed surveys to EPA. For the certification program, users' first initial and last name will not be collected; therefore, responses are anonymous and completely unlinked to identifying information. EPA will make public how many individuals have taken and completed the certification program. Additionally, EPA or a contractor will analyze the certification program responses to enhance the certification program and create new materials for that audience if needed.
3(g) Sensitive Questions
The survey instruments of this ICR contain no sensitive questions.
4. The Respondents and the Information Requested
4(a) Respondent/SIC and NAICS Codes
Entities potentially affected by this action are elementary, middle, and high school students and educators (SIC Div. I: Group 8211, NAICS code: 61111).
4(b) Information Requested
The registration form (Attachment 1a and 1b, also available at www.epa.gov/sunwise/becoming.html and www.epa.gov/sunwise/becoming_partner.html) is a simple, 10-minute questionnaire that asks teachers to provide: the name and contact information of the participating school; school composition (e.g. grade levels); and information specific to the interest areas of the registering teacher. The purpose of this form is to ensure that EPA distributes the most relevant education materials to all SunWise participants.
The survey instruments covered under this ICR are as follows:
The SunWise Student Survey (Attachment 2): This survey will be administered to participating students before and after implementation of SunWise activities. The pre-test and post-test surveys are nearly identical in content, with the exception of one question in the post-test aimed at verifying that the student has participated in SunWise. This simple, 10-minute questionnaire elicits basic information on knowledge, attitudes, and practices of children relating to sun exposure. The survey was derived from an earlier SunWise instrument, which was previously approved by OMB (Control No. 2060-0439). After using the same SunWise student survey for nearly a decade, EPA has taken a fresh look at the survey to ensure that it is the best tool for measuring the effectiveness of the SunWise program. The survey was reviewed by skin cancer, educational, child development, and survey experts, and revisions included removing questions that were no longer considered relevant or important, adding new questions that address sun exposure and sun safety knowledge and attitudes, and rewording questions to make them more understandable for children. The survey instrument has been pilot tested with nine students aged 9 to 12 years old and timed to confirm that it can be completed within 10 minutes. The pilot testing focused on the readability and understandability of the questions and possible responses; following the pilot test, the survey was revised to: (1) include instructions for students to turn over the two-page, double-sided survey; (2) increase the font of the multiple choice instructions; (3) put all questions referring to “last summer” together in a box at the end of the survey; (4) delete one question that students found difficult; (5) revise the wording of several questions to clarify question meaning; (6) add a new response choice for why students do not wear sunscreen; and (7) increase the response scale for several questions from a three-point to a five-point scale. (Please note that during discussions with OMB for the initial ICR approval, it was determined that an abbreviated burden statement would be appropriate for this survey group.)
Teacher Evaluation of Classroom Activities (Attachment 3): Teachers will be asked to evaluate student receptivity to sun safety lessons and Internet learning. Teacher feedback about the usefulness of classroom and school materials will be vital to the refinement of sun safety education materials. This information can also be submitted online. (2 pages)
SunWise Sun Safety Certification Questions (Attachment 4): Certification program users will be asked to provide their first and last name so they can be given a certificate of completion with their name on it. This information will not be collected by EPA. Additionally, users will be asked a series of questions to determine their current sun protection knowledge, attitudes, and behaviors, and their receptivity to sun protection generally. The time to enter this information will be minimal, approximately 7 minutes. The questions being asked will also help educate users by reminding them of their own behavior in comparison to the desired behavior (practicing sun safety). Part B(ii) of the Supporting Statement provides additional information on the certification program survey instrument design.
Registration forms can be submitted electronically or in hard copy form using envelopes provided by EPA. Likewise, teachers are given postage-paid envelopes in which to return their completed surveys, along with the surveys returned to them by the students and their parents. The teacher survey is also available electronically. Neither the registration nor the surveys require that respondents keep records, make photocopies, or maintain files.
5. The Information Collected
5(a) Agency Activities
The Agency activities associated with registration of participants done through the SunWise Program consist of the following:
Maintain participant database;
Maintain mailing list for information distribution purposes.
The Agency activities associated with surveying done through the SunWise Program consist of the following:
Develop collection instruments;
Answer respondent questions;
Audit and/or review data submissions;
Reformat and distribute the data;
Store the data.
The Agency activities associated with the certification program done through the SunWise Program consist of the following:
Develop certification program;
Store and consolidate data, none of which is sensitive or personal;
Review consolidated data;
Refine certification program and other materials accordingly.
5(b) Collection Methodology and Management
In collecting and analyzing the information associated with this ICR, EPA will use electronic and hard-copy registration forms, mailed and electronic surveys, and electronic records of certification program inputs. EPA plans to review the efficiency of using an electronic bulletin board or the Internet to facilitate the transfer of information between EPA and potential or active program participants.
Further details on the collection methodology and management for the surveying and the certification program done through the SunWise Program are provided below.
Surveying
EPA routinely promotes the SunWise Program through presentations and exhibits at meetings of nurses and teachers. Registrants provide their name and contact information, including the name of their school, and state whether they are a classroom teacher, health teacher, gym teacher, or school nurse. To determine changes in student sun safety behaviors from one summer to the next, EPA will recruit faculty who have the greatest likelihood of having almost all of their students for two consecutive school years: health teachers, nurses, and physical education teachers. EPA will send a recruitment letter to health and physical education teachers and nurses in schools in all 50 states that have recently registered for the Program (e.g., within the last year), or approximately 1,000 schools. Interested faculty will agree to take part in the study. It is expected that 30 of these 1,000 recruited schools will eventually complete pre-tests and post-tests.
Schools will be randomly assigned to teach SunWise in the spring or to a delayed intervention in the spring of the following year. The survey process is exactly the same for schools in the intervention and delayed intervention groups. Schools will be matched and then randomly assigned to the intervention or delayed intervention group. Schools will be matched based on (1) information on the number of students each school plans to teach (obtained through the recruitment letter) and (2) the socioeconomic status (SES) of the school (determined through census data from the census tract in which the school is located). These variables will be used to balance schools by SES and to ensure rough parity in the number of children in the intervention and delayed intervention groups. Schools will first be placed into one of four groups: 1) higher number of students participating (75+) and higher SES, 2) lower number of students participating (<75) and higher SES, 3) higher number of students participating and lower SES, and 4) lower number of students participating and lower SES. Within each of these four groupings, schools will then be randomized into either the intervention group or the delayed intervention group. Within a month of receipt of their agreement to participate, we will notify schools of their study assignment to either an intervention or delayed intervention school.
Teachers in the intervention schools will provide children with a double-sided, one-page anonymous survey instrument. After students complete the pre-test in the spring, teachers will lead the SunWise lessons. Teachers from schools that mail back completed pre-tests and acknowledge that they have taught SunWise lessons will be mailed post-tests for completion in the spring of the following year.
Teachers in the delayed intervention schools will provide children with a double-sided, one-page anonymous survey instrument. Students will complete the pre-test in the spring. Schools that complete pre-tests will be mailed post-tests for completion in the spring of the following year. Upon completion of the post-test survey, teachers in the delayed intervention schools will lead the SunWise lessons.
In each survey cycle, children in intervention and delayed intervention schools will complete only two surveys, each taking 10 minutes to complete. Their participation in the SunWise survey process ends with completion of the second survey. If children require any help with reading the survey, teachers will assist them.
All surveys are anonymous. Since surveys are anonymous, no specific information on the child can be reported to parents or school staff. Surveys are done in the classroom setting, and conducted by the teachers; thus, it would not be feasible for the teachers to obtain consent from the parents and assent from the children for a classroom teaching tool.
EPA will provide self-addressed, stamped envelopes to respondents.
EPA’s contractor (ICF International, with its subcontractor Boston University Schools of Medicine and Public Health) will make a one-time payment of $150.00 for each school, contingent upon receipt of the second set of surveys (i.e., post-tests). UV-sensitive bracelets will also be provided to participating students.
EPA will ensure the accuracy and completeness of collected information by having all surveys reviewed by the contractor. An annual statistical report will be developed in consultation with the contractor.
Certification
EPA will collect information as participants take the certification program. Many of the questions will also help reinforce the information participants are learning through the certification program. Part B(ii) of the Supporting Statement presents more detailed information on the data collection, management, and analysis methods for the certification program.
EPA plans to periodically review data collected from the certification program and make refinements to the program as necessary. The knowledge gained through this information collection will inform programmatic decisions and allow EPA to gain a better understanding of the target audience, in order to determine whether additional intervention is needed in the outdoor recreation setting. While the results cannot be generalized to the broader outdoor recreation staff population because of the limitation of self-selection, the information will still be useful and will be shared with partners and the public to improve the tailoring of interventions to the outdoor recreation audience.
5(c) Small Entity Flexibility
Not applicable.
5(d) Collection Schedule
All educators are required to register for the Program if they wish to receive the Tool Kit (www.epa.gov/sunwise/becoming.html and www.epa.gov/sunwise/becoming_partner.html) and regular program updates.
Student surveying is done using random sampling. In year 1, SunWise partner schools will administer the student pre-test to Group A during the spring. In year 2, the post-test for Group A and the pre-test for Group B will both be administered to students in the spring. Finally, in the spring of year 3, SunWise partner schools will administer the final post-test to Group B. The teacher surveys are administered once during the school year on a voluntary basis, so all educators registered in SunWise have the opportunity to complete a survey. No specific completion dates are given to the respondents. Respondents are requested to submit the surveys during the school year in which they were administered. Data analysis will occur during the summer months following post-implementation survey administration, with results available by October of each year.
All educators and students are required to provide their first initial, last name and user type in order to take part in the on-line certification program. Because participation in the certification program is voluntary, self-selection is the only feasible sampling method available to the Agency. For more information on sample selection for the certificate program, please see Part B(ii) of the Supporting Statement. Data will be collected on an ongoing basis for a period of one year, and then the need for questions in the certification program will be reassessed.
6. Estimating the Burden and Cost of the Collection
6(a) Estimating Respondent Burden
EPA developed the SunWise Program Registration Form with the Agency's Internet Support Team in Research Triangle Park, North Carolina. Input from a five-person focus group was used to determine the average completion time. Teachers are asked to complete the registration form only once during their participation in the program, for a total registrant burden of 10 minutes.
Annual estimated respondent burden:
Annual Respondent Burden - Registration
Registrant Group | Hour Burden
Educator | 0.17
During the development phase of the surveys, EPA, in consultation with the contractor, conducted a pilot survey with nine respondents to determine appropriate content and survey completion time. The student survey will be administered once in years 1 and 3 (i.e., pre-test for Group A and post-test for Group B) and twice in year 2 (i.e., post-test for Group A and pre-test for Group B). Each survey will take approximately 10 minutes to complete, for an annual per-student burden of 10 minutes. The teacher survey is administered one time each year and takes approximately 20 minutes to complete.
Annual estimated respondent burden:
Annual Respondent Burden - Surveys
Survey Group | Hour Burden
Student | 0.17
Educator | 0.33
Users will be asked a series of questions about sun protection to determine their demographics (no personal identifiers will be captured), knowledge, attitudes, behavior, perceptions of others they work with, and the environmental conditions in the place where they work. They will also be asked to enter their first and last names so that the names can appear on a printable certificate; this information will not be saved by EPA. The total respondent burden is estimated to be 7 minutes.
Annual estimated respondent burden:
Annual Respondent Burden - Certification Program
Survey Group | Hour Burden
Student | 0.12
Educator | 0.12
6(b) Estimating Respondents Costs
Bureau of Labor Statistics figures (http://stats.bls.gov/news.release/ecec.t02.htm) were used to determine labor costs for these tables. To account for benefits and overhead, the average hourly wage rate of $37.08 for a teacher was increased by 110 percent, yielding a loaded labor cost of $77.87 per hour for teachers.
Annual Respondent Burden and Cost - Registration
Registrant Group | # of responses per participant | Hour Burden | Labor Cost
Educator | 1 | 0.17 | 0.17 * $77.87 = $13.24
Annual Respondent Burden and Cost - Surveys
Survey Group | # of responses per participant | Hour Burden | Labor Cost
Student | 1 | 0.17 | 0.17 * $0 = $0
Educator | 1 | 0.33 | 0.33 * $77.87 = $25.70
Annual Respondent Burden and Cost - Certification Program
Survey Group | # of responses per participant | Hour Burden | Labor Cost
Student | 1 | 0.12 | 0.12 * $0 = $0
Educator | 1 | 0.12 | 0.12 * $77.87 = $9.34
The respondents will have no capital/startup or O&M costs.
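For reference, the loaded wage rate and the per-response labor costs in the tables above follow from simple arithmetic. The short Python sketch below is illustrative only and simply reproduces those figures:

# Illustrative reproduction of the respondent labor cost arithmetic in section 6(b).
base_wage = 37.08                  # average hourly teacher wage from the BLS table cited above
loaded_wage = base_wage * 2.10     # increased by 110% to cover benefits and overhead
print(round(loaded_wage, 2))       # 77.87

# Per-response labor costs (student time is not monetized, so student cost is $0)
registration_cost = 0.17 * loaded_wage    # about $13.24 per registration (10 minutes)
teacher_survey_cost = 0.33 * loaded_wage  # about $25.70 per teacher survey (20 minutes)
certification_cost = 0.12 * loaded_wage   # about $9.34 per certification session (7 minutes)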
6(c) Estimating Agency Burden and Cost
Registration information collection is done primarily through a website database feature. The start-up cost associated with designing the registration web page was approximately $25,000. Maintenance of the website is estimated to take 4 hours/month, or 48 hours per year, and will be conducted by in-house EPA employees. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($52.61/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6), making the total annual cost $2,525.28. Finally, EPA will manually input all information received via hard-copy registration forms into the database. This data entry is estimated to require 960 hours per year at a Senior Environmental Employee (SEE) level three pay rate ($15.48/hour, using the salary associated with this level, $11.06, multiplied by a benefits factor of 1.4). Total hours (960) multiplied by $15.48 per hour amounts to a total agency grantee cost of $14,860.80 per annum.
Agency Burden and Costs - Registration
| Burden Hours | Total Costs ($)
EPA (Annual) | 1,008 | $17,386.08
EPA (3-Year ICR) | 3,024 | $52,158.24
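For reference, the annual registration burden and cost figures in the table above can be reproduced with the short Python sketch below; it is illustrative only and uses the hourly rates stated in the preceding paragraph:

# Illustrative reproduction of the annual agency burden and cost for registration.
gs13_rate = 52.61                  # GS-13 Step 1 hourly rate, including the 1.6 benefits factor
see_rate = round(11.06 * 1.4, 2)   # SEE level three rate with a 1.4 benefits factor -> 15.48

website_hours = 4 * 12             # website maintenance at 4 hours/month
data_entry_hours = 960             # manual entry of hard-copy registration forms

total_hours = website_hours + data_entry_hours      # 1,008 hours
total_cost = website_hours * gs13_rate + data_entry_hours * see_rate
print(total_hours, round(total_cost, 2))            # 1008 17386.08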
The contractor (ICF International, with its subcontractor the Boston University Schools of Medicine and Public Health) assists EPA in data collection and analysis. The contractor also provided technical support in the development of the surveys. To perform these functions, EPA has contracted for a total of 1,262 professional hours over a two-year period. At an average rate of $79.12 per hour, the total cost for the contractor is about $49,925 annually. Agency burden to manage this contract is estimated at 4 hours/month, or 48 hours annually. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($52.61/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6). Total hours (48) multiplied by $52.61 per hour amounts to a total agency labor cost of $2,525.28 per annum.
Agency Burden and Costs - Surveying
| Burden Hours | Total Costs ($)
EPA (Annual) | 679 | $52,450.00
EPA (3-Year ICR) | 2,037 | $157,350.00
The contractor (Information Experts) will create and maintain the certification program, including the data collection component. To perform this function, EPA has contracted for a total of 410 professional hours in the first year. At an average rate of $130.00 per hour, the total cost for the contractor is $53,300 in startup costs in the first year, plus 100 hours per year at a cost of $13,000 for maintenance in each subsequent year. Agency burden to manage this contract is estimated at 4 hours/month, or 48 hours annually. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($52.61/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6). Total hours (48) multiplied by $52.61 per hour amounts to a total agency labor cost of $2,525.28 per annum.
Agency Burden and Costs - Certification Program
| Burden Hours | Total Costs ($)
EPA (Annual) - Year 1 | 458 | $55,825.28
EPA (Annual) - Years 2 and 3 | 148 | $15,525.28
EPA (3-Year ICR) | 754 | $86,875.84
6(d) Estimating the Respondent Universe and Total Burden Costs
Registration
(A) Number to register | (B) Total Hours | (C) Rate per hour ($) | (D) # of responses | (E) Total Cost (E = B * C)
3,500 Educators | 595 | $77.87 | 3,500 | $46,332.65
Total (Annual) | 595 | | 3,500 | $46,332.65
ICR Total (3 years) | 1,785 | | 10,500 | $138,997.95
Student and Teacher Surveys
(A) Number to be surveyed | (B) Total Hours | (C) Rate per hour ($) | (D) # of responses | (E) Total Cost (E = B * C)
3,000 Students in Years 1 and 3 | 500 | $0 | 3,000 | $0
6,000 Students in Year 2 | 1,000 | $0 | 6,000 | $0
1,000 Educators | 333 | $77.87 | 1,000 | $25,930.71
Average Total (Annual) | 1,083 | | 5,500 | $25,930.71
ICR Total (3 years) | 2,999 | | 15,000 | $77,792.13
Certification Program Questions
(A) Number to Take Certification Program | (B) Total Hours | (C) Rate per hour ($) | (D) # of responses | (E) Total Cost (E = B * C)
100 Students | 12 | $0 | 100 | $0
1,000 Educators | 120 | $77.87 | 1,000 | $9,344.40
Total (Annual) | 132 | | 1,100 | $9,344.40
ICR Total (3 years) | 396 | | 3,300 | $28,033.20
Total
ICR Total - Registration + Surveys + Certification Program (average annual) | 1,810 hours | 10,100 responses | $81,607.76
ICR Total - Registration + Surveys + Certification Program (3 years) | 5,180 hours | 28,800 responses | $244,823.28
6(e)Bottom Line Burden Hours and Cost Tables
Bottom Line Burden and Costs (3-Year ICR)
| Burden Hours | Total Costs ($)
Students | 2,036 | $0
Educators | 3,144 | $244,823.28
EPA | 5,815 | $296,384.08
Subtotal (respondents) | 5,180 | $244,823.28
Subtotal (government) | 5,815 | $296,384.08
Total | 10,995 | $541,207.36
Bottom Line Burden and Costs (Average Annual)
| Burden Hours | Total Costs ($)
Students | 679 | $0
Educators | 1,048 | $81,607.76
EPA | 1,938 | $98,794.69
Subtotal (respondents) | 1,727 | $81,607.76
Subtotal (government) | 1,938 | $98,794.69
Total | 3,665 | $180,402.45
6(f) Reasons for Change in Burden
Hours were added for the new certification program. More hours for EPA are also anticipated for the survey work, due to an increased level of sophistication in the analysis, and for the registration of educators. Combined with inflation and the switch to a new contractor, the total burden for EPA has increased. Additionally, because of the increase in the amount of time between the student survey pre- and post-tests (one year now, as opposed to 1-3 months), student burden hours have decreased. As a result, there will be two pre- and post-test sets, instead of three sets as in previously approved SunWise ICRs.
6(g) Burden Statement
The annual public reporting and record keeping burden for this collection of information is estimated to average 10 minutes per response for the registration, 10 minutes per response for the student survey and 20 minutes per response for the teacher survey, and 7 minutes per response for the certification program. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, disclose, or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information; processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB control numbers for EPA’s regulations are listed in 40 CFR Part 9 and 48 CFR Chapter 15.
Part B(i) of the Supporting Statement
SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES
1(a) Survey Objectives
EPA’s SunWise Program provides sun protection education via a standardized curriculum to school children in grades K-8 in public, parochial, and charter schools. More than 14,000 schools and 500,000 children have received SunWise education. EPA proposes to test the effectiveness of the program via anonymous surveys of children exposed and not exposed to the SunWise curriculum. The student survey will identify sun safety knowledge, attitudes, and behaviors among students before and after participation in the Program; the data will be analyzed and results will indicate the Program’s effect on these outcomes.
1(b) Key Variables
Key variables considered in this survey are the age of the students being surveyed, as well as their geographical location, depending upon whether they live in a cooler or a warmer climate. In addition, an important variable that could influence the survey results is whether students have received sun safety education in the past, including through the SunWise Program. A question in the survey asks students whether they have received sun safety education previously.
1(c) Statistical Approach
The primary objective in conducting the SunWise student survey is to measure the effectiveness of the program in changing students' sun safety knowledge, practices, and attitudes. It is not practical to survey every student who participates in the SunWise Program, however, so EPA is surveying a subset of students in a way that allows the Agency to draw conclusions about the group as a whole from the responses received. Anecdotal information is not sufficient for this purpose, and thus EPA has chosen a statistical approach for this surveying.
The survey asks a series of questions about students' sun safety knowledge, attitudes, and behaviors. EPA intends to survey schools multiple times during the period for which this ICR is in effect: twice (pre-test and post-test) per survey cycle. An analysis of the first set of survey results (pre-test) will give an indication of students' sun safety practices at baseline, and the second set of survey results (post-test) can be compared to baseline to draw statistically valid conclusions about the change in students' sun safety practices during the period between the pre-test and post-test surveys.
1(d) Feasibility
EPA has reviewed the administrative procedures necessary to conduct the SunWise student survey and has determined that it is feasible to continue with the survey. The survey was reviewed by child education and survey specialists to ensure that the questions asked will reveal sufficient information to evaluate the effectiveness of the SunWise Program. In addition, EPA has funding to conduct the survey and provide the necessary statistical analysis of the resulting data.
SECTION II – SURVEY DESIGN
2(a) Target Population and Coverage
The sample population will be chosen from school faculty that have registered for the SunWise Program in the previous year; these faculty span the United States.
2(b) Sample Design
School faculty register for the SunWise program through EPA. Registrants provide their name and contact information, including the name of their school, and state whether they are a classroom teacher, health teacher, gym teacher, or school nurse. Recruitment letters will be sent to health teachers, nurses, and gym teachers, since they are the most likely to have access to almost all of the same students for two consecutive school years. However, some students will not participate in the post-test because they have moved to another school or because they did not participate in the pre-test. The sample frame includes all school faculty who have registered for the SunWise Program in the previous year.
2(b)ii Sample Size
EPA anticipates distributing a survey to 60 school faculty (representing approximately 3,000 students). Although the great majority of students will be the same in both years, there is no feasible way to identify which pre- and post-tests were completed by the same student. Therefore, each completion is treated as separate for the purpose of determining sample size. It is expected that students in 30 schools will complete pre-tests and post-tests. The number of students completing pre-tests and post-tests per school will range from one classroom per grade (n=25 students) to as many as four classes (n=100 students). On average, this yields an estimated 2 classes/school x 25 students/class x 30 schools x 2 survey years = 3,000 completed surveys. This sample size is sufficient to test the primary and secondary outcomes.
2(b)iii Stratification Variables
None.
2(b)iv Sampling Method
Recruitment letters will be mailed to gym teachers, health teachers, and school nurses who have registered for the SunWise Program in the previous year. Schools that agree to participate will be matched and then randomly assigned to the intervention or delayed intervention group. Schools will be matched based on (1) information on the number of students each school plans to teach (obtained through the recruitment letter) and (2) the socioeconomic status (SES) of the school (determined through census data from the census tract in which the school is located). These variables will be used to balance schools by SES and to ensure rough parity in the number of children in the intervention and delayed intervention groups. Schools will first be placed into one of four groups: 1) higher number of students participating (75+) and higher SES, 2) lower number of students participating (<75) and higher SES, 3) higher number of students participating and lower SES, and 4) lower number of students participating and lower SES. Within each of these four groupings, schools will then be randomized into either the intervention group or the delayed intervention group. Within a month of receipt of their agreement to participate, we will notify schools of their study assignment to either an intervention or delayed intervention school.
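The matched randomization described above can be illustrated with a short Python sketch. This is a minimal illustration only; the field names, the SES indicator, and the example schools are hypothetical and not part of the actual protocol:

import random

def assign_schools(schools, seed=2009):
    """Place schools into four strata by planned participation (75+ vs. <75) and SES,
    then randomize each stratum to the intervention or delayed intervention group."""
    rng = random.Random(seed)
    strata = {}
    for school in schools:
        size_group = "high_n" if school["planned_students"] >= 75 else "low_n"
        ses_group = "high_ses" if school["high_ses"] else "low_ses"
        strata.setdefault((size_group, ses_group), []).append(school)

    assignments = {}
    for members in strata.values():
        rng.shuffle(members)
        half = len(members) // 2
        for school in members[:half]:
            assignments[school["name"]] = "intervention"
        for school in members[half:]:
            assignments[school["name"]] = "delayed intervention"
    return assignments

# Hypothetical example: two larger high-SES schools and two smaller low-SES schools.
schools = [
    {"name": "School A", "planned_students": 100, "high_ses": True},
    {"name": "School B", "planned_students": 90, "high_ses": True},
    {"name": "School C", "planned_students": 40, "high_ses": False},
    {"name": "School D", "planned_students": 50, "high_ses": False},
]
print(assign_schools(schools))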
Inclusion criteria: The child must attend the school to which surveys are sent; must be in school on the day the surveys are completed; must be between the ages of 8 and 14; and must be English-speaking. All teachers or nurses who contact EPA are eligible.
Exclusion criteria: EPA will exclude anyone who is not in school on the day the surveys are completed, who does not read English, or who is younger than 8 or older than 14 years of age.
2(b)v Multi-Stage Sampling
This will be a between-group comparison of pre-post changes in two cross-sectional surveys completed at 12-month intervals. The comparison will test whether SunWise lessons affect children's rates of sunburning and sun protection practices.
2(c) Precision Requirements
2(c)i Precision Targets
With the proposed sample size, the primary outcome (changes in sunburning) can be detected with >90% power. The secondary outcome (changes in sun protection habits) can be detected with >95% power.
2(c)ii Nonsampling Error
It is expected that the largest nonsampling error will be the result of nonresponse; a sufficiently large initial sample size has been selected to allow for 50% nonresponse, while still achieving the desired precision.
2(d) Questionnaire Design
The survey was derived from a SunWise instrument previously approved by OMB on November 2, 2001 (ICR #1904.01). The survey was updated based on pilot testing with nine children aged 9 to 12 years old.
SECTION III – PRETESTS AND PILOT TESTS
To pilot test the SunWise student survey, a representative school was selected in the District of Columbia School District, and from that school, nine students aged 9 to 12 years old were randomly selected from among those who completed permission slips. All nine students completed the survey and then participated in an interview with two staff from EPA's contractor, ICF International. All nine pretest respondents found the survey to be relatively easy to understand and complete. In addition, the time it took each respondent to complete the survey (less than 10 minutes) was not considered overly burdensome.
The pilot testing focused on the readability and understandability of the questions and possible responses; following the pilot test, the survey was revised to: (1) include instructions for students to turn over the two-page, double-sided survey; (2) increase the font of multiple choice instructions; (3) put all questions referring to “last summer” together in a box at the end of the survey; (4) delete one question that students found difficult; (5) revise the wording of several questions to clarify question meaning; (6) add a new response choice for why students do not wear sunscreen; and (7) increase the response scale for several questions from a three-point to a five-point scale.
SECTION IV – COLLECTION METHODS AND FOLLOW-UP
4(a) Collection Methods
All surveys are anonymous. Surveys are administered in the classroom setting, and conducted by the teachers; thus, it would not be feasible for the teachers to obtain consent from the parents and assent from the children for a classroom teaching tool. The Boston University and ICF/Caliber Institutional Review Boards have both approved waiving the requirements for informed consent. Waiving informed consent presents no more than minimal risk to study participants because all surveys are anonymous, and thus no specific information on a child can be reported to parents or school staff.
Survey participants will respond by returning their completed questionnaire through the U.S. Postal Service using a self-addressed, postage-paid envelope supplied by EPA’s contractor in the survey package.
4(b) Survey Response and Follow-Up
The target response rate is approximately 50 percent. The actual response rate will be measured as the number of schools that submit surveys divided by the number of schools that received surveys. Follow-up emails and telephone calls will be made to all survey recipients who do not submit surveys before the end of the school year (mid-June). These follow-up contacts will explain the importance of the survey and strongly encourage recipients to submit their surveys. Each survey form will be assigned an identifying number to facilitate the tracking of responses from each school, in the event that teachers do not submit surveys in the provided self-addressed envelopes.
SECTION V – ANALYZING AND REPORTING SURVEY RESULTS
5(a) Data Preparation
All survey data will be entered into a database, including surveys with questions that have not been completed. A double-entry protocol will be observed throughout data entry to ensure accuracy.
5(b) Analysis
The data obtained through this survey will be aggregated and analyzed for the purpose of determining changes in students' knowledge, attitudes, practices, and intended practices. Many of the same students will complete surveys in consecutive school years; others will complete surveys in only one of the two school years. Power calculations will be based on 750 students in the intervention schools and 750 students in the delayed intervention schools completing surveys.
As students go from one grade to the next, some students will leave the school and others will attend one of the study schools for the first time. Students could also be absent on one of the days that surveys are being conducted. In the past, schools have asked evaluators not to exclude children from participating in surveys. Therefore, all students will be allowed to complete post-test surveys, but a check-off box will be used to ask them whether they completed the survey the year before. Using the check-box responses, a secondary analysis can be restricted to students who completed both the pre-test and the post-test. Teaching will be encouraged in all grades in intervention schools, but to avoid substantial drop-out, teachers will be asked not to survey students in the school's graduating grade during the pre-test period, since those students will no longer be in the same school the following year.
Since the surveys are anonymous, it is not possible to link individual students’ answers from both surveys. Therefore, the measure of the effect of the program is not the change in individual student responses but rather the difference between the pretest and the posttest in the intervention vs. delayed intervention schools. There is also an aging effect but this will be the same in both intervention and delayed intervention schools.
Primary outcome: change in sunburning. Students will be asked if they received sunburns during the past summer and will be given the options of none, 1-2, or 3+. With 30 schools and an estimated 750 children in the intervention and 750 children in the delayed intervention schools completing surveys, there is >90% power to detect a difference in change from 60% of children reporting at least one sunburn (in both conditions at pre-test) to 50% sunburning in intervention schools (at post-test) while remaining at 60% in delayed intervention schools.
Secondary outcome: change in sun protection habits, including hats, sunscreen, long-sleeve shirts, and sunglasses. For each of these 4 sun protection variables, we expect that 20% of children will report routine practice for each of these variables. There is >95% power to detect a change from 20% to 30% from pre- to post-test in intervention schools, while remaining at 20% in delayed intervention schools.
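As a point of reference, the power figures above can be approximated with a standard two-sample comparison of proportions. The Python sketch below is illustrative only: it uses a normal approximation, ignores school-level clustering and the pre/post structure of the design, and is not the contractor's exact calculation:

import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def two_proportion_power(p1, p2, n_per_group, z_alpha=1.96):
    # Approximate power of a two-sided two-sample z-test for proportions (alpha = 0.05).
    p_bar = (p1 + p2) / 2.0
    se_null = math.sqrt(2.0 * p_bar * (1.0 - p_bar) / n_per_group)
    se_alt = math.sqrt(p1 * (1 - p1) / n_per_group + p2 * (1 - p2) / n_per_group)
    return normal_cdf((abs(p1 - p2) - z_alpha * se_null) / se_alt)

# Primary outcome: 60% vs. 50% of students reporting at least one sunburn, 750 per group.
print(round(two_proportion_power(0.60, 0.50, 750), 2))   # ~0.97, i.e., >90% power
# Secondary outcome: 20% vs. 30% routinely practicing a sun protection habit, 750 per group.
print(round(two_proportion_power(0.20, 0.30, 750), 2))   # ~0.99, i.e., >95% power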
EPA's contractor will conduct the same analyses for both primary and secondary outcomes. First, EPA's contractor will calculate, separately for intervention and delayed intervention schools, the difference and the standard error of the difference between pretest and posttest in the proportion of students reporting each outcome. EPA's contractor will then calculate the difference (with 95% confidence intervals) in the size of pre-post changes between the two groups of schools and conduct a Z-test (alpha = .05) of this difference as a measure of the effect of the intervention. If there are differences between the intervention and delayed intervention schools in the age distribution of students, EPA's contractor will age-standardize all proportions using the overall student age distribution as the standard.
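A minimal sketch of this pre-post change comparison appears below (Python, illustrative only). It treats the four survey waves as independent cross-sections and omits the age standardization and clustering adjustment discussed in this section; the example proportions are hypothetical:

import math

def change_difference(pre_i, post_i, n_pre_i, n_post_i,
                      pre_d, post_d, n_pre_d, n_post_d):
    # Difference between the intervention and delayed intervention pre-post changes
    # in a proportion, with an approximate 95% confidence interval and z statistic.
    diff = (post_i - pre_i) - (post_d - pre_d)
    var = (pre_i * (1 - pre_i) / n_pre_i + post_i * (1 - post_i) / n_post_i +
           pre_d * (1 - pre_d) / n_pre_d + post_d * (1 - post_d) / n_post_d)
    se = math.sqrt(var)
    ci = (diff - 1.96 * se, diff + 1.96 * se)
    return diff, ci, diff / se

# Hypothetical example: sunburn rates fall from 0.60 to 0.50 in intervention schools
# and stay at 0.60 in delayed intervention schools, with 750 students per survey wave.
diff, ci, z = change_difference(0.60, 0.50, 750, 750, 0.60, 0.60, 750, 750)
print(round(diff, 2), [round(x, 3) for x in ci], round(z, 2))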
Because schools are the primary sampling unit, students within a school may have responses more similar to one another than expected for a random group of students. To the extent that this clustering exists, standard analyses would produce a standard error that underestimated the true variability. Therefore, EPA’s contractor will use statistical procedures (e.g., SAS PROC SURVEYMEANS) specifically designed to accommodate such a sampling design.
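For intuition about the size of this clustering effect, a common back-of-the-envelope adjustment inflates the variance by a design effect of 1 + (m - 1) * ICC, where m is the average number of students per school and ICC is the intraclass correlation. The sketch below is illustrative only; the ICC value is an assumption, and the actual analysis would rely on the survey procedures named above rather than this shortcut:

import math

students_per_school = 50    # roughly 750 students spread across 15 schools per group (illustrative)
icc = 0.02                  # assumed intraclass correlation (placeholder value)

design_effect = 1 + (students_per_school - 1) * icc
naive_se = 0.036            # standard error from the unclustered sketch above
adjusted_se = naive_se * math.sqrt(design_effect)
print(round(design_effect, 2), round(adjusted_se, 3))   # 1.98 0.051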
5(c) Reporting Results
The results of the survey will be written up in a summary report, which may also be prepared for publication in a peer-reviewed academic journal. The raw survey data will be maintained by EPA's contractor and will remain unavailable to the public.
Part B(ii) of the Supporting Statement
SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES
1(a) Survey Objectives
To expand the SunWise Program beyond the formal classroom, EPA has developed an online certification program, with embedded questions, to educate outdoor recreation staff who supervise teens and pre-teens about the importance of sun safety. Such staff include camp counselors, swim instructors, and parks and recreation staff/educators. Through this voluntary certification program, EPA aims to educate these staff about the importance of sun safety for themselves and for the youth in their care. This can be accomplished in a number of ways, including smart scheduling, positive role modeling, and other policy changes at individual camps, pools, or centers.
Upon completion, each user will be able to print out his or her certificate of accomplishment for his or her supervisor. Camps will then be able to state that their staff have completed the EPA Sun Safety Certification, meaning they understand the importance of sun safety and know how to prevent sun damage in children.
A series of survey questions is included throughout the certification program and must be answered to advance through and complete it. EPA intends to meet two main objectives through these survey questions. The primary objective is to determine the current sun protection knowledge, attitudes, practices, intended practices, and teaching habits of outdoor educators, as well as basic demographic information (e.g., age, gender, education). Responses to the certification questions will also identify the environmental conditions (i.e., policies) already in place to minimize UV damage to staff and visitors, and the perceived cultural norms of camp staff. This information will help inform SunWise program decisions, such as framing sun safety messages and developing materials for pre-service teachers (who are close in age to camp/outdoor recreation staff) and outdoor educators to promote sun protection outside of the formal classroom.
The secondary objective is to gather feedback on the usefulness of the certification program. This feedback will help the Agency determine if online media is an effective method of teaching sun safety lessons, and whether additional resources are needed for outdoor recreation facilities.
1(b) Key Variables
Key variables considered in this certification program are demographic information about the educator (e.g., age, gender, education), the age range of children being supervised by the educator, the educator’s sun safety practices and intended practices, and policies related to sun safety implemented by the educator’s program or organization.
1(c) Statistical Approach
The primary objective of the certification program questions is to measure the sun protection attitudes, practices, intended practices, and teaching habits of outdoor educators. Every outdoor educator who participates in the certification program will be asked to respond to the questions built into the online program. It is not practical, however, to require all outdoor educators to participate, so participation will be voluntary and self-selecting. From this group, the Agency will not be able to draw conclusions about outdoor educators in general, but it will be able to gain some insight into this target audience. Anecdotal information is not sufficient for this purpose, and thus EPA has chosen a standardized questionnaire-based approach for this survey.
The certification program asks a series of questions about outdoor educators’ sun protection attitudes, practices, intended practices, and teaching habits. EPA intends to have each educator complete the survey only once during the period for which this ICR is in effect. An analysis of these results will give a snapshot of outdoor educators’ behaviors and will help the Agency better understand the challenges associated with promoting sun safety in outdoor education programs.
1(d) Feasibility
EPA has determined that an online, voluntary, self-selecting certification program is a feasible way to gather information regarding sun protection in an outdoor education setting. In addition, EPA has developed the online certification program, and has funding to collect and review responses and perform necessary updates to the program.
SECTION II – SURVEY DESIGN
2(a) Target Population and Coverage
The target population consists of outdoor recreation staff members who supervise teens and pre-teens. Because the certification program is self-selecting and voluntary, the coverage of this population is not known.
2(b) Sample Design
2(b)i Sampling Frame
The sampling frame is all U.S. outdoor recreation staff members at programs included in the mailing lists from the American Camp Association (ACA) and the National Recreation and Park Association (NRPA). The sampling frame will be identified by the purchase of these e-mail lists.
Survey participants will be recruited via a targeted email to all programs in the ACA and NRPA lists, which will direct participants to the EPA website. Advertisements for the certification program will also be placed on the EPA SunWise Web site and ACA and NRPA Web sites. In addition, participants will be recruited at conferences related to sun safety and education, and EPA may also reach out to sports coaching associations for participant recruiting.
2(b)ii Sample Size
Because the certification program is self-selecting and voluntary, it is not possible to pre-determine a specific sample size. However, EPA anticipates about 1,000 outdoor recreation staff members will complete the certification program.
2(b)iii Stratification Variables
None.
2(b)iv Sampling Method
Because participation in the certification program is voluntary, the sampling method is voluntary self-selection. Because the survey questions must be answered to advance through the certification program, all outdoor educators who elect to participate will necessarily complete the survey. There is sampling bias associated with self-selection at both the individual and the organizational level, as camps or programs may require all of their educators to take the certification program; however, because participation is voluntary, self-selection is the only feasible sampling method available to the Agency.
2(c) Precision Requirements
2(c)i Precision Targets
Because the self-selecting nature of the survey means the findings cannot be generalized to all outdoor educators, formal precision targets are not applicable.
2(c)ii Non-sampling Error
It is expected that the largest non-sampling error will be the result of non-response.
2(d) Questionnaire Design
The certification program questions were derived from those developed in a 2008 study by Glanz et al.a In Glanz et al. (2008), a group of investigators evaluated available questionnaire measures of sun exposure and protection in order to propose a core set of standardized survey items for a range of age groups (adults, adolescents aged 11 to 17, and children 10 years or younger). The investigators used these core questions in cognitive testing and found that they had good clarity and applicability for measuring sun exposure and sun protection behaviors across a broad range of populations. In addition, it was determined that these methods are appropriate for studies tracking morbidity and/or mortality and evaluating prevention program effects.
Based on this study, participants are asked a series of questions throughout the certification program about themselves, their sun protection-related behavior and their experience with the certification program. These questions are meant as a self-assessment tool and must be answered in order to complete the certification program. Participants are encouraged to think about how they can improve their current behavior while answering. The questions are incorporated into each of the five sections of the on-line certification program:
Introduction: provides a brief overview of the certification program.
UV Basics: emphasizes the importance of knowing the facts about UV radiation.
Louder than Words: describes sun safe actions and provides information on sun protection attire and sunscreen.
Do As I Do: provides insight into participants’ ability to influence young people, and offers suggestions for modeling sun safe behaviors.
Before You Go: summarizes information from the certification program, provides links to additional resources, and presents questions about participants’ intentions to follow the suggestions presented in the certification program.
The burden to answer the survey questions included in the certification program is estimated at 7 minutes.
SECTION III – PRETESTS AND PILOT TESTS
No pilot testing of the certification program questions is planned.
SECTION IV – COLLECTION METHODS AND FOLLOW-UP
4(a) Collection Methods
Certification program participants will complete the certification program online through the EPA SunWise Program website, and responses will be automatically submitted electronically upon completion. All certification program participants are anonymous, and no personal information will be stored.
4(b) Survey Response and Follow-Up
Because outdoor educators self-select to complete the certification program, no response rate will be specifically measured, although EPA will keep count of the number of respondents. No follow-up will be performed with outdoor educators that complete the certification program.
SECTION V – ANALYZING AND REPORTING SURVEY RESULTS
5(a) Data Preparation
No personal information will be stored. Responses to questions presented in the certification program will be collected and stored automatically upon completion and will be reviewed by EPA to help determine necessary changes to the certification program and to the broader SunWise Program.
5(b) Analysis
Data obtained through the certification program will help EPA gain a better understanding of adjustments to make to the SunWise certification program to better educate outdoor recreation staff. In addition, responses will give EPA an idea of the types of sun safety policies implemented in outdoor education programs, their effectiveness, and possible areas of improvement to increase sun protection.
Descriptive statistical analyses will be used to summarize the responses collected and organize them in a logical manner. Data will be aggregated to calculate, for example, the percentage of participating outdoor educators who practice a certain sun safety behavior, or the mean number of times outdoor educators remind teens and pre-teens in their care about sun protection, along with 95% confidence intervals. Data will also be parsed by demographic or other categorical variables in order to compare, for example, the percentages of male versus female outdoor educators who remind children to protect themselves from the sun. Differences in percentages will be reported together with 95 percent confidence intervals and statistical tests to determine whether the differences are statistically significant.
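As a sketch of the kind of summary described here, the following Python example computes a proportion with its 95% confidence interval and a male-versus-female comparison; the file and column names are illustrative assumptions, not the actual data export:

# Descriptive summaries of certification responses: a proportion with a 95%
# confidence interval, and a male-vs-female comparison of that proportion.
import pandas as pd
from statsmodels.stats.proportion import proportion_confint, proportions_ztest

df = pd.read_csv("certification_responses.csv")  # hypothetical export

# Share of respondents who report reminding youth about sun protection.
reminds = df["reminds_about_sun"] == "yes"
count, nobs = reminds.sum(), len(df)
low, high = proportion_confint(count, nobs, alpha=0.05, method="wilson")
print(f"Reminds youth: {count / nobs:.1%} (95% CI {low:.1%} to {high:.1%})")

# Compare the percentage of male vs. female educators who remind youth.
counts = [reminds[df["gender"] == "male"].sum(),
          reminds[df["gender"] == "female"].sum()]
nobs_by_group = [(df["gender"] == "male").sum(),
                 (df["gender"] == "female").sum()]
z_stat, p_value = proportions_ztest(counts, nobs_by_group)
print(f"Male vs. female difference: z = {z_stat:.2f}, p = {p_value:.4f}")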
5(c) Reporting Results
Information from certification program responses will serve to inform EPA of program areas needing improvement to better reach the target audience, as well as increase sun safety awareness and practices in informal and outdoor education settings.
Raw survey data will be kept confidential. EPA plans to share aggregated findings with partners and the public to improve sun safety education.
a Glanz K, Yaroch AL, Dancel M, Saraiya M, Crane LA, Buller DB, Manne S, O’Riordan DL, Heckman CJ, Hay J, Robinson JK. Measures of Sun Exposure and Sun Protection Practices for Behavioral and Epidemiologic Research. Archives of Dermatology. 2008;144(2):217-222.