The SunWise Program

OMB: 2060-0439

April 2014




The SunWise Program

ICR # 1904.08




















U.S. Environmental Protection Agency

Office of Air and Radiation

Part A of the Supporting Statement


1. Identification of the Information Collection


1(a) Title of Information Request


The title of this Information Collection Request (ICR) is The SunWise Program (ICR# 1904.08; OMB Control Number 2060-0439).


1(b) Short Characterization/Abstract


The SunWise Program was initiated in 1998 through a statutory mandate under Title VI of the Clean Air Act. The long-term objective of the SunWise Program is to reduce the incidence of, and morbidity and mortality from, skin cancer, cataracts, and other UV-related health effects in the United States. Short-term objectives include: 1) reducing the risk of childhood overexposure to the sun by changing the knowledge, attitudes, and behaviors of elementary school children and their caregivers; and 2) improving the availability of accurate, timely, and useful UV data directly to schools and communities across the United States.


The SunWise Program builds on traditional health education practices through the use of existing curricula, learning standards, and evaluation mechanisms. The Program is a collaborative effort of schools, communities, health professionals, educators, environmental organizations, meteorologists, local governments, and federal agencies.


Participating schools sponsor classroom and school-wide activities to raise children’s awareness of stratospheric ozone depletion, UV radiation, the largely preventable health risks from overexposure to the sun, and simple sun safety practices. Educators interested in participating are asked to register using either the online form (www.epa.gov/sunwise/becoming.html and www.epa.gov/sunwise/becoming_partner.html) or a hard copy version distributed by EPA. EPA uses the information to maintain a database of participating schools and organizations and a mailing list for information distribution. Participating schools and organizations receive a classroom “Tool Kit” of games, songs, puzzles, storybooks, videos, and access to online UV intensity mapping/graphing tools. The Tool Kit also includes sample sun safety policies and guidelines to help expand the sun safety message school-wide.


Teachers are asked to complete a survey at the end of program implementation. Results are used to fine-tune existing SunWise materials and develop new ones. Teachers are also asked if they are interested in administering a brief survey to their students before and after program implementation. The surveys are available on the SunWise website for interested teachers. Student survey responses are anonymous. The results from the student surveys are used to gauge program effectiveness and guide materials development.


SunWise is also seeking qualitative information regarding barriers to the promotion of SunWise and the adoption of school policies, as well as teachers’ receptiveness to a new recognition program SunWise is considering developing. The program would recognize levels of SunWise teaching for interested educators: the more SunWise teaching and sun safety policy changes an educator implements, the higher his or her level of SunWise recognition (similar to bronze, silver, and gold levels). This recognition/incentives program would allow EPA to better gauge the level of engagement with SunWise, and would allow for future effectiveness studies comparing impacts on student knowledge, attitudes, and behavior with an educator’s level of engagement. To gather qualitative information on these issues, the SunWise Program will conduct voluntary, individual, semi-structured telephone interviews with willing educators.


In addition, EPA has teamed with other members of the National Council on Skin Cancer Prevention, which include the American Cancer Society, the American Academy of Pediatrics, and the Centers for Disease Control and Prevention, to support a national day of sun safety, the Don’t Fry Day (DFD) campaign. As part of this campaign, educators are asked to pledge to incorporate sun safety into their spring and summer activities. The DFD pledge is available at: http://www.epa.gov/sunwise/dfdpledge.html.


Further, SunWise has developed an online interactive SunWise Sun Safety Certification Program that enables students, adults, organizations, and employers to develop credentials on sun safety awareness and behaviors. In order to gauge the certification program’s effectiveness, EPA will collect information on the demographics, knowledge, attitudes, intended behavior, and behavior of tutorial users. User types include outdoor recreation staff (e.g., at camps, parks, recreation programs, and sports organizations) and lifeguards. The certification program is online at: http://www.epa.gov/sunwise/tutorial.html.


Finally, EPA will pretest a survey for SunWise non-school partners. While schools are the primary programmatic component of SunWise, SunWise is also promoted through registered 501(c)(3) organizations such as science centers and camps, children's museums, and scouting groups, as well as other not-for-profit organizations like local, county and state health, recreation and education departments. The partner survey will aim to understand how the SunWise Program is being implemented by non-school partners, and how it can be improved.


PREVIOUS TERMS OF CLEARANCE:


On June 23, 2011, OMB approved ICR #1904.08 with Terms of Clearance stating that the survey results are not generalizable to the larger population.


2. Need for Use of the Collection


2(a) Need/Authority For The Collection


This collection will be used for both program material distribution and for determining program effectiveness and participant satisfaction.


Educators will be asked to fill out a registration form, which EPA uses to mail out the program materials and keep track of the Program’s:


  • geographic reach (Which states/regions have SunWise schools?);

  • grade-level and subject-matter distribution (How many 1st grade teachers are using SunWise? How many science teachers are using SunWise? etc.); and

  • student participation (How many students is SunWise reaching?).


The surveys to be administered include:


  • Teacher online survey measuring educators’ receptivity to the educational component of the Program and their experience with the SunWise Tool Kit and educational resources;

  • Student survey identifying sun safety knowledge, attitudes, and behaviors among students before and after participation in the Program;

  • One-on-one semi-structured telephone interviews with teachers to complement the information collected through the self-reported Teacher Survey instrument;

  • Don’t Fry Day pledge questions related to demographic and mailing information and intent to incorporate sun safety activities into spring and summer teaching;

  • Embedded questions within the SunWise Sun Safety Certification Program measuring receptivity to sun protection, demographic information, and current practices, attitudes, and knowledge; and

  • Pretest of the partner survey, to develop a survey measuring partners’ receptivity to and use of the SunWise Tool Kit and other educational resources.


The data will be analyzed, and the results will indicate the Program’s effect on participants’ sun safety knowledge, attitudes, and behaviors. Additionally, the results will help SunWise understand how the Program is being implemented and how it can be improved. Responses are voluntary. Information remains confidential, and responses to the student survey are anonymous.


The SunWise Program recognizes the challenge of measuring the progress and evaluating the effectiveness of an environmental and public health education program whose goal is reducing risk and improving public health. Therefore, continual evaluation of program effectiveness, through survey data, tracking and monitoring of classroom activities and school policies, and expert input, is necessary to monitor progress and refine the program.


2(b) Practical Utility/Users of the Data


EPA/SunWise will use the registration forms to track program reach and effectiveness.


The survey results will provide insight into the program’s messages, approaches, and materials. Survey results will also enable SunWise to better meet the needs of its educator and student participants, with the long range goal of reducing the incidence and effects of skin cancer and other UV-related health problems among children and adults.


3. Nonduplication, Consultations, and Other Collection Criteria

3(a) Nonduplication


The SunWise Program information that is collected is not duplicative of information otherwise available to EPA.


In addition, there is no other sun safety certification available to outdoor recreation staff. Therefore, it is essential that information on the users and use of the certification program be collected for program refinement.


Conducting timely process evaluation is also important if the SunWise Program is to continue providing high-quality, pertinent resources for educators.


3(b) Public Notice Required Prior to ICR Submission to OMB


Official notice of this proposed collection appeared in the Federal Register on 12/31/13. No comments were received.


3(c) Consultations


In April 2014, EPA consulted nine teachers with past experience implementing the SunWise program. The teachers were of the unanimous opinion that our burden hour estimates correspond to their experiences with SunWise.


3(d) Effects of Less Frequent Collection


SunWise depends on registration information to:

  • maintain an accurate list of participants; and

  • ensure timely distribution of program materials and program updates.

SunWise depends on survey responses to:


  • guide program development;

  • measure participant satisfaction;

  • test new ideas for recognition and incentives; and

  • elicit information on attitudes and practices of children and their caregivers relating to sun exposure.


SunWise depends on certification program information to:


  • determine the current knowledge, attitudes, and behaviors as well as behavioral intentions;

  • determine which tutorial to provide;

  • measure how many and what type of users are becoming certified;

  • ensure the tutorial does not take too long to complete; and

  • determine whether the tutorial is delivering information in an easy-to-understand manner.


Conducting the surveys and information collection less frequently will slow down the Program’s ability to institute participants’ desired changes.


3(e) General Guidelines


The EPA SunWise Program will adhere to all OMB guidelines.


3(f) Confidentiality

Names of participating schools and organizations may be made public. Names of registered educators and other participating individuals will remain confidential, as will responses to the collection of survey information. Student surveys are completed anonymously. Certification program users will be asked to provide their first and last names so they can be given a certificate of completion with their name on it; however, the names will not be saved by EPA.


3(g) Sensitive Questions


The survey instruments contain no sensitive questions.


4. The Respondents and the Information Requested


4(a) Respondent/SIC and NAICS Codes


Entities affected by this action are elementary, middle, and high school students and educators (SIC Div. I: Group 8211, NAICS code: 61111), as well as recreation workers (NAICS code: 813400), health educators (NAICS code: 999300), and preschool teachers (NAICS: 624400).

4(b) Information Requested


The registration form (Attachment 1a and 1b, also available at www.epa.gov/sunwise/becoming.html and www.epa.gov/sunwise/becoming_partner.html) is a 10-minute questionnaire that asks teachers to provide: the name and address of the school; school grade levels; and what information is of interest. The form ensures that EPA distributes relevant education materials.


The survey instruments covered under this ICR are:


  • Teacher Survey (Attachment 2): Educators will be asked to evaluate their own and their students’ receptivity to sun safety resources provided by EPA. Additionally, educators will be asked about how they implemented the SunWise program in their school/classroom and how many students they reached. Finally, educators will be asked about areas for program growth, including their receptivity to new resources and an updated recognition program. Educator feedback about the usefulness of classroom and school materials will be vital to the refinement of program materials. Part B(i) of the Supporting Statement provides additional information on the teacher survey design.

  • Student Survey (Attachment 3a and 3b): This survey will be administered by teachers to students before and after implementation of SunWise activities. Pre-test and post-test surveys are similar, with the exception of one question in the post-test aimed at verifying that the student has participated in SunWise. The 10-minute questionnaire elicits basic information on knowledge, attitudes, and practices of children relating to sun exposure. The survey is identical to that previously approved by OMB (Control No. 2060-0439). Part B(i) of the Supporting Statement provides information on the student survey design.

  • Teacher Telephone Interviews (Attachment 4): To complement the information collected through the self-reported Teacher Survey instrument, some teachers will be asked to participate in one-on-one, semi-structured telephone interviews to provide qualitative information regarding barriers to the promotion of SunWise and adoption of school policies, as well as teachers’ receptiveness to a new recognition program that SunWise is considering developing. An interview guide with topics for discussion is provided in Attachment 4. Part B(ii) of the Supporting Statement provides additional information on the teacher interview design.

  • SunWise Don’t Fry Day pledge: Embedded questions in the pledge collect demographic information and intent to incorporate sun safety activities into spring and summer teaching. This information can be submitted online, and the pledge is available at: http://www.epa.gov/sunwise/dfdpledge.html.

  • SunWise Sun Safety Certification Tutorial Questions: Certification program users will be asked to provide their first and last name so they can be given a certificate of completion. Names will not be collected by EPA. Additionally, users will be asked questions to determine their current sun protection knowledge, attitudes and behaviors. The questions will also help educate the user by reminding them of their own behavior in comparison to the desired behavior (practicing sun safety). Part B(iii) of the Supporting Statement provides information on the certification program survey design.

  • Pretest of the Partner Survey (Attachment 5): Selected partners will be asked to pretest a survey to measure partners’ receptivity to the SunWise Tool Kit and other educational resources. Part B(iv) of the Supporting Statement provides information on the pretest partner survey design.


Registration forms can be submitted electronically or in hard copy form using envelopes provided by EPA. The teacher survey is available electronically. Teachers will be given the option to return student surveys either by email, fax, or postage-paid envelopes provided by EPA. Neither the registration nor the surveys require that respondents keep records or maintain files.

5. The Information Collected


5(a) Agency Activities

The Agency activities associated with registration of participants consist of the following:


  • Maintain participant database; and

  • Maintain mailing list for information distribution


The Agency activities associated with surveying done through the SunWise Program consist of the following:


  • Develop collection instruments;

  • Answer respondent questions;

  • Conduct individual telephone interviews with teachers;

  • Review data submissions;

  • Reformat the data;

  • Analyze the data and make program adjustments as needed;

  • Store the data.


The Agency activities associated with the certification program done through the SunWise Program consist of the following:


  • Store and consolidate data, none of which is sensitive or personal; and

  • Review consolidated data and make adjustments as needed.


5(b) Collection Methodology and Management


In collecting and analyzing the information associated with this ICR, EPA will use electronic and hard-copy registration forms, electronic and hard-copy surveys, and telephone interviews.


Further details on the collection methodology and management for the surveying done through the SunWise Program are provided below.


Registration


EPA routinely promotes the SunWise Program through presentations and exhibits at meetings of nurses, teachers, and other educators. Registrants provide their name and contact information, including the name of their school, and state whether they are a classroom teacher, health teacher, gym teacher, or school nurse on paper copy registration forms. This information is entered into a registration and tracking system housed on EPA servers. In addition to the paper registrations, EPA registers educators through an online registration page on EPA’s SunWise program website. The system is registered with the Automated System Security Evaluation and Remediation Tracking (ASSERT) program to meet reporting requirements under the Federal Information Security Management Act (FISMA).


The data are used to send registrants SunWise resources and to alert them to sun safety-related opportunities and new resources. No personally identifiable information is shared outside of EPA and its contractors and grantees.


Teacher and Student Surveys


Teacher Surveys are conducted to determine:


  • Students’ satisfaction with SunWise activities and resources;

  • Teachers’ satisfaction with SunWise activities and resources;

  • How and how often teachers are using the SunWise materials, resources and programming;

  • How many students are receiving SunWise education;

  • If teachers are sharing resources with other teachers;

  • If school policies are being changed as a result of SunWise;

  • If teachers are changing their own behavior;

  • If students are changing their behavior;

  • If teachers have suggestions for improving or creating new SunWise resources.


EPA will send a recruitment email in the Spring/Summer each year encouraging all registered participants to take the SunWise Teacher Survey (Attachment 2) hosted online. Participants may also be recruited through additional avenues, such as recruitment letters distributed through the SunWise Tool Kit, educator conferences, or direct mailings. Since this survey will be voluntary and self-selecting, it will not be generalizable to the entire pool of registered SunWise teachers. However, it will be informative and provide insight into how some teachers utilize SunWise materials and how EPA can encourage higher use rates.


Part of the Teacher Survey will be an optional student pre-test and post-test survey using the validated student survey instrument from the previous ICR period (Attachment 3a and 3b). There will be no control group for this portion of the survey, and no generalizations will be made from these data. The surveys will serve as a useful way to see whether students are still getting the same benefit from the SunWise Tool Kit as was found in earlier quasi-experimental study designs testing the same concepts (see the previous ICR supporting statement for more details). Again, because this portion of the survey will be voluntary and self-selecting, the results will not be generalizable to every student that has received a SunWise education.


Teachers who choose to participate will provide children with a double-sided, one-page anonymous survey instrument. After students complete the pre-test in the spring, teachers will lead the SunWise lessons. SunWise recommends that participating teachers administer the post-test survey at least one month after teaching the SunWise lessons, and will ask teachers to report the time gap between the SunWise lessons and administration of the survey.


All student surveys are anonymous.


Teachers will be instructed to return completed student surveys to EPA by scanning and emailing the surveys, faxing the surveys, or by requesting a self-addressed, stamped envelope from EPA.


EPA will ensure the accuracy and completeness of collected information by having all surveys reviewed by a contractor, grantee, or EPA staff. The data collected from the surveys will serve to provide information internally to help improve the SunWise Program. Since the results are not intended to be generalizable, no statistical approach is needed.


Part B(i) of the Supporting Statement presents more detailed information on the data collection, management, and analysis methods for the teacher and student surveys.


Teacher Individual Interviews


To supplement the self-reported Teacher Survey instrument, individual interviews will be conducted with selected teachers to gather richer, qualitative information regarding:


  • Teachers’ involvement in the SunWise Program;

  • If school policies are being changed as a result of SunWise;

  • Which SunWise activities and resources are most effective;

  • How teachers’ approach to teaching SunWise activities has changed over time;

  • How the SunWise Program can more effectively disseminate its materials and recruit more teachers;

  • How the SunWise Program can encourage teachers to increase their involvement in SunWise and their promotion of sun safety in schools; and

  • Receptiveness to an incentives or “Levels of SunWise” educator recognition program, and ideas for making such a program successful.


Interview participants will be recruited via a screening email to all registered SunWise educators asking (a) whether they have taught SunWise in the past two years; (b) how many years they have been teaching SunWise; and (c) whether they are willing to both complete an online teacher survey and participate in a one-on-one telephone interview. Teachers responding positively to both screening questions (a) and (c) will be grouped by region and length of participation, and across these groups, 50 teachers will be randomly selected to participate in the interview process.
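The grouping and selection step described above amounts to a roughly proportional stratified random draw. As an illustration only (the data fields, pool size, tenure cutoff, and allocation rule below are hypothetical, not part of the official method):

```python
import random
from collections import defaultdict

random.seed(0)  # reproducible illustration

# Hypothetical pool: teachers who answered "yes" to screening questions (a) and (c).
eligible = [
    {"id": i,
     "region": random.choice(["Northeast", "Southeast", "Midwest", "West"]),
     "years_teaching_sunwise": random.randint(1, 10)}
    for i in range(200)
]

# Group by region and length of participation (here: short vs. long tenure).
groups = defaultdict(list)
for t in eligible:
    tenure = "long" if t["years_teaching_sunwise"] >= 5 else "short"
    groups[(t["region"], tenure)].append(t)

# Draw roughly proportionally from each group until ~50 interviewees are chosen.
TARGET = 50
selected = []
for members in groups.values():
    k = max(1, round(TARGET * len(members) / len(eligible)))
    selected.extend(random.sample(members, min(k, len(members))))
selected = selected[:TARGET]  # trim any rounding overshoot
print(len(selected), "teachers selected for interviews")
```

A real selection would work from the actual registration database, and proportional rounding can leave the draw slightly short of 50, in which case remaining slots would be topped up at random.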


To the extent possible, the selected teachers will represent the geographical and participation range of SunWise, though the sample will not necessarily be representative in a statistical sense. The teachers that are not selected will still be encouraged to take the online survey, but will not be part of the group that will be individually interviewed. After the selected participants have taken the online survey, EPA will set up a convenient time to interview each of the selected teachers.


Teachers will participate in one online survey and one telephone interview per year over a three year period, with slightly different informational goals for each year. In the first year, the interview will include discussion about the development of an educator recognition or incentives program, while interviews in later years may focus on other areas of interest, such as parental involvement.


Since participation is both voluntary and self-selecting, the results of this qualitative study will not be generalizable to the entire pool of registered SunWise teachers. The data collected from the interviews will serve only to provide information internally to help improve the SunWise Program and the development of a new recognition program.


Part B(ii) of the Supporting Statement presents detailed information on the data collection, management, and analysis methods for the teacher interviews.


SunWise Don’t Fry Day Pledge


EPA will collect information as educators complete the pledge online. Prior to Don’t Fry Day each year (the Friday before Memorial Day), EPA will review the information collected and summarize participation for promotional efforts. Additionally, EPA will mail a poster and stickers to all educators taking the pledge. The pledge is available at: http://www.epa.gov/sunwise/dfdpledge.html.


Certification Tutorial


EPA will collect information as participants take the tutorial. Many of the questions will help instill the information they are learning through the tutorial. Part B(iii) of the Supporting Statement presents more detailed information on the data collection, management, and analysis methods for the tutorial/certification program.


EPA plans to periodically review data collected from the certification program/tutorial and make refinements to the program as necessary. The knowledge gained through this information collection will inform programmatic decisions and allow EPA to gain a better understanding of the target audience (to determine if additional intervention is needed in the outdoor recreation setting). While results cannot be generalized to the general outdoor recreation staff population due to the limitation of self-selection, the information will be informative and will be shared with partners and the public for improved tailoring of interventions to the outdoor recreation audience.


Pretest Partner Survey


EPA will undertake pretesting of a survey for non-school partners participating in the SunWise Program. These partners may include state and local health departments, childcare centers, museums, camps, and science centers. The purpose of the survey will be to better understand how non-school partners are interacting with the SunWise Program, as well as to determine:


  • How and how often partners are using the SunWise materials, resources and programming;

  • How many children are receiving SunWise education through non-school partners;

  • Children’s satisfaction with SunWise activities and resources;

  • Partners’ satisfaction with SunWise activities and resources;

  • If partners are sharing resources with other partners;

  • If partner organizations’ sun safety policies are being changed as a result of SunWise;

  • If partners have suggestions for improving or creating new SunWise resources.


The pretesting is intended to determine the validity and effectiveness of the survey questions—e.g., whether questions measure what they are supposed to measure, whether partners understand the questions, and whether the questions are the right questions for gaining an understanding of how partners are interacting with the SunWise Program.


EPA will send a recruitment email in the Spring/Summer encouraging registered partners to participate in the pretesting of the partner survey (Attachment 5). Participants may also be recruited through additional avenues, such as recruitment letters distributed through the SunWise Tool Kit, educator conferences, or direct mailings. For partners willing to participate, EPA will sort the partners into types (e.g., health departments, childcare centers, camps, and educational centers such as museums or science centers) and randomly select participants from each group, for a total of 30 participants.


Depending on available resources and other constraints, the survey may be self-administered with feedback gathered from each participant over the telephone, or the survey may be administered in-person either in an individual or group setting, with feedback gathered through in-person interviews. In either case, participants will be asked if they understood all questions and whether there were questions they would suggest removing or adding to better reflect the participation of partners in the Program. Based on this feedback, EPA will revise the partner survey.


Part B(iv) of the Supporting Statement presents detailed information on the data collection, management, and analysis methods for pretesting the partner survey.

5(c) Small Entity Flexibility


Not applicable.


5(d) Collection Schedule


Registration: All teachers are required to register for the Program to receive the SunWise Tool Kit (www.epa.gov/sunwise/becoming.html and www.epa.gov/sunwise/becoming_partner.html) and regular program updates.


Teacher and Student Surveys: All program participants are invited to take the Teacher Survey at any time during the year. As noted above, recruitment emails will be sent in the Spring/Summer to all registrants encouraging them to take the Teacher Survey, but the survey is optional.


Teachers opting for their classrooms to participate in the student pre-test and post-test surveys will be asked to administer the pre-tests before teaching the SunWise lessons, and then to administer the post-test survey at least one month after teaching the SunWise lessons.


Teacher Interviews: Participants (as selected using the method described in Part B(ii) of the Supporting Statement) will complete one online survey and one telephone interview per year over the three-year period.


Don’t Fry Day Pledge: All educators are invited to take the DFD pledge at any time during the year. A recruitment email will be sent in the Spring to all registrants encouraging them to take the pledge, but participation is optional. Participants may also be recruited at educator conferences.


Certification Tutorial: All outdoor educators are invited to take the certification tutorial at any time during the year.


Partner Survey Pretesting: The partner survey will be pretested once in the three-year ICR period.


6. Estimating the Burden and Cost of the Collection


6(a) Estimating Respondent Burden


Registration: EPA developed the SunWise Program Registration Form with the Agency’s Internet Support Team in Research Triangle Park, North Carolina. Input from a five-person focus group was used to determine average completion time. Teachers are asked to complete the registration form only once during their participation in the program for a total registrant burden of 10 minutes.


Annual estimated respondent burden:


Annual Respondent Burden – Registration

  Registrant Group      Hour Burden
  Educator              0.17


Teacher and Student Surveys: During the development of the teacher survey, EPA, in consultation with a contractor and fewer than nine educators, reviewed the teacher survey for content and completion time. The teacher survey is administered one time each year and takes approximately 20 minutes to complete. If the teacher decides to conduct the student pre- and post-test surveys, additional burden will be incurred.


During the development phase of the student surveys, EPA, in consultation with a contractor, pretested the survey with 9 children to determine appropriate content and survey completion time. The student survey will be administered once in years 1 and 3 (i.e., pre-test for Group A and post-test for Group B) and twice in year 2 (i.e., post-test from Group A and pre-test for Group B). Each survey will take approximately 10 minutes to complete, for an annual per student burden of 10 minutes.


Annual estimated respondent burden:


Annual Respondent Burden – Surveys

  Survey Group                     Hour Burden
  Student                          0.17
  Educator – No student survey     0.33
  Educator – Yes student survey    0.50


Teacher Interviews: Selected teachers will both complete the online teacher survey (estimated at 20 minutes, as discussed above) and participate in a 30-minute interview with EPA and/or a contractor, for a total annual per-teacher burden of 50 minutes.


Annual estimated respondent burden:


Annual Respondent Burden – Interviews/Surveys

  Survey Group                     Hour Burden
  Educator                         0.83



Don’t Fry Day Pledge: Educators will be invited to take the Don’t Fry Day pledge at any time throughout the year. The pledge can be completed online and requires participants to fill out their name, address, school, and commitment to sun protection. The total respondent burden is estimated to be 5 minutes per year.


Annual estimated respondent burden:


Annual Respondent Burden – Pledge

  Survey Group                     Hour Burden
  Educator                         0.08



Certification Tutorial: Users will be asked a series of questions about sun protection to determine their demographics (no personal identifiers will be captured), knowledge, attitudes, behavior, perception of others they work with, and the environmental conditions in the place where they work. They will also be asked to enter their first and last names for a printable certificate. This information will not be saved by EPA. The total respondent burden is estimated to be 7 minutes.


Annual estimated respondent burden:


Annual Respondent Burden – Tutorial/Certificate Questions

  Survey Group                     Hour Burden
  Student                          0.12
  Outdoor Educator                 0.12



Partner Survey Pretesting: To pretest a new survey for SunWise non-school partners, selected partners will both complete the survey (estimated at 20 minutes, based on the survey’s similarity to the teacher survey described above) and participate in a 30-minute interview with EPA and/or a contractor to discuss the survey and ways to improve it. The total respondent burden is thus estimated at 50 minutes.

Annual estimated respondent burden:


Annual Respondent Burden – Partner Survey

  Survey Group                     Hour Burden
  Non-school Partner               0.83


6(b) Estimating Respondent Costs


Bureau of Labor Statistics figures1 were used to determine labor costs for these tables. To account for benefits and overhead, the average hourly wage rates of $38.82 for a teacher and $12.22 for a recreation worker (i.e., outdoor educator) were each increased by 110 percent (i.e., multiplied by 2.1), yielding labor costs of $81.52 per hour for teachers and $25.66 per hour for outdoor educators.


For partners, the hourly labor cost was based on the average of the average hourly wages of $25.53 for a health educator and $14.79 for a preschool teacher, since these occupations are considered typical of SunWise’s non-school partners. This averaged hourly wage of $20.16 was likewise increased by 110 percent to account for benefits and overhead, yielding a labor cost of $42.34 per hour for partners.
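As a minimal sketch, the loaded-rate arithmetic described above can be reproduced as follows; the base wages and the "increased by 110%" rule (i.e., multiply by 2.1) are taken directly from the text.

```python
# Sketch of the loaded-rate arithmetic; base wages and the 110% overhead
# factor are taken from the text above.

def loaded_rate(hourly_wage, overhead=1.10):
    """Add benefits and overhead to a base hourly wage."""
    return round(hourly_wage * (1 + overhead), 2)

teacher = loaded_rate(38.82)           # $81.52/hour
outdoor_educator = loaded_rate(12.22)  # $25.66/hour

# Partners: average of the health educator and preschool teacher wages.
partner_base = round((25.53 + 14.79) / 2, 2)  # $20.16
partner = loaded_rate(partner_base)           # $42.34/hour
```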


Annual Respondent Burden and Cost – Registration

  Registrant Group    # of responses per participant    Hour Burden    Labor Cost
  Educator            1                                 0.17           1 * 0.17 * $81.52 = $13.86



Annual Respondent Burden and Cost – Teacher and Student Surveys

  Survey Group                     # of responses per participant    Hour Burden    Labor Cost
  Student                          1                                 0.17           1 * 0.17 * $0 = $0
  Educator – No student survey     1                                 0.33           1 * 0.33 * $81.52 = $27.17
  Educator – Yes student survey    1                                 0.50           1 * 0.5 * $81.52 = $40.76



Annual Respondent Burden and Cost – Teacher Interviews/Surveys

  Survey Group        # of responses per participant    Hour Burden    Labor Cost
  Educator            1                                 0.83           1 * 0.83 * $81.52 = $67.94



Annual Respondent Burden and Cost – Don’t Fry Day Pledge

  Survey Group        # of responses per participant    Hour Burden    Labor Cost
  Educator            1                                 0.08           1 * 0.08 * $81.52 = $6.79



Annual Respondent Burden and Cost – Tutorial/Certificate Questions

  Survey Group        # of responses per participant    Hour Burden    Labor Cost
  Student             1                                 0.12           1 * 0.12 * $0 = $0
  Outdoor Educator    1                                 0.12           1 * 0.12 * $25.66 = $3.08



Annual Respondent Burden and Cost – Partner Survey Pretesting

  Survey Group          # of responses per participant    Hour Burden    Labor Cost
  Non-school Partner    1                                 0.83           1 * 0.83 * $42.34 = $35.28


The respondents will have no capital/startup or O&M costs.
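As a minimal sketch, the per-respondent cost lines in the tables above follow a single formula: cost = responses per participant × hour burden × loaded hourly rate. Note that the dollar figures generally carry the unrounded minute fractions (e.g., 5/60 of an hour for the pledge), even where the burden column shows a rounded value such as 0.08.

```python
# Sketch of the per-respondent cost formula used in the tables above.
# The educator rate is the loaded teacher rate from section 6(b).

def respondent_cost(responses, hours, rate):
    return round(responses * hours * rate, 2)

EDUCATOR_RATE = 81.52  # loaded teacher rate, section 6(b)

survey_with_students = respondent_cost(1, 30 / 60, EDUCATOR_RATE)  # $40.76
pledge = respondent_cost(1, 5 / 60, EDUCATOR_RATE)                 # $6.79
```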


6(c) Estimating Agency Burden and Cost


Registration: Registration information is collected primarily through a website database feature. The start-up cost of designing the registration web page was approximately $25,000, but that cost was incurred prior to previous ICRs. Maintenance of the website is estimated to involve three types of staff: EPA personnel; grantees through the Senior Environmental Employee (SEE) Program; and contractor staff costing $130 per hour. EPA personnel will spend 4 hours per month, or 48 hours per year. The cost of this labor is calculated based on a GS-13 Step 1 pay level in Washington, DC ($68.26/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6), making the total annual cost $3,276.29. The contractor will spend 240 hours per year on the maintenance and enhancement of the registration and tracking system, at an annual cost of $31,200.


Finally, EPA will manually enter all information received via hard-copy registration forms into the database. This labor is estimated at 2,000 hours per year at a SEE Program pay level of $40,000 annually.


Agency Burden and Costs – Registration

                      Burden Hours    Total Costs ($)
  EPA (Annual)        2,288           $74,476.29
  EPA (3-Year ICR)    6,864           $223,428.86
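As a minimal sketch, the annual agency totals for registration can be checked by summing the three staff components described above (the GS-13 labor cost of $3,276.29 is taken as given from the text).

```python
# Sketch checking the annual agency totals for registration, using the
# hour and cost figures stated in the text.

epa_hours, contractor_hours, see_hours = 48, 240, 2000
epa_cost = 3276.29                        # 48 hours of GS-13 labor, per the text
contractor_cost = contractor_hours * 130  # $130/hour contractor rate
see_cost = 40000                          # SEE Program annual pay level

annual_hours = epa_hours + contractor_hours + see_hours  # 2,288
annual_cost = epa_cost + contractor_cost + see_cost      # $74,476.29
```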


Teacher and Student Surveys: The contractor assists EPA in data collection and analysis and provided technical support in the development of the surveys. To perform these functions, EPA will contract for a total of 150 professional hours per year; at an average rate of $130.00 per hour, the total contractor cost is about $19,500 annually. Agency burden to manage this contract is estimated at 4 hours per month, or 48 hours annually. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($68.26/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6). The total of 48 hours amounts to an agency labor cost of $3,276.29 per annum.


Agency Burden and Costs – Teacher and Student Surveying

                      Burden Hours    Total Costs ($)
  EPA (Annual)        198             $22,776.29
  EPA (3-Year ICR)    594             $68,328.86

Teacher Interviews: The contractor assists EPA in data collection and analysis and provides technical support in the development and deployment of the surveys and interview questions. To perform these functions, EPA will contract for a total of 150 professional hours per year; at an average rate of $130.00 per hour, the total contractor cost is about $19,500 annually. Agency burden to manage this contract is estimated at 4 hours per month, or 48 hours annually, and agency burden associated with coordinating and participating in interviews is estimated at 50 hours per year. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($68.26/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6). The total of 98 hours amounts to an agency labor cost of $6,689.09 per annum.


Agency Burden and Costs – Teacher Interviewing/Surveying

                      Burden Hours    Total Costs ($)
  EPA (Annual)        248             $26,189.09
  EPA (3-Year ICR)    744             $78,567.26


Don’t Fry Day Pledge: To perform the data collection and analysis function, agency burden is estimated at 2 hours per month, or 24 hours annually. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($68.26/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6). The total of 24 hours amounts to an agency labor cost of $1,638.14 per annum.


Agency Burden and Costs – Don’t Fry Day Pledge

                      Burden Hours    Total Costs ($)
  EPA (Annual)        24              $1,638.14
  EPA (3-Year ICR)    72              $4,914.43


Certification Tutorial: The contractor will maintain the tutorial, including the data collection component, and will analyze the data every other year (i.e., during years 1 and 3 of the ICR). To perform this task, EPA has contracted for a total of 200 professional hours, 100 hours for each year of analysis. In addition, EPA has contracted for a total of 12 hours each year for maintenance. At an average rate of $130.00 per hour, the total contractor cost is $13,000 per year for data collection in years 1 and 3, plus $1,560 per year for maintenance. Agency burden to manage this contract is estimated at 4 hours per month, or 48 hours annually. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($68.26/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6).


Agency Burden and Costs – Tutorial/Certification

                                 Burden Hours    Total Costs ($)
  EPA (Annual) – Years 1 and 3   160             $17,836.29
  EPA (Annual) – Year 2          60              $4,836.29
  EPA (3-Year ICR)               380             $40,508.86
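As a minimal sketch, the year-by-year tutorial/certification figures above can be reproduced from the contractor rate and GS-13 labor cost stated in the text.

```python
# Sketch reproducing the tutorial/certification agency figures, using
# the contractor rate and GS-13 labor cost stated in the text.

GS13_ANNUAL = 3276.29  # 48 hours of GS-13 labor per year, per the text
RATE = 130             # contractor hourly rate

year_1_or_3_cost = 100 * RATE + 12 * RATE + GS13_ANNUAL  # $17,836.29
year_2_cost = 12 * RATE + GS13_ANNUAL                    # $4,836.29

year_1_or_3_hours = 100 + 12 + 48                        # 160
year_2_hours = 12 + 48                                   # 60
three_year_hours = 2 * year_1_or_3_hours + year_2_hours  # 380
```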


Partner Survey Pretesting: The contractor will assist EPA in the development and deployment of the partner surveys and interviews in the pretesting process. To perform these functions, EPA will contract for a total of 90 professional hours in year 3; at an average rate of $130.00 per hour, the total contractor cost is about $11,700. Agency burden to manage this contract is estimated at 2 hours per month, or 24 hours in year 3, and agency burden associated with coordinating and participating in pretesting interviews is estimated at 30 hours in year 3. The cost of this labor is calculated based on a GS-13 Step 1 pay level ($68.26/hour, using the salary associated with this grade and step multiplied by a benefits factor of 1.6). The total of 54 hours amounts to an agency labor cost of $3,685.82 in year 3.


Agency Burden and Costs – Partner Survey

                            Burden Hours    Total Costs ($)
  EPA (Annual) – Year 3     144             $15,385.82
  EPA (3-Year ICR)          144             $15,385.82

6(d) Estimating the Respondent Universe and Total Burden Costs


Registration

  (A) Number to register    (B) Total Hours    (C) Rate per hour ($)    (D) # of responses    (E) Total Cost, E = B * C
  3,500 Educators           595                $81.52                   3,500                 $48,505.59
  Total (Annual)            595                                         3,500                 $48,505.59
  ICR Total (3 years)       1,785                                       10,500                $145,516.77



Student and Teacher Surveys

  (A) Number to be surveyed              (B) Total Hours    (C) Rate per hour ($)    (D) # of responses    (E) Total Cost, E = B * C
  1,000 Students per year                170                $0.00                    1,000                 $0.00
  1,000 Educators – No student survey    333                $81.52                   1,000                 $27,174.00
  300 Educators – Yes student survey     150                $81.52                   300                   $12,228.30
  Average Total (Annual)                 653                                         2,300                 $39,402.30
  ICR Total (3 years)                    1,960                                       6,900                 $118,206.90


Teacher Interviews and Surveys

  (A) Number to be interviewed    (B) Total Hours    (C) Rate per hour ($)    (D) # of responses    (E) Total Cost, E = B * C
  50 Educators                    42                 $81.52                   50                    $3,396.75
  Total (Annual)                  42                                          50                    $3,396.75
  ICR Total (3 years)             125                                         150                   $10,190.25



Don’t Fry Day Pledge

  (A) Number to pledge      (B) Total Hours    (C) Rate per hour ($)    (D) # of responses    (E) Total Cost, E = B * C
  1,500 Educators           125                $81.52                   1,500                 $10,190.25
  Total (Annual)            125                                         1,500                 $10,190.25
  ICR Total (3 years)       375                                         4,500                 $30,570.75



Tutorial/Certificate

  (A) Number to be surveyed    (B) Total Hours    (C) Rate per hour ($)    (D) # of responses    (E) Total Cost, E = B * C
  100 Students per year        12                 $0.00                    100                   $0.00
  1,500 Outdoor Educators      180                $25.66                   1,500                 $4,619.16
  Average Total (Annual)       192                                         1,600                 $4,619.16
  ICR Total (3 years)          576                                         4,800                 $13,857.48


Partner Surveys

  (A) Number to be surveyed    (B) Total Hours    (C) Rate per hour ($)    (D) # of responses    (E) Total Cost, E = B * C
  30 Partners                  25                 $42.34                   30                    $1,058.40
  Total (Annual) – Year 3      25                                          30                    $1,058.40
  ICR Total (3 years)          25                                          30                    $1,058.40



Total (Registration + Surveys + Tutorial + Interviews + Pledge)

                                 Total Hours    # of responses    Total Cost
  ICR Total (average annual)*    1,632          8,980             $107,172.45
  ICR Total (3 years)            4,846          26,880            $319,400.55

*Represents average annual cost; however, not all activities will occur during all three years of the ICR, as described above.


6(e) Bottom Line Burden Hours and Cost Tables


Bottom Line Burden and Costs (3-Year ICR)

                            Burden Hours    Total Costs ($)
  Students                  546             $0.00
  Educators                 3,735           $304,484.67
  Outdoor Educators         540             $13,857.48
  Non-school Partners       25              $1,058.40
  EPA                       8,798           $431,134.11
  Subtotal (respondents)    4,846           $319,400.55
  Subtotal (government)     8,798           $431,134.11
  Total                     13,644          $750,534.66



Bottom Line Burden and Costs (Average Annual)*

                            Burden Hours    Total Costs ($)
  Students                  182             $0.00
  Educators                 1,245           $101,494.89
  Outdoor Educators         180             $4,619.16
  Partners                  25              $1,058.40
  EPA                       3,062           $158,301.92
  Subtotal (respondents)    1,632           $107,172.45
  Subtotal (government)     3,062           $158,301.92
  Total                     4,694           $265,474.37

*Represents average annual cost; however, not all activities will occur during all three years of the ICR, as described above.


6(f) Reasons for Change in Burden


There is a small change in the average annual burden hours currently identified in the OMB Inventory of Approved ICR Burdens due to adjustments.


6(g) Burden Statement


The annual public reporting and record keeping burden for this collection of information is estimated to average 10 minutes per response for the registration, 10 minutes per response for the student survey, 20 minutes per response for the educator survey without the student survey, 30 minutes per response for the educator survey with the student survey, 50 minutes per response for the teacher interview with survey, 5 minutes per response for the Don’t Fry Day pledge, 7 minutes per response for the certification tutorial program, and 50 minutes per response for pretesting the partner survey. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, disclose, or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information; processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB control numbers for EPA’s regulations are listed in 40 CFR Part 9 and 48 CFR Chapter 15.


To comment on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including the use of automated collection techniques, EPA has established a public docket for this ICR under Docket ID Number EPA-HQ-OAR-2007-0069, which is available for online viewing at www.regulations.gov, or in-person viewing at the Air and Radiation Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Avenue, NW, Washington, D.C. The EPA Docket Center Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays. The telephone number for the Reading Room is (202) 566-1744, and the telephone number for the Air and Radiation Docket is (202) 566-1742. An electronic version of the public docket is available at www.regulations.gov. This site can be used to submit or view public comments, access the index listing of the contents of the public docket, and to access those documents in the public docket that are available electronically. When in the system, select “search,” then key in the Docket ID Number identified above. Also, you can send comments to the Office of Information and Regulatory Affairs, Office of Management and Budget, 725 17th Street, NW, Washington, D.C. 20503, Attention: Desk Officer for EPA. Please include the EPA Docket ID Number EPA-HQ-OAR-2007-0069 and OMB Control Number 2060-0439 in any correspondence.


Part B(i) of the Supporting Statement – Teacher and Student Surveys


SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER

PRELIMINARIES


1(a) Survey Objectives


EPA’s SunWise Program provides sun protection education via a standardized curriculum to school children in grades K-8 in public, parochial, and charter schools. More than 32,000 schools and 3,000,000 children have received SunWise education since the 1999-2000 school year. EPA proposes to conduct customer satisfaction and process-related evaluative surveys with the teachers using the program. The Teacher Survey and Individual Interview will determine:


  • Students’ satisfaction with SunWise activities and resources;

  • Teachers’ satisfaction with SunWise activities and resources;

  • How and how often teachers are using the SunWise materials, resources and programming;

  • How many students are receiving SunWise education;

  • If teachers are sharing resources with other teachers;

  • If school policies are being changed as a result of SunWise;

  • If teachers are changing their own behavior;

  • If students are changing their behavior;

  • If teachers have suggestions for improving or creating new SunWise resources.


The data will be analyzed, and the results, although not generalizable, will indicate how the Program is being implemented and how it can be improved.


The primary objective of the optional student surveys is to see if students are getting the same benefit from the SunWise Tool Kit as in quasi-experimental study designs testing the same concepts (see previous ICR supporting statement for more details). Because this portion of the survey will be voluntary and self-selecting, the results will not be generalizable to every student that has received SunWise education.


1(b) Key Variables


Satisfaction; frequency of use; number of students participating; number and types of activities taught; school policy change; student and teacher knowledge, attitudes and behavior; ways to improve the program.


1(c) Statistical Approach


The primary objective in conducting the SunWise Teacher Survey is to understand how the SunWise program is being implemented and how it can be improved. Since the results are not intended to be generalizable to the complete pool of SunWise teachers, no statistical approach is needed.


The student surveys will serve as a way to see if students are getting the same benefit from the SunWise Tool Kit as in previous years. Because these surveys are optional and self-selecting, and the results are not intended to be generalizable, no statistical approach is needed.


1(d) Feasibility


EPA has reviewed the administrative procedures necessary to conduct the SunWise teacher and student surveys and has determined that it is feasible to continue with the surveys. The Teacher Survey was reviewed by educators and survey specialists to ensure that the questions asked will reveal sufficient information to evaluate the implementation of the SunWise Program and how it could be improved, especially by adding an incentives or “Levels of SunWise” recognition program. The student survey was previously pretested as described in Section III below.


In addition, EPA has funding to conduct the survey and provide the necessary analysis of the resulting data.


SECTION II – SURVEY DESIGN


2(a) Target Population and Coverage


A self-selected sample from all participating SunWise teachers will be used. SunWise teachers are diverse, with some in schools and others in recreation programs and other organizations.


2(b) Sample Design


School faculty and other educators register for the SunWise program through EPA. Registrants provide their name and contact information, including the name of their school/organization, and state whether they are a classroom teacher, health teacher, gym teacher, school nurse, or other. Recruitment emails will be sent to all registered SunWise schools and partners. Participants may also be recruited through additional avenues, such as recruitment letters distributed through the SunWise Tool Kit, educator conferences, or direct mailings. However, many will not participate in the survey.


2(b)ii Sample Size


EPA anticipates sending recruitment emails to more than 35,000 formal and informal educators; however, only about 1,300 are expected to actually participate in the survey. This number is based on previous survey participation.


2(b)iii Stratification Variables


None.


2(b)iv Sampling Method


As noted above, recruitment emails will be sent to all teachers who have registered for the SunWise Program since the program began in 1999. Because participation in the teacher survey is voluntary, the sampling method is voluntary self-selection.


Inclusion criteria: Signed up with the SunWise program.


Exclusion criteria: Incomplete Teacher Survey.


2(b)v Multi-Stage Sampling


None.


2(c) Precision Requirements


2(c)i Precision Targets


N/A


2(c)ii Nonsampling Error


N/A


2(d) Questionnaire Design


The Teacher Survey was derived from SunWise instruments previously approved by OMB on November 2, 2001 (ICR #1904.01) and April 15, 2008 (ICR #1904.04), and is based on the instrument approved in the most recent of those ICRs. The Teacher Survey was updated based on pretesting with nine teachers.


The student survey is derived from a SunWise instrument previously approved by OMB on November 2, 2001, and most recently approved on February 28, 2010.


SECTION III – PRETESTS AND PILOT TESTS


To pretest the revised SunWise Teacher Survey, fewer than nine teachers attending conferences that SunWise attended completed the survey and then participated in an interview with staff from EPA. The pretesting focused on the readability and understandability of the Teacher Survey. Teachers had no suggestions for revisions to the Teacher Survey. The survey was also time-tested to ensure completion in 20 minutes or less.


The pretesting of the student survey was conducted under the previous ICR (1904.04). It focused on the readability and understandability of the questions and possible responses; following the pretest, the survey was revised to: (1) include instructions for students to turn over the two-page, double-sided survey; (2) increase the font of multiple choice instructions; (3) put all questions referring to “last summer” together in a box at the end of the survey; (4) delete one question that students found difficult; (5) revise the wording of several questions to clarify question meaning; (6) add a new response choice for why students do not wear sunscreen; and (7) increase the response scale for several questions from a three-point to a five-point scale.


SECTION IV – COLLECTION METHODS AND FOLLOW-UP


4(a) Collection Methods


Teacher Surveys are not anonymous and are administered online.

All student surveys will be anonymous, so no specific information on a child can be reported to parents or school staff. Student surveys are administered in the classroom setting and conducted by the teachers; thus, it would not be feasible for the teachers to obtain consent from the parents and assent from the children for a classroom teaching tool. Student surveys will be returned to EPA by one of several methods: scanned and emailed; faxed; or sent through the U.S. Postal Service using a self-addressed, postage-paid envelope supplied by EPA.


4(b) Survey Response and Follow-Up


The target response rate is approximately 3 to 4 percent among all teachers registered for the SunWise Program, although some of these teachers may no longer be teaching the Program. The actual response rate will be measured as the number of teachers who submit surveys divided by the total number of teachers signed up for the program. No additional follow-up will occur unless there are questions about the survey or additional clarification is needed on suggested improvements.
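As a minimal sketch, the response-rate calculation described above can be checked against the recruitment and participation estimates given in section 2(b)ii (35,000 educators recruited, about 1,300 expected respondents).

```python
# Sketch of the response-rate calculation, using the recruitment and
# participation estimates from section 2(b)ii.

recruited = 35000  # educators receiving recruitment emails
expected = 1300    # educators expected to complete the survey

response_rate = expected / recruited * 100  # roughly 3.7 percent
```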


SECTION V – ANALYZING AND REPORTING SURVEY RESULTS


5(a) Data Preparation


Teacher Survey data will be entered into a database hosted on the EPA server.


All student survey data will be entered into a database, including surveys with questions that have not been completed. A double-entry protocol will be observed throughout data entry to ensure accuracy.


5(b) Analysis


The data obtained through this survey will be reviewed and analyzed using descriptive statistical methods in the aggregate for the purpose of determining satisfaction; frequency of use; number of students participating; number and types of activities taught; school policy change; student and teacher knowledge, attitudes and behavior; ways to improve the program. All of this information will give EPA insight into how best to improve the program and how the program is being used. The results will not be generalizable to the total pool of SunWise teachers or students.


5(c) Reporting Results


The results of the survey will not be written up formally; rather, they will be used internally by the SunWise program to understand how the SunWise program is being implemented and how it can be improved. The raw survey data will be maintained by EPA. EPA will share the information with a contractor, but the information will remain unavailable to the public.



Part B(ii) of the Supporting Statement – Teacher Interviews


SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER

PRELIMINARIES


1(a) Survey Objectives


EPA’s SunWise Program provides sun protection education via a standardized curriculum to school children in grades K-8 in public, parochial, and charter schools. More than 25,000 schools and 3,000,000 children have received SunWise education since the 1999-2000 school year. EPA proposes to conduct customer satisfaction and process-related evaluative surveys with the teachers using the program. The teacher telephone interviews will gather qualitative information regarding:


  • Teachers’ involvement in the SunWise Program;

  • If school policies are being changed as a result of SunWise and how to overcome barriers to those changes;

  • SunWise activities and resources teachers feel are most effective;

  • How teachers’ approach to teaching SunWise activities has changed over time;

  • How the SunWise Program can more effectively disseminate its materials and recruit teachers;

  • How the SunWise Program can encourage teachers to increase their involvement in SunWise and their promotion of sun safety in schools; and


  • Receptiveness to an incentives or “Levels of SunWise” educator recognition program, and ideas for making such a program successful.


The information will be analyzed and results, although not generalizable, will indicate how the Program is being implemented and how it can improve.


1(b) Key Variables


Involvement; school policy change; frequency of use; number and types of activities taught; most effective activities and resources; interest in a recognition program; motivation for increased participation; effort involved in recognition program.


1(c) Statistical Approach


The primary objective in conducting the SunWise teacher interviews is to gather qualitative information on how the SunWise program is being implemented, and how it can be improved. Since the results are not intended to be generalizable to the complete pool of SunWise teachers, no statistical approach is needed.


1(d) Feasibility


EPA has reviewed the administrative procedures necessary to conduct the SunWise teacher interviews and has determined that it is feasible to continue with the interviews. In addition, EPA has funding to conduct the interview and provide the necessary analysis of the resulting data.


SECTION II – SURVEY DESIGN


2(a) Target Population and Coverage


The target population consists of teachers that have taught the SunWise Program in the past two years; these teachers span the United States.


2(b) Sample Design

2(b)i Sampling Frame

The sampling frame consists of all teachers registered for the SunWise Program. Interview participants will be recruited via a screening email to registered SunWise educators asking (a) whether they have taught SunWise in the past two years; (b) how many years they have been teaching SunWise; and (c) whether they are willing to both complete an online teacher survey and participate in a one-on-one telephone interview.


2(b)ii Sample Size


EPA anticipates that 200 teachers will respond positively to both screening questions (a) and (c) described in section 2(b)i above; however, only 50 will be selected to participate in the interview process.


2(b)iii Stratification Variables


None.


2(b)iv Sampling Method


As described in section 2(b)i above, interview participants will be recruited via a screening email to registered SunWise educators.


Teachers responding positively to both screening questions (a) and (c) will be grouped by region and length of participation, and 50 teachers will be randomly selected across these groups to participate in the interview process. To the extent possible, the selected teachers will represent the geographical and participation range of those responding positively to the screening questions (i.e., a sample of a sample), though the sample will not necessarily be representative in a statistical sense. Teachers who are not selected will be encouraged to take the online survey but will not be part of the group that is individually interviewed.


Inclusion criteria: Registered with the SunWise Program; has taught SunWise in the past two years; indicated willingness to participate in an interview through the screening email.


Exclusion criteria: Incomplete Teacher Survey.


2(b)v Multi-Stage Sampling


None.


2(c) Precision Requirements


2(c)i Precision Targets


N/A


2(c)ii Nonsampling Error


N/A


2(d) Questionnaire Design


An interview guide with topics of discussion (Attachment 4) was developed by a team of contractors, a grantee, and EPA staff, and reviewed by educational and survey experts.


SECTION III – PRETESTS AND PILOT TESTS


Given the semi-structured approach to the teacher interviews and the qualitative nature of the study, pilot testing is not needed. The interview guide was reviewed by educational and survey experts to ensure that the questions asked will reveal sufficient information to evaluate the implementation of the SunWise Program and how it could be improved, especially by adding an incentives or “Levels of SunWise” recognition program.


SECTION IV – COLLECTION METHODS AND FOLLOW-UP


4(a) Collection Methods


Selected participants will be asked to complete the online teacher survey, as provided in Attachment 2 and described in Part B(i) of the Supporting Statement. After the participants have completed the online survey, EPA will arrange a convenient time to interview each of the selected teachers via the telephone.


Teachers will participate in one online survey and one telephone interview per year over a three-year period, with slightly different informational goals for each year. In the first year, the interview will include discussion of the development of an educator recognition or incentives program, while interviews in later years may focus on other areas of interest, such as parental involvement. Teachers may receive an annual incentive for their participation in the survey and interview, such as a $25 gift certificate to a bookstore of their choosing to purchase classroom resources.


4(b) Survey Response and Follow-Up


The target response rate is approximately 80 percent, with a total of 50 teachers participating in the interviews. Actual response rate will be measured as the number of teachers who participate in the interviews divided by the number of teachers who indicated interest and were selected to participate. Follow-up emails and telephone calls will be made to all teachers who were selected to participate but have not responded to an initial interview invitation. These follow-ups will explain the importance of the interviews and strongly encourage teachers to participate.


In addition, if fewer than 50 positive responses to the initial screening questions are received, EPA will follow up with an additional recruitment email encouraging participation and may also recruit participants through conferences, direct mailings, and other means.


SECTION V – ANALYZING AND REPORTING SURVEY RESULTS


5(a) Data Preparation


The interviewer and/or assistant will take notes during the interview.


5(b) Analysis


The information obtained through these interviews will be analyzed qualitatively through a systematic content analysis to identify key findings based on response frequency and emphasis. While these findings will give EPA insight into how best to improve the program, and how the program is being used, the results will not be generalizable to the total pool of SunWise teachers.


5(c) Reporting Results


The results of the qualitative content analysis will be written up in a summary report, which may also be shared on the EPA website and with partners and interested parties. The individual interview notes will be maintained by EPA and/or an EPA contractor and will remain confidential.


Part B(iii) of the Supporting Statement – Certification Program


SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES

1(a) Survey Objectives

To expand the SunWise Program beyond the formal classroom, EPA has developed an online tutorial and certification questions to educate outdoor recreation staff who supervise teens and pre-teens about the importance of sun safety. Through this voluntary tutorial/certification program, EPA aims to improve sun safety both for staff and for the youth in their care. Sun safety can be promoted in a number of ways, including smart scheduling, positive role-modeling, and other policy changes at individual camps, pools, or centers.

Upon completion, each user will be able to email their certificate of accomplishment to their supervisor. Camps will then be able to state that their entire staff has completed the EPA Sun Safety Certification, meaning staff understand the importance of sun safety and know how to prevent sun damage in children.

Survey questions are included throughout the tutorial and must be answered to advance. EPA intends to meet two main objectives. The primary objective is to determine the current sun protection knowledge, attitudes, practices, intended practices, and teaching habits of outdoor educators, as well as basic demographic information. Responses to the tutorial/certification questions will also determine the environmental conditions (i.e., policies) already in place to minimize UV damage to the staff and visitors, and the perceived cultural norms of camp staff. This information will help inform SunWise program decisions such as framing a sun safety message and developing materials for pre-service teachers and outdoor educators to promote sun protection outside of the formal classroom.

The secondary objective is to gather feedback on the usefulness of the certification program. This will help the Agency determine if online media is an effective method of teaching sun safety lessons, and whether additional resources are needed for outdoor recreation facilities.

1(b) Key Variables

Key variables considered in this tutorial/certification program are demographic information about the educator (e.g., age, gender, education), the age range of children being supervised by the educator, the educator’s sun safety practices and intended practices, and policies related to sun safety implemented by the educator’s program or organization.

1(c) Statistical Approach

The primary objective of the tutorial/certification program questions is to measure the sun protection attitudes, practices, intended practices, and teaching habits of outdoor educators. Every outdoor educator that participates in the tutorial/certification will be asked to respond to the questions that are built into the online program. It is not practical, however, to require all outdoor educators to participate in the tutorial/certification program, so participation will be voluntary and self-selecting. From this group, the Agency will be able to draw conclusions about outdoor educators in general. Anecdotal information is not sufficient for this purpose, and thus EPA has chosen a standardized questionnaire-based approach for this survey.

The tutorial/certification program asks questions about outdoor educators’ sun protection attitudes, practices, intended practices, and teaching habits. EPA intends to have each educator only complete the survey once during the period for which this ICR is in effect. An analysis of these results will give a snapshot of outdoor educators’ behaviors, and will help the Agency to better understand the challenges associated with promoting sun safety in outdoor education programs.

1(d) Feasibility

EPA has determined that an online, voluntary, self-selecting tutorial/certification program is a feasible way to gather information regarding sun protection in an outdoor education setting. In addition, EPA has developed the online tutorial/certification program, and has funding to collect and review responses and perform necessary updates to the program.

SECTION II – SURVEY DESIGN

2(a) Target Population and Coverage

The target population consists of outdoor recreation staff members that supervise teens and pre-teens. Because the tutorial/certification program is self-selecting and voluntary, the coverage of this population is not known.

2(b) Sample Design

2(b)i Sampling Frame

The sampling frame is all U.S. outdoor recreation staff members at programs included in the mailing lists from the American Camp Association (ACA) and the National Recreation and Park Association (NRPA), which are updated regularly. The sampling frame will be identified by the purchase of these e-mail lists.

Survey participants will be recruited via a targeted email to all programs in the ACA and NRPA lists, which will direct participants to the EPA website. Advertisements for the tutorial/certification program will also be placed on the EPA SunWise Web site and ACA and NRPA Web sites. In addition, participants will be recruited at conferences related to sun safety and education, and EPA may also reach out to sports coaching associations for participant recruiting.

2(b)ii Sample Size

Because the tutorial/certification program is self-selecting and voluntary, it is not possible to pre-determine a specific sample size. However, EPA anticipates about 1,000 outdoor recreation staff members will complete the tutorial/certification program.

2(b)iii Stratification Variables

None.

2(b)iv Sampling Method

Because participation in the tutorial/certification program is voluntary, the sampling method is voluntary self-selection. Because the survey questions must be answered to advance through the tutorial, all outdoor educators that elect to participate in the certification program will necessarily complete the survey. There is sampling bias associated with self-selection on both an individual and an organizational level as camps or programs may require all of their educators to take the tutorial; however, because participation in the certification program is voluntary, self-selection is the only feasible sampling method available to the Agency.

2(c) Precision Requirements

2(c)i Precision Targets

Based on an assumed sample size of 1,000 participants, the Agency’s precision target is ±3.1 percentage points at a 95 percent confidence level for aggregated question responses (i.e., the estimated percentage of all potential respondents that give a certain answer to a given survey question). For question responses for sample subsets (e.g., the percentage of male respondents that give a certain answer to a given survey question), the uncertainty in the estimated percentages will be greater.
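The ±3.1 percentage point target follows from the standard worst-case margin of error for a proportion at a 95 percent confidence level. A minimal Python sketch of the arithmetic (the function name is illustrative, not from the source):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion.

    p = 0.5 is the conservative (worst-case) assumption, which yields
    the widest interval for a given sample size n.
    """
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1000)
print(f"\u00b1{moe * 100:.1f} percentage points")  # prints "±3.1 percentage points"
```

For subsets of the sample (e.g., male respondents only), n is smaller, so the same formula gives a wider interval, as the text notes.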

2(c)ii Non-sampling Error

It is expected that the largest non-sampling error will be the result of non-response.

2(d) Questionnaire Design

The tutorial/certification questions were derived from those developed in a 2008 study by Glanz et al.2 In Glanz et al. (2008), a group of investigators evaluated available questionnaire measures of sun exposure and protection in order to propose a core set of standardized survey items for a range of age groups (adults, adolescents aged 11 to 17, and children 10 years or younger). The investigators used these core questions in cognitive testing and found that they had good clarity and applicability for measuring sun exposure and sun protection behaviors across a broad range of populations. In addition, it was determined that these methods are appropriate for studies tracking morbidity and/or mortality and evaluating prevention program effects.

Based on this study, participants are asked a series of questions throughout the tutorial about themselves, their sun protection-related behavior, and their experience with the tutorial. These questions are meant as a self-assessment tool and must be answered in order to complete the tutorial. Participants are encouraged to think about how they can improve their current behavior while answering. The questions are incorporated into each of the five sections of the on-line tutorial/certification program:

  • Introduction: provides a brief overview of the tutorial.

  • UV Basics: emphasizes the importance of knowing the facts about UV radiation.

  • Louder than Words: describes sun safe actions and provides information on sun protection attire and SPF sunscreen.

  • Do As I Do: provides insight into participants’ ability to influence young people, and offers suggestions for modeling sun safe behaviors.


  • Before You Go: summarizes information from the tutorial, provides links to additional resources, and presents questions about participants’ intentions to follow suggestions presented in the tutorial.

The training is expected to take approximately 45 minutes to complete, of which approximately 7 minutes will be for answering the survey questions.

SECTION III – PRETESTS AND PILOT TESTS

No pilot testing of the tutorial/certification questions is planned.

SECTION IV – COLLECTION METHODS AND FOLLOW-UP

4(a) Collection Methods

Tutorial/certification program participants will complete the tutorial/certification online through the EPA SunWise Program website, and responses will be automatically submitted electronically upon completion. All tutorials/certifications are anonymous, and no personal information will be stored.

4(b) Survey Response and Follow-Up

Because outdoor educators self-select to complete the tutorial and certification questions, no response rate will be specifically measured, although EPA will keep count of the number of respondents. No follow-up will be performed with outdoor educators that complete the tutorial/certification program.

SECTION V – ANALYZING AND REPORTING SURVEY RESULTS

5(a) Data Preparation

No personal information will be stored. Responses to questions presented in the tutorial will be collected and stored automatically upon completion, and will be reviewed by EPA to help determine necessary changes to make to the tutorial and program.

5(b) Analysis

Data obtained through the tutorial/certification will help EPA gain a better understanding of adjustments to make to the SunWise tutorial/certification program to better educate outdoor recreation staff. In addition, responses will provide EPA with an idea of the types of sun safety policies implemented in outdoor education programs, policy effectiveness, as well as possible areas of improvement to increase sun protection.

Basic descriptive statistical analyses will be used to describe and present a basic summary of responses collected and organize them in a logical manner. Data will be aggregated to calculate, for example, the percentage of outdoor educators that practice a certain sun safety behavior, or the mean number of times outdoor educators remind teens and pre-teens in their care about sun protection, along with 95% confidence intervals. Data will also be parsed for comparison by different demographic or other categorical variables in order to compare, for example, the percentages of male versus female outdoor educators that remind children to protect themselves from the sun. Differences in percentages will be calculated together with 95 percent confidence intervals and statistical tests to determine whether the differences are statistically significant.
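As an illustration of the subgroup comparison described above, the following Python sketch computes a difference in proportions with a 95 percent confidence interval and a two-sided z-test. The counts and the function name are hypothetical, not data from the survey:

```python
import math

def two_proportion_test(x1, n1, x2, n2, z=1.96):
    """Difference of two sample proportions with a 95% CI and z-test.

    Returns (difference, CI low, CI high, z statistic, two-sided p-value).
    """
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    # Unpooled standard error for the confidence interval on the difference
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    # Pooled standard error for the hypothesis test (H0: p1 == p2)
    p_pool = (x1 + x2) / (n1 + n2)
    se_pool = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z_stat = diff / se_pool
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z_stat) / math.sqrt(2))))
    return diff, diff - z * se, diff + z * se, z_stat, p_value

# Hypothetical counts: 300 of 450 female vs. 250 of 550 male educators
# report reminding children in their care about sun protection.
diff, ci_low, ci_high, z_stat, p = two_proportion_test(300, 450, 250, 550)
```

If the 95 percent confidence interval for the difference excludes zero (equivalently, the p-value falls below 0.05), the subgroup difference would be reported as statistically significant.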

5(c) Reporting Results

Information from tutorial/certification responses will serve to inform EPA of program areas needing improvement to better reach the target audience, as well as increase sun safety awareness and practices in informal and outdoor education settings.

Raw survey data will be kept confidential. EPA plans to share aggregated findings with partners and the public to improve sun safety education.


Part B(iv) of the Supporting Statement – Pretesting the Partner Survey


SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER

PRELIMINARIES


1(a) Survey Objectives


EPA’s SunWise Program provides sun protection education via a standardized curriculum and other resources (e.g., the Sun Safety Tutorial) to children in grades K-8 through registered 501(c)(3) organizations such as science centers and camps, children's museums, and scouting groups, as well as other not-for-profit organizations such as local, county, and state health, recreation, and education departments. About 4,600 informal education centers have registered with SunWise since the 1999-2000 school year.


EPA will undertake pretesting of a survey for non-school partners participating in the SunWise Program. These partners may include state and local health departments, childcare centers, museums, camps, and science centers. The purpose of the survey will be to gain a general understanding of how non-school partners are interacting with the SunWise Program, and in particular to determine:


  • How and how often partners are using the SunWise materials, resources, and programming;

  • How many children are receiving SunWise education through non-school partners;

  • Children’s satisfaction with SunWise activities and resources;

  • Partners’ satisfaction with SunWise activities and resources;

  • If partners are sharing resources with other partners;

  • If partner organizations’ sun safety policies are being changed as a result of SunWise;

  • If partners have suggestions for improving or creating new SunWise resources.


The pretesting is intended to determine the validity, reliability, and effectiveness of the survey questions—e.g., whether questions measure what they are supposed to measure, whether partners understand what the questions are asking, and whether the questions are the right questions to gain a better understanding of how partners are interacting with the SunWise Program.


1(b) Key Variables


Understandability, validity, reliability, and effectiveness of questions; right questions for understanding partner participation in SunWise.


1(c) Statistical Approach


The primary objective of the pretesting is to gather information that will help EPA develop a new questionnaire for SunWise non-school partners, and to determine the validity, reliability, and effectiveness of initial draft questions. No statistical approach is needed.


1(d) Feasibility


EPA has reviewed the administrative procedures necessary to pretest the SunWise partner survey and has determined that it is feasible to continue with the pretesting. EPA has funding to conduct the pretesting and use the resulting input to revise the partner survey.


SECTION II – SURVEY DESIGN


2(a) Target Population and Coverage


The target population includes all non-school partners registered with the SunWise Program, or approximately 4,600 organizations. These organizations include 501(c)(3) organizations such as science centers and camps, children's museums, and scouting groups, as well as other not-for-profit organizations like local, county and state health, recreation and education departments.


2(b) Sample Design

2(b)i Sampling Frame

Partners register for the SunWise program through EPA. Registrants provide their name and contact information, including the name of their organization, and state what their role is. EPA will send a recruitment email to all registered SunWise partners in the Spring/Summer timeframe encouraging them to participate in the pilot testing of the partner survey (Attachment 5). Participants may also be recruited through additional avenues, such as recruitment letters distributed through the SunWise Tool Kit, educator conferences, or direct mailings. However, many may choose not to participate.


2(b)ii Sample Size


EPA anticipates sending a recruitment email to about 4,600 non-school partners; however, only 30 will be selected for participation in the survey pretesting.


2(b)iii Stratification Variables


None.


2(b)iv Sampling Method


From partners responding positively to the recruitment email, EPA will sort the willing partners into types of organizations (e.g., health, recreation, and educational departments; childcare centers; camps and scouting groups; and educational centers such as museums or science centers) and randomly select participants from each group for a total of 30 participants. The number of participants selected from each group will be weighted based on the overall composition of SunWise registered partners.


Inclusion criteria: Registered with the SunWise program.


Exclusion criteria: Non-response.


2(b)v Multi-Stage Sampling


None.


2(c) Precision Requirements


2(c)i Precision Targets


N/A


2(c)ii Nonsampling Error


N/A


2(d) Questionnaire Design


The partner survey for pretesting was derived from a SunWise teacher instrument previously approved by OMB on November 2, 2001, and April 15, 2008 (ICR #1904.01 and #1904.04). Because this information collection consists of pretesting with the goal of developing a new survey for SunWise non-school partners, the design of the survey will necessarily change through the pretesting process, and additional or different questions may be pretested with partners. The draft survey for pretesting is provided in Attachment 5.


SECTION III – PRETESTS AND PILOT TESTS


This information collection consists of pretesting for the development of a new survey for SunWise non-school partners.


SECTION IV – COLLECTION METHODS AND FOLLOW-UP


4(a) Collection Methods


Depending on available resources and other constraints, the survey may be self-administered with feedback gathered from each participant over the telephone, or the survey may be administered in-person either in an individual or group setting, with feedback gathered through in-person interviews. In either case, participants will be asked if they understood all questions, whether the survey was easy to complete, and whether there were questions they would suggest removing or adding to better reflect the participation of partners in the Program. Based on this feedback, EPA will revise the partner survey. EPA will also measure the time it takes for each respondent to complete the survey (estimated at less than 20 minutes).


4(b) Survey Response and Follow-Up


The target response rate is approximately 80 percent. Actual response rate will be measured based on the number of partners that participate in the pretesting divided by the number of partners that indicated interest and were selected to participate. Follow-up emails and telephone calls will be made to all partners who were selected to participate but have not responded to an initial invitation. These follow-ups will explain the importance of the pretesting and strongly encourage partners to participate.


SECTION V – ANALYZING AND REPORTING SURVEY RESULTS


5(a) Data Preparation


All survey data will be entered into a database. A double-entry protocol will be observed throughout data entry to ensure accuracy.
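A double-entry protocol flags records where two independent data entries disagree so they can be checked against the original form. A minimal sketch of how that reconciliation might work (the records and field names are hypothetical):

```python
def double_entry_mismatches(entry_a, entry_b):
    """Compare two independent entries of the same records field by field.

    entry_a and entry_b are lists of dicts keyed by field name, in the
    same record order. Returns (record index, field, value A, value B)
    for every disagreement, to be resolved against the source document.
    """
    mismatches = []
    for i, (a, b) in enumerate(zip(entry_a, entry_b)):
        for field in a.keys() | b.keys():  # union of field names
            if a.get(field) != b.get(field):
                mismatches.append((i, field, a.get(field), b.get(field)))
    return mismatches

# Hypothetical double-entered survey records
first_pass = [{"q1": "yes", "q2": 3}, {"q1": "no", "q2": 5}]
second_pass = [{"q1": "yes", "q2": 3}, {"q1": "no", "q2": 4}]
```

Here the second record would be flagged on field "q2" and corrected by consulting the original survey form.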


During interviews with partners following pretesting of the survey, the interviewer and/or assistant will take detailed notes.


5(b) Analysis


The survey data and qualitative input obtained through the pretesting will be aggregated and analyzed for the purpose of determining the validity, reliability, and effectiveness of survey questions. All of this information will be used by EPA to revise the survey questions.


5(c) Reporting Results


The results of the pretesting will be used by EPA to revise the partner survey. The raw survey and interview data will be maintained by EPA and/or an EPA contractor and will remain unavailable to the public.



2 Glanz K, Yaroch AL, Dancel M, Saraiya M, Crane LA, Buller DB, Manne S, O’Riordan DL, Heckman CJ, Hay J, Robinson JK. Measures of Sun Exposure and Sun Protection Practices for Behavioral and Epidemiologic Research. Archives of Dermatology. 2008;144(2):217-222.
