
Supporting Statement B


Assessment Tools for Park-Based Youth Education

and Employment Experience Programs

at Santa Monica Mountains National Recreation Area


OMB Control Number 1024-NEW


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The Santa Monica Mountains National Recreation Area (SAMO) is the nation's largest urban national park, located adjacent to Los Angeles, CA, the second-largest urban area in the United States. This unique setting provides a variety of outdoor activities for over 33 million annual visitors. National parks are places where people of all ages can learn about biology, ecology, history, geology, anthropology, and more. We plan to collect information from an estimated 1,224 respondents each year. The park hosts at least 100 classroom field trips each year; these field trips constitute the universe from which the respondents (teachers and students) will be drawn.


Target Populations: The target population varies by program, but most participants are school children and their teachers from the private and public schools within the school districts of Los Angeles and Ventura counties. Respondents to the SAMO Youth Alumni survey, while originally from the Los Angeles region, are now dispersed throughout the region and the nation. The EcoHelpers survey samples a diverse population that includes a wide range of ethnicities, as shown in Appendices A and B.

Sampling Units: The sampling units for each program are as follows:

  • SHRUB - all fifth-grade students and their teachers from the three elementary schools participating in the science-based SHRUB program.

  • EcoHelpers - the sampling units are high school students and their science teachers (grades 9-12) from high schools in the 13 local school districts in the region participating in the EcoHelpers site visits.

  • SAMO Youth and Alumni - the sampling units are individuals who are current or former participants in this youth employment program over the past eleven years.


Sample Frame: The sampling frames for each program are as follows:


  • SHRUB: For the purposes of this collection, we will survey all of the fifth-grade students and teachers participating in the SHRUB program. The program hosts three elementary schools. Two of the three schools are within the immediate vicinity of the park, while the third school is located an hour away in Los Angeles. We will survey all students and teachers participating in the 2013-2014 SHRUB program (n = 240). We anticipate an 80% response rate among students (accounting for students absent during either the pre- or the post-survey administration) and a 100% response rate from teachers responding to the post-visit survey. This is a population survey, not a sample.


  • EcoHelpers: Approximately 60 high school classes participate in the EcoHelpers program in a given school year (n = 1,800 students, estimating 30 students per class). We will randomly select 15 high school classes (25% of the total population) to participate during the 2013-2014 school year (n = 450). The sampling procedure for EcoHelpers will be a systematic random selection (every 4th class), with replacement of selected teachers who decline to participate (i.e., selecting the next teacher on the list). Because teachers sign up continually throughout the school year, a complete sample of classrooms cannot be drawn at the beginning of the school year. Instead, at the beginning of each school year, a coin toss will determine whether selection starts with the first or the second classroom/teacher on the list. The procedure will then continue by selecting every 4th classroom until 15 classes have been selected. We believe this will provide a representative sample that is also easy for park staff to manage. We anticipate an 80% (n = 12) response rate from EcoHelpers teachers and an 80% (n = 360) response rate from their students.

  • SAMO Youth Program

Youth Survey: Approximately 10 students participate in this program each year. We will ask all participants to take the pre-survey prior to starting the program and the post-survey prior to their last day of participation in the program. We expect 100% of the respondents to complete both surveys (n = 20 responses).


Alumni Survey: Park staff has maintained contact information for all former participants (n=120). We will use this information to initiate the survey process. We will send a letter inviting the alumni to participate in the survey; this letter will serve as our address check method to screen for bad addresses. We anticipate that 67% (n=80) of the program participants will respond to the survey request.


The estimated response rates (Table 1) are based on results from similar efforts undertaken by members of the study team. The SHRUB and SAMO Youth surveys are population surveys, not samples.


Table 1. Respondent universe and expected number of annual responses


Program Participants | Total number of potential participants | Total number of expected responses
SHRUB Program | |
Pre-visit survey - Students | 240 | 192
Post-visit surveys - Students | 240 | 192
Post-visit surveys - Teachers | 8 | 8
EcoHelpers Program | |
Pre-visit survey - Students | 450 | 360
Post-visit surveys - Students | 450 | 360
Post-visit surveys - Teachers | 15 | 12
SAMO Youth Program | |
Pre- and post-program youth surveys | 20 | 20
Alumni survey | 120 | 80
TOTAL | 1,543 | 1,224

Expected Response Rates


SHRUB Program

The expected response rate for the SHRUB Program is 80% (n = 192), allowing for any students unavailable during the data collection periods due to absence or other reasons. We expect all (100%) of the teachers to complete the survey. These estimates are based on previous research in similar classrooms and discussions with the program teachers.


EcoHelpers Program

We anticipate an 80% response rate among both students (n = 360) and teachers (n = 12). These expected response rates are based on similar studies conducted by the Center for Education and Evaluation Services (CEES) in which high school students and teachers were surveyed as part of their normal classroom activities. This expectation allows for students and teachers who are unavailable during the data collection periods due to absence or other reasons.


SAMO Youth Program

Youth Survey: We will ask all participants (n=10) to take the pre-survey prior to starting the program and the post-survey before their last day in the program. We expect a 100% response rate.


Alumni Survey - We expect a 67% response rate (n = 80). Over the past 11 years, there have been a total of 120 SAMO Youth participants, and we will attempt to contact all of them for the Alumni Survey. Assuming that we have valid contact information for 100 alumni (83%) and that 80% of those contacted will respond, we expect to receive 80 completed responses. While general social science research suggests that a 30% response rate is typical for this type of data collection, we assume, based on park staff's personal experience and on-going communication with previous students, that prior SAMO Youth participants will be likely to respond to the survey.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The sample size chosen for the EcoHelpers segment of this study will result in a margin of error of approximately +/- 4% (at the 95% confidence level); the SHRUB and SAMO Youth segments are censuses of their respective populations. This degree of accuracy will be more than sufficient to meet the needs of this study. Unusual problems are not anticipated, and periodic data collection cycles to reduce burden will not be used. The general strategies detailed above are consistent with accepted survey practice (Dillman 2010).
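For reference, below is a minimal sketch of the standard finite-population margin-of-error calculation behind this estimate, assuming a worst-case proportion of p = 0.5; the function name is ours and the printed figures are illustrative, not part of the approved design:

    import math

    def margin_of_error(n, N, p=0.5, z=1.96):
        """Margin of error for a proportion at the 95% confidence level,
        with a finite population correction (FPC)."""
        se = math.sqrt(p * (1 - p) / n)      # standard error of the proportion
        fpc = math.sqrt((N - n) / (N - 1))   # finite population correction
        return z * se * fpc

    # EcoHelpers: 450 students sampled from roughly 1,800 annual participants
    print(round(margin_of_error(450, 1800), 3))   # ~0.04, i.e., +/- 4%
    # If only the expected 360 surveys are completed, the interval widens slightly
    print(round(margin_of_error(360, 1800), 3))   # ~0.046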


SHRUB Program

Classrooms participating in the SHRUB program are determined prior to the beginning of each school year. The SHRUB program accepts 5th grade classrooms from EARTHS Magnet School, Third Street Elementary, and Conejo Elementary School. There will be no sampling for SHRUB – the entire population of teachers and their students participating in the program will be invited to respond to the surveys in this collection. We anticipate that all teachers will respond to the survey and 80% of their students will do the same.

EcoHelpers Program

At least four weeks prior to the field trip, teachers are required to register for at least three preferred field trip dates. We will assign the dates and use the registration list to randomly sample the participants for the study. The starting point, either the first or the second class on the list, will be determined by a simple coin toss at the start of each school year. After the first class is selected, every fourth class from that starting point will be selected until a total of 15 classes have been scheduled to participate in the study. If a selected teacher declines to participate, the next classroom on the list will be substituted. This will result in a sample of 25% of all participants (about 450 students, of whom roughly 360 are expected to respond), which yields a margin of error of +/- 4% at a 95% confidence level, as calculated above.
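As an illustration only, here is a minimal sketch of this systematic selection with substitution. The class names and the declines set are hypothetical, and the sketch simplifies the real process, in which selection proceeds as classes register throughout the year:

    import random

    def select_classes(registered, declines, target=15, step=4):
        """Select every `step`-th class from the registration list, starting
        at position 0 or 1 (coin toss), substituting the next class on the
        list whenever a selected teacher declines to participate."""
        start = random.randint(0, 1)   # coin toss: first or second class on the list
        selected = []
        i = start
        while len(selected) < target and i < len(registered):
            j = i
            # replace a decliner (or an already-chosen class) with the next one
            while j < len(registered) and (registered[j] in declines
                                           or registered[j] in selected):
                j += 1
            if j < len(registered):
                selected.append(registered[j])
            i += step
        return selected

    classes = [f"class_{k:02d}" for k in range(60)]   # ~60 classes per school year
    print(select_classes(classes, declines={"class_08"}))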


It is anticipated that both teachers and students in the EcoHelpers and SHRUB programs will complete the survey during normal classroom hours, using either an electronic or a hard-copy format. We will offer hard copies of the survey as an alternative mode because we have been informed that several of the schools in our population lack computer labs with internet access. Teachers will be asked which format they would like to use.


We will contact the teachers who agreed to participate via email (or mail) approximately two weeks before their class is scheduled to participate in the program (pre-visit survey) and one week after they complete the on-site program (post-visit survey). They will have the option of completing the on-line or the mail-back version of the survey. All hard-copy surveys will be accompanied by a pre-addressed return envelope. Teachers who do not complete the on-line post-survey, and/or whose students do not complete their post-surveys, within 1 week of completing the program will receive an email reminder the following week (2 weeks from completion of the program). NPS will mail or email a follow-up reminder to teachers who do not respond (or who have not returned their student surveys) within 3 weeks of completion of the program, and may also phone non-responding teachers to ensure that surveys have been delivered and to ascertain whether any additional assistance from NPS is needed to facilitate the return of completed surveys. Finally, the data collection period will conclude three weeks after the second survey mailing or electronic reminder.
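For reference, a minimal sketch of this contact timeline relative to a single class's program date follows; the example date and function name are illustrative, and the closing date assumes the final reminder goes out three weeks after the program:

    from datetime import date, timedelta

    def contact_schedule(program_date):
        """Key contact dates for one participating class, per the plan above."""
        return {
            "pre-visit survey sent":     program_date - timedelta(weeks=2),
            "post-visit survey sent":    program_date + timedelta(weeks=1),
            "email reminder":            program_date + timedelta(weeks=2),
            "follow-up reminder / call": program_date + timedelta(weeks=3),
            "data collection closes":    program_date + timedelta(weeks=6),
        }

    for step, when in contact_schedule(date(2013, 10, 15)).items():
        print(f"{step:>26}: {when}")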


SAMO Youth Program

This will be a population survey. All SAMO Youth and alumni will be invited to participate in the survey. Based on park staff's personal experience and on-going communication with program participants, we anticipate that SAMO participants will be motivated to respond to the survey.


Because we have not yet tested the accuracy of available contact information we cannot accurately estimate what proportion of the SAMO Youth alumni will actually be “reachable.” For purposes of this submission, we are assuming that 67% of the former participants will be both “reachable” and will respond.


3. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


We will use the following methods to maximize the response rate for this collection:


  • Careful Survey Design and Pre-Testing - The surveys were developed and rigorously pre-tested. The questions are worded in a manner that is easy to understand and grade-level appropriate, and they are organized in a logical order. See details in Section 4 below.


  • Best-Practice Implementation Sequence - We will follow standard survey practice to ensure the best response rates:


  • Pre-visit survey request letters on NPS letterhead, signed by the Park Program Managers for each of the programs, will be sent to all teachers (SHRUB and EcoHelpers) and to the SAMO Youth program alumni. The letters will explain the purpose and significance of the survey, and the procedures for completing the electronic version or requesting hard copies of the questionnaire.

  • A copy of a letter explaining the content and purpose of the survey to parents will be sent to participating teachers, to be distributed to their students' parents/guardians along with the field trip permission packets.

  • SurveyMonkey™ will automatically notify all non-respondents (via email to teachers) 5, 10, and 15 days following the initial on-line survey notice. During this time, the survey team will send a reminder notice and a second copy of the survey to all adult non-respondents (teachers and SAMO Youth alumni) who requested the mail-back version of the survey.

  • Three weeks after the second survey mailing or electronic reminder, the data collection period will conclude.


Identifying Possible Nonresponse Bias


We propose three procedures for investigating potential nonresponse bias in this study:


  1. Late Responders - We will compare respondent characteristics (e.g., demographics, school location; see Appendices A and B) across individuals or classes who returned their surveys at different times during the data collection period. We will compare individuals/classes who returned their surveys after the first mailing or invitation with those who responded after the second and third mailings/reminders. Although all of these individuals are considered responders, those who respond later may share important characteristics with non-responders.

  2. Non-respondent Follow-Up Surveys - Because the student surveys will be administered by the teachers as a classroom assignment, we anticipate a very low non-response rate. In the event that a teacher refuses to participate in the survey, we will contact the teacher to determine the reason for their non-participation (which will be recorded). Once addressed, we will either resend the survey information to the teacher (if they have changed their mind about participating) or randomly select the next classroom (for EcoHelpers) in the sample to replace the non-respondent. Because we are surveying the entire population of SHRUB and SAMO Youth alumni participants, there is no replacement strategy for non-respondents in these programs.

Because SHRUB and EcoHelpers respondents participate in pre- and post-visit surveys, we will carefully explain to selected teachers that the same teachers are expected to work with their students to complete both the pre- and post-visit surveys. We will explain that the pre-visit surveys must be completed and returned prior to the field trip and that teachers should be prepared to administer the post-visit surveys within two weeks following their field trip. The participating teachers will be expected to complete the post-visit survey.

We do not have a separate non-response follow-up survey. However, if the numbers of non-respondent pre- and post-visit student surveys vary widely (by more than 30%) for any individual classroom, research staff will contact the classroom teacher to determine whether the non-response represents a systematic bias. This contact will also be used to ask the teacher whether all post-visit surveys have in fact been sent, or whether they would consider administering the post-visit survey to a sub-group (10% of the students who attended the field trip). We will use the sub-group's responses to determine how it may have differed from the full group.

  3. Non-respondents among SAMO Youth Alumni

Alumni Survey - As described in earlier sections, we anticipate some non-response bias in the initial administration of the SAMO Youth alumni survey because we are more likely to have inaccurate contact information for participants from the early years of the program. We therefore expect to receive more responses from recent participants, and the data may not reflect some of the possible deficits of the program's early years.

We will test for any statistically significant differences in demographics between (a) early and late respondents and, (b) where demographic information is available, between respondents and non-respondents. For the programs involving school field trips, we will also examine response rates as they relate to school-level characteristics (e.g., demographics, achievement patterns) and to paper versus web-based administration, to assess potential non-response bias. Finally, we will examine patterns of refusal to participate (in school-based programs) to determine whether there is evidence of a systematic bias related to school location or other available information about teachers or schools. Expressed reasons for declining to participate will also be examined to inform future survey administrations. A sketch of the early-versus-late comparison appears below.
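As an illustration, here is a minimal sketch of the early-versus-late responder comparison using a chi-square test of association; the data frame, column names, and records are hypothetical:

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical respondent records: the contact wave in which each survey
    # came back (first mailing, second, or third) and a demographic variable.
    df = pd.DataFrame({
        "wave":      ["first", "first", "second", "third", "second", "first",
                      "third", "second", "first", "first", "second", "third"],
        "ethnicity": ["Hispanic", "White", "White", "Hispanic", "Asian", "White",
                      "White", "Hispanic", "Asian", "Hispanic", "White", "Asian"],
    })

    # Cross-tabulate response wave by demographic group; a significant result
    # suggests late responders (a proxy for non-responders) differ from early ones.
    table = pd.crosstab(df["wave"], df["ethnicity"])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")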


Multiple Options for Completing the Survey


Respondents will have the option of completing either an on-line or a paper version of the survey. It is increasingly accepted to administer on-line surveys as an alternative to other survey modes such as mail or telephone (Couper 2000). Researchers often use on-line surveys to decrease costs, increase the speed of data collection, increase response rates by providing additional modes for response, decrease non-response error, and reduce data entry errors (Dillman 2010; Schaefer and Dillman 1998).


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


We have worked with the Center for Education and Evaluation Services (CEES) at the University of California, Davis, to design and develop the instruments associated with this collection. CEES is a full-service program evaluation center providing consultation; telephone, mail, and web-based surveys; statistical analyses; and other essential data functions for virtually every area of social research, evaluation, and policy analysis. CEES has conducted a wide variety of projects with local, state, and federal agencies in a number of policy and education content areas.


Initial drafts of the instruments were developed and informed by an extensive review of similar surveys with similar populations and goals, and by collaboration between the UC Davis study team and SAMO program staff. All versions of the survey instruments were reviewed and revised based on feedback from the SAMO program staff and the CEES design team.


In the spring of 2011, UC Davis pilot-tested draft versions of each survey with a small sample of local Sacramento-area students and teachers (n ≤ 9). Each participant completed a hard-copy version of the survey and was asked to review and provide comments concerning the overall structure, sequence, time to complete, and clarity of the questions. The key objectives of the pre-test were to evaluate the respondents' ability to understand the questions and to determine whether the survey and design parameters functioned properly prior to implementation. Pilot respondents were asked to identify ambiguous and/or confusing wording or instructions, and the instruments were then modified based on their feedback and recommendations. The CEES design team then re-tested the modified instruments with another sample of SAMO program participants (teachers and students; n ≤ 9), this time using the web-based format. Additional minor wording changes were made for clarity.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


  1. Dr. Theresa Westover, Director

Center for Education and Evaluation Services

University of California Davis

1 Shields Avenue

Davis, CA 95616

Ph: (530)754-9523

Email: [email protected]


  2. Alyssa Okita, Evaluation Analyst

Center for Education and Evaluation Services

University of California Davis

1 Shields Avenue

Davis, CA 95616

Ph: (530)752-1350

Email: [email protected]


  3. Dr. Francisco J. Herrera Jr., Institutional Research Analyst

UC Santa Barbara, Graduate Division

3117 Cheadle Hall

Santa Barbara, CA 93106-2070

Ph: (805)893-4231

Email: [email protected]


  4. Stephanie Au, Program Evaluation Coordinator

Center for Education and Evaluation Services

University of California Davis

1 Shields Avenue

Davis, CA 95616

Ph: (530)752-2618

Email: [email protected]


Appendix A


EcoHelpers Program District Demographics


District | Hispanic or Latino | American Indian/Alaska Native | Asian | Pacific Islander | Filipino | African American | White | Two or More Races | Not Reported
Inglewood USD | 58.36% | 0.35% | 0.38% | 0.39% | 0.31% | 39.31% | 0.46% | 0.13% | 0.32%
Los Angeles USD | 73.36% | 0.46% | 4.03% | 0.38% | 2.12% | 9.53% | 9.04% | 0.09% | 0.99%
Santa Monica-Malibu USD | 29.95% | 0.26% | 5.93% | 0.27% | 0.61% | 6.50% | 50.60% | 5.44% | 0.44%
Conejo USD | 22.19% | 0.55% | 9.00% | 0.20% | 1.29% | 1.55% | 62.60% | 2.63% | 0.00%
Las Virgenes USD | 8.56% | 0.42% | 7.12% | 0.11% | 0.68% | 1.99% | 78.97% | 2.13% | 0.02%
Moorpark USD | 39.74% | 0.31% | 5.77% | 0.25% | 1.38% | 1.51% | 48.57% | 2.45% | 0.01%
Oak Park USD | 5.78% | 0.38% | 14.11% | 0.10% | 0.86% | 1.71% | 75.80% | 1.26% | 0.00%
Ojai USD | 33.19% | 0.61% | 2.04% | 0.34% | 0.72% | 1.09% | 60.78% | 1.19% | 0.03%
Oxnard Union High School District | 72.38% | 0.32% | 2.56% | 0.36% | 4.23% | 2.39% | 16.81% | 0.91% | 0.04%
Santa Paula Union High School District | 94.16% | 0.38% | 0.25% | 0.19% | 0.00% | 0.25% | 4.39% | 0.38% | 0.00%
Saugus Union School District | 27.73% | 0.30% | 9.23% | 0.27% | 5.01% | 3.75% | 52.24% | 1.42% | 0.05%
Simi Valley USD | 29.11% | 0.52% | 7.28% | 0.14% | 1.72% | 1.14% | 57.86% | 2.20% | 0.03%
Ventura USD | 47.07% | 0.61% | 2.65% | 0.25% | 0.67% | 1.54% | 44.07% | 3.15% | 0.00%






Appendix B


SHRUB Program School Demographics


School | Hispanic or Latino | American Indian/Alaska Native | Asian | Pacific Islander | Filipino | African American | White | Two or More Races | Not Reported
Third Street Elementary | 9.02% | 0.54% | 54.10% | 0.27% | 1.75% | 7.67% | 26.65% | 0.00% | 0.00%
Conejo Elementary | 75.71% | 0.00% | 2.86% | 0.00% | 0.00% | 2.38% | 17.14% | 1.90% | 0.00%
EARTHS Magnet | 27.51% | 0.00% | 12.83% | 0.56% | 0.93% | 0.93% | 52.79% | 4.46% | 0.00%





References

Couper, M. P. 2000. Web surveys: A review of issues and approaches. Public Opinion Quarterly 64(4): 464-494.

Dillman, D. A. 2010. Mail and Internet Surveys: The Tailored Design Method, 2nd ed. Hoboken, NJ: John Wiley & Sons.

Schaefer, D., and D. A. Dillman. 1998. Development of a standard e-mail methodology: Results of an experiment. Public Opinion Quarterly 62(3): 378-397.


