
Supporting Statement B

National Park Service Survey of Educators


OMB Control Number: 1024-NEW



Collections of Information Employing Statistical Methods

1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Respondent Universe

The respondent universe for this collection will include 3rd-5th grade teachers from 16 NPS sites who have participated in NPS Education Programs in the past five years.

To determine the NPS sites to participate in the study, the research team (NPS Interpretation, Education, and Volunteers Directorate; RRC Associates; and the University of Montana) conducted a call facilitated by the NPS that included representatives from NPS sites with educational programs. There were 74 participants (NPS educators and NPS education managers) on that call, representing at least 30 NPS units. The research team explained the goal of the project and asked sites to email the research team if they would like to participate in the study. After a follow-up email to all sites, 16 NPS units confirmed their participation. The research team then communicated with each site to determine how many teachers could participate in the study (eligibility based on whether teachers have participated in 3rd-5th grade NPS Education Programs at the site in the past five years). Table 1.1 provides details about participating NPS units and the teachers who are eligible for the study (the respondent universe), based on the information received from the NPS sites.



Table 1.1. Respondent universe and sample for teachers who have participated in NPS Education Programs

| NPS Unit | Respondent Universe (# of eligible teachers) | Expected Respondents (30% response rate) |
| --- | --- | --- |
| Yellowstone National Park | 500 | 150 |
| Rocky Mountain National Park | 90 | 27 |
| Colorado National Monument | 150 | 45 |
| Great Smoky Mountains National Park | 580 | 174 |
| Hawai’i Volcanoes National Park | 80 | 24 |
| Joshua Tree National Park | 70 | 21 |
| Cuyahoga Valley National Park | 100 | 30 |
| Arches and Canyonlands National Parks | 18 | 5 |
| Santa Monica Mountains National Recreation Area | 300 | 90 |
| Natchez Trace Parkway | 257 | 77 |
| Andersonville National Historic Site | 250 | 75 |
| Klondike Gold Rush National Historical Park - Alaska | 3 | 1 |
| Abraham Lincoln Birthplace National Historical Park | 50 | 15 |
| George Washington Carver National Monument | 555 | 167 |
| Richmond National Battlefield Park | 55 | 17 |
| Maggie L. Walker National Historic Site | 20 | 6 |
| Total | 3,078 | 924 |



The total respondent universe for the survey is 3,078 teachers. While this is a convenience sample, it represents parks of various geographical locations, park types (i.e., historic, natural, recreational), park sizes, and education program sizes. Findings should therefore be broadly informative for NPS education programs systemwide.

Response Rate

We anticipate at least a 30% response rate for this survey, as potential respondents have experience with the NPS and will receive an introductory email from their affiliated NPS site explaining the goal of the study and the importance of their input. We will contact a total of 3,078 teachers and invite them to participate in the study. Assuming a 30% response rate, we expect 924 completed surveys. Of the 2,154 remaining teachers contacted, we anticipate that 25% (n=539) will answer the non-response bias questions. Table 1.2 shows the expected total contacts, number of respondents, and number of non-respondents.

Table 1.2. Anticipated Survey Response Rates

| Total Number of Contacts | Completed Surveys (30% of contacts) | Refusals (70% of contacts) | Completed Non-Response Surveys (25% of refusals) | Hard Refusals (75% of refusals) |
| --- | --- | --- | --- | --- |
| 3,078 | 924 | 2,154 | 539 | 1,615 |
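The counts in Table 1.2 follow directly from the stated rates. As a cross-check, here is a minimal arithmetic sketch in Python; it assumes, as the table suggests, that fractional expected counts are rounded up to the next whole respondent:

```python
# Cross-check of Table 1.2 under the stated rates (fractions rounded up).
import math

contacts = 3078
completed = math.ceil(contacts * 0.30)       # 923.4 -> 924 completed surveys
refusals = contacts - completed              # 2,154 teachers who decline
nr_completed = math.ceil(refusals * 0.25)    # 538.5 -> 539 non-response surveys
hard_refusals = refusals - nr_completed      # 1,615 hard refusals

print(completed, refusals, nr_completed, hard_refusals)  # 924 2154 539 1615
```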


Menon and Muraleedharan (2020) suggest that response rates to email surveys are highly variable and traditionally range between 25% and 30%. They also note that personalized invitations and similar strategies have been found to increase response rates. We will follow Dillman’s Tailored Design Method (Dillman et al., 2014) in providing multiple reminders to encourage response.


2. Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


2.1 Statistical methodology for stratification and sample selection

This is a convenience sample based on the willingness of parks to participate in the collection (Table 1.1). General information about the study was presented by the NPS and the research team at a roundtable in December 2022 and distributed to the NPS education program network via newsletter. Sixteen parks of various geographical locations, park types (i.e., historic, natural, recreational), park sizes, and education program sizes are included in the sample.


2.2 Unusual problems requiring specialized sampling procedures

No unusual procedures beyond the efforts described above in 2.1.

2.3 Estimation procedure, and degree of accuracy needed for the purpose described in the justification 

For the sample in this collection effort, the research team will target more than 342 completed surveys to meet sample size needs (see the sample size equations below).


The minimum sample size (n) for the participating teacher sample was calculated using the formulas below.¹ We are targeting more than the minimum required number of completed surveys to ensure an adequate sample for each question, given that some questions are not asked of all respondents.

Unlimited population:

$$ n = \frac{z^2 \, p(1-p)}{\varepsilon^2} $$

Finite population correction:

$$ n' = \frac{n}{1 + \dfrac{z^2 \, p(1-p)}{\varepsilon^2 N}} $$

Where:

  • Z-score (based on a 0.95 confidence level): z = 1.96

  • Population proportion: p = 0.5

  • Approximate population size (eligible teachers): N = 3,078

  • Margin of error: ε = 0.05

Given the equations and values above, the finite population correction yields n' = 342.
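Substituting the stated values reproduces the reported minimum (rounded up to the next whole respondent):

$$ n = \frac{1.96^2 \times 0.5(1-0.5)}{0.05^2} = \frac{0.9604}{0.0025} \approx 384.2 $$

$$ n' = \frac{384.2}{1 + \dfrac{384.2}{3{,}078}} \approx \frac{384.2}{1.1248} \approx 342 $$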


2.4 Any use of periodic (less frequent than annual) data collection cycles to reduce burden 


This is a one-time effort, and there is no intent to collect additional information from respondents after they submit the completed survey.

3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

To maximize response rates, strategies derived from Dillman’s Tailored Design Method (Dillman et al., 2014) will be used, including an introductory letter, an initial email, and two email reminders. The survey will be administered online via the Qualtrics platform. All 3,078 eligible teachers will receive the introductory letter (via email) from each NPS site’s key education program contact to establish trust and maximize response. A few days later, the research team will send an initial email that introduces the survey and asks recipients whether they want to participate, with ‘Yes’ and ‘No’ response buttons that route them to the full survey or the non-response survey, respectively. Two weeks after the initial email, a reminder email will be sent to those teachers who have not responded to either the full survey or the non-response bias survey. Two weeks after the first reminder, a second and final reminder email will be sent to those who still have not responded.
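The contact sequence can be summarized as a simple schedule. The sketch below is illustrative only; the day offsets are assumptions inferred from the description above ("a few days later", two-week intervals), not specified by the study protocol:

```python
# Assumed contact schedule derived from the description above (days approximate).
CONTACT_SCHEDULE = [
    (0,  "Introductory letter emailed by each NPS site's education contact"),
    (3,  "Initial email from the research team with Yes/No participation links"),
    (17, "First reminder to teachers with no response to either survey"),
    (31, "Second and final reminder to remaining non-responders"),
]

for day, action in CONTACT_SCHEDULE:
    print(f"Day {day:>2}: {action}")
```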


Non-Response Bias Survey and Analysis
To collect non-response bias information, all potential respondents will be asked a question embedded directly in the email body: “Are you willing to take part in this 10-minute survey?” The options ‘Yes’ and ‘No’ will be presented as click-links directly below the question. If the respondent clicks ‘Yes,’ they will be directed to the full survey. If they click ‘No,’ they will be directed to the non-response survey (below). To encourage response, two follow-up emails will be sent to those who have not yet responded. Soft refusals are those who click ‘No’ and complete the three non-response questions; hard refusals are those who either delete the email without clicking anything or click ‘No’ but fail to answer the non-response questions. Selecting ‘No’ logs a teacher’s decision not to participate in the full survey, and those teachers will be removed from follow-up reminders.
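These outcome categories amount to a simple classification rule. The Python sketch below is illustrative only; the field names (`clicked`, `completed_full`, `answered_nr`) are hypothetical and not part of the study instruments:

```python
# Illustrative coding of final contact outcomes (hypothetical field names).
from typing import Optional

def classify_outcome(clicked: Optional[str], completed_full: bool, answered_nr: bool) -> str:
    """Map a teacher's actions, after all reminders, to the categories above."""
    if clicked == "yes" and completed_full:
        return "completed survey"
    if clicked == "no" and answered_nr:
        return "soft refusal"   # retained for the non-response bias analysis
    return "hard refusal"       # deleted the email, or clicked 'No' and skipped the questions

print(classify_outcome("no", False, True))  # -> soft refusal
```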

The following introductory text and three questions comprise the non-response survey:

This study seeks to understand educators’ attitudes toward National Park Service (NPS) Education Programs. In response to the initial email invitation, you indicated that you did not wish to complete the full survey. Would you be willing to answer three questions instead? It should take you no more than one minute.

1. Do you teach or have you taught 3rd, 4th, or 5th grade in the past five years? (check all that apply)

    • 3rd grade

    • 4th grade

    • 5th grade

    • Have not taught 3rd to 5th grade in the past five years


2. At what type of school do you currently teach? (Choose one)

    • Public

    • Private

    • Home School

3. Is your school considered Title I?

    • Yes

    • No


The information collected in the non-response survey will be used to perform the non-response bias analysis. Specifically, respondents and non-respondents will be compared on their answers to the full-survey questions that correspond to the non-response bias questions above. Any implications and results of the non-response bias analysis will be discussed in the final report for this survey.
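One conventional way to make such a comparison is a chi-square test of independence on each shared item. The sketch below is illustrative only; the toy data and column names are hypothetical, and the study’s actual analysis plan may differ:

```python
# Illustrative non-response bias check (hypothetical data and column names).
import pandas as pd
from scipy.stats import chi2_contingency

# 'group' distinguishes full-survey respondents from soft refusals;
# 'school_type' mirrors non-response question 2 (public/private/home school).
df = pd.DataFrame({
    "group": ["respondent"] * 4 + ["non-respondent"] * 4,
    "school_type": ["Public", "Public", "Private", "Home School",
                    "Public", "Private", "Public", "Public"],
})

# Cross-tabulate group membership against school type and test independence.
table = pd.crosstab(df["group"], df["school_type"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")  # a small p suggests a response bias
```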

4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The majority of the questions included in these survey instruments were taken from the NPS Pool of Known Questions (OMB Control #: 1024-0224; expiration date: 8/31/2026; Programmatic Clearance Process for NPS-Sponsored Public Surveys). Further, two subject matter experts in education and interpretation research were part of the development of the methodology and were asked to provide feedback on the overall study methodology, survey design, validity of question content, instructional text, and possible analytical needs. They provided suggestions on question order, logic and skip patterns, question phrasing, and methodological considerations.

Pretesting of the survey was conducted using a convenience sample of 9 University of Montana students. These individuals completed the survey online using a tablet and the Qualtrics software and provided general feedback on survey clarity and design, as well as the specific feedback outlined in Supporting Statement Part A #8. Based on this pretest, the response burden was estimated at 10 minutes.



5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Statistical Consultants: Marc Stern – Virginia Tech; Bob Powell – Clemson University


Collection and analysis agency:

Jeremy Sage – RRC Associates

Jennifer Thomsen – University of Montana

Elena Bigart – University of Montana

National Park Service Social Science Program



Literature Cited

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.). Wiley.

Menon, V., & Muraleedharan, A. (2020). Internet-based surveys: Relevance, methodological considerations and troubleshooting strategies. General Psychiatry, 33(5).

National Park Service IRMA. (2023). Retrieved from https://irma.nps.gov/Stats/

¹ Sample Size Calculator, Good Calculators (see: https://goodcalculators.com/sample-size-calculator/)
