PI-CASC - Supporting Statement B - 10.04.24


Supporting Statement B

for Paperwork Reduction Act Submission


The Impact and Potential of “Co-Production” in Addressing Climate Adaptation across the Pacific Islands

OMB Control Number 1028-New


Collections of Information Employing Statistical Methods


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Project A:


For surveys, we will use a census sampling method. All project principal investigators, co-investigators, and cooperators/partners with valid email addresses will constitute the entire universe for this sample. All of these participants will be contacted and given the same opportunity to participate. Surveys will be distributed electronically via an online survey website in a fillable format to reduce the burden on participants, and respondents will receive survey invitations through email. Participants may also request paper surveys if preferred. Approximately half of the invited survey participants are expected to respond, with most responses anticipated to be completed electronically.


For interviews, we will intentionally sample a subset of projects of interest for their collaboration and co-production approaches. A “snowball” sampling method may also identify additional participants to contact. Interviews will be conducted in person, or over video or voice calling platforms where necessary to reduce burden. For interviews, all project principal investigators, co-investigators, and cooperators/partners from these case-study projects will constitute the entire universe for that section of the study. All of these participants with valid email addresses will be contacted and given the same opportunity to participate.


The universe for this collection will include a combined list of 330 individuals. The survey respondents will represent the broader PI-CASC funded projects’ principal investigators, co-investigators, and collaborators; graduate scholars and postdocs; and community members. Interview participants are expected to overlap with the pool of participants asked to complete the research surveys. We acknowledge that the participants contacted for interviews will represent a curated subset of projects, and thus a narrower universe for this collection, and will therefore be more limited in representing perspectives than the census survey approach. There will be no attempt to generalize the results of this research beyond the scope of this study and this universe of respondents.


Project Evaluation Participants: PI-CASC Project Explorer pages, reports, and word-of-mouth snowball sampling recommendations have been used to create our list of all potential survey respondents from the broader project portfolio (n = 330). A subset (n = 50) of these individuals from key projects of interest will make up the sample of interview respondents.


Respondent universe and expected sample size

Survey Sample

Type                                      Respondent universe   Response rate   Expected responses
(1) Federal employees                                      85             50%                   42
(2) State or local government employees                    25             50%                   13
(3) Academic researchers                                  115             50%                   57
(4) NGO leaders                                            30             50%                   15
(5) International                                          30             50%                   15
(6) Students                                               45             50%                   23
Total                                                     330             50%                  165



Interview Sample

Type                                      Respondent universe   Response rate   Expected responses
(1) Federal employees                                      12            100%                   12
(2) State or local government employees                     4            100%                    4
(3) Academic researchers                                   17            100%                   17
(4) NGO leaders                                             6            100%                    6
(5) International respondents                               5            100%                    5
(6) Students                                                6            100%                    6
Total                                                      50            100%                   50
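The expected-response figures above follow directly from multiplying each stratum’s universe size by the anticipated response rate (fractional halves are rounded so that the stratum counts still sum to the totals). A minimal sketch of that arithmetic, using the Project A survey-table figures:

```python
# Sketch of the expected-response arithmetic for the Project A survey sample:
# expected responses = respondent universe x anticipated response rate.
universe = {
    "Federal employees": 85,
    "State or local government employees": 25,
    "Academic researchers": 115,
    "NGO leaders": 30,
    "International": 30,
    "Students": 45,
}
RESPONSE_RATE = 0.50  # anticipated rate for every survey stratum

total_universe = sum(universe.values())
expected_total = round(total_universe * RESPONSE_RATE)

print(total_universe)  # 330
print(expected_total)  # 165
```

The same calculation, with the applicable rates, yields the expected totals in the Project B tables.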


Project B:


For surveys, we will utilize descriptive research techniques with online survey methods. All members of the Pacific RISCC, HRPRG, Ecosystems, and Predator-Control Hui Listservs will be included along with attendees of Pacific RISCC training, workshops, and forums. In addition, members of the Micronesia RISC group and Invasive Species Council members from Hawaii and the USAPI region will be included. Collectively, these individuals will represent the entire universe for this section of the survey. Surveys will be distributed electronically via an online survey website in a fillable format, and respondents will be sent invitations to the survey through email. Participants may also request paper surveys if preferred. We expect that about 50% of the invitees will respond, with 100% of responses happening electronically. The direct survey responses will not be made public, but synthesized reports with summaries are expected to be released publicly.


For interviews, we will focus on a subset of resource managers from Hawaii or across the USAPI region who have shown interest in Pacific RISCC activities. A snowball sampling method may be used to identify additional interviewees to contact. Interviews will be conducted in person, or over video or voice calling platforms where necessary to reduce burden. A total of 15 interviewees, selected by the Pacific RISCC Leadership team, will be invited for hour-long interviews; this group of 15 will represent the entire universe for this section of the study. Interviews will be conducted with a semi-structured approach to allow for some flexibility. All of the participants with valid email addresses will be contacted and given the same opportunity to participate.


The universe for this survey will include multiple UH listservs (~750 members), the Micronesia RISC, and Island Invasive Species Councils in the U.S. Pacific, for a total of about 800 individuals. These respondents will represent a snapshot of the needs, interests, and concerns of resource managers from across the U.S. Pacific region to help inform the goals and objectives of the Pacific RISCC. Interview participants are likely to overlap with the pool of participants asked to complete the surveys and are not likely to be representative of the entire universe.


This type of collection has not been conducted previously. Because of this, it is not clear what the exact response rate will be. However, a similar survey conducted by Pacific RISCC in 2019 and 2020, which was “distributed online through Hawaii Listservs and partners” (likely in the hundreds of people), had 59 respondents.


Because it is unclear who within the network will respond to the survey, particularly within the Micronesia Regional Invasive Species Council (RISC) group, the table below is based on the approximate make-up of the Pacific RISCC Listserv, which had 522 subscribers as of early September 2024. The numbers are scaled up slightly to account for the additional listservs and groups that will be included in the surveys. In addition, the individuals targeted for interviews will be selected by the Pacific RISCC Leadership team; because we do not expect all of them to be able to participate, we anticipate a 25% response rate even though invitations will be individually tailored.


Respondent universe and expected sample size

Survey Sample

Type                                                    Respondent universe   Response rate   Expected responses
(1) Federal employees                                                   120             10%                   12
(2) State, local, or Territorial government employees                   220             10%                   22
(3) International (Palau, RMI, FSM)                                      60             10%                    6
(4) NGO leaders                                                         100             10%                   10
(5) University (Students/Staff/Faculty)                                 200             10%                   20
(6) Community members                                                   100             10%                   10
Total                                                                   800             10%                   80

Interview Sample

Type                                      Respondent universe   Response rate   Expected responses
(1) Federal employees                                      20             25%                    5
(2) State or local government employees                    20             25%                    5
(3) Territorial government employees                       20             25%                    5
Total                                                      60             25%                   15



2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Project A:

Based on our estimates, we conservatively anticipate a response rate of 50% for the project portfolio survey portion. No unusual problems requiring specialized sampling procedures exist, and the estimation procedure and degree of accuracy needed may not apply. No complex statistical analysis will be done aside from compiling summaries, counts, and averages for comparison. The purpose is to better understand trends in participants’ perceptions of the collaborative process, project outcomes, and impacts. We also aim to interview an estimated 50 respondents for the interview portion, wherein case study projects will be assessed qualitatively.


Project B:


No complex statistical analysis will be done on the results, but we will compute general descriptive and summary statistics (e.g., mean, median, mode, variance, range, and quartiles), calculate and compare percentage responses to various questions, and perform some basic statistical comparisons, potentially including logistic regression. Comparisons may be made between regions (Hawaii vs. the USAPI region), between self-identified groups of managers and researchers, and between sectors (e.g., government agencies vs. non-profits, and/or public vs. private land management groups). Basic methods that may be used include chi-square tests, t-tests, and histograms. Samples will be drawn from, and will include, the results from all respondents who answered the questions.
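As an illustration of the kind of basic comparison described above, the sketch below runs a chi-square test of independence on a hypothetical 2x2 table of responses by region; the counts are made-up placeholders, not real survey data, and in practice a statistics package (e.g., scipy.stats.chi2_contingency) would likely be used instead of this hand-rolled version.

```python
# Chi-square test of independence on a hypothetical 2x2 table of survey
# responses by region. All counts below are made-up placeholders.
counts = [
    [30, 20],  # Hawaii: "yes", "no" answers to one question
    [15, 15],  # USAPI:  "yes", "no" answers to the same question
]

row_totals = [sum(row) for row in counts]
col_totals = [sum(col) for col in zip(*counts)]
grand_total = sum(row_totals)

# Expected count under independence: (row total x column total) / grand total
chi2 = 0.0
for i, row in enumerate(counts):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (observed - expected) ** 2 / expected

print(f"chi-square statistic = {chi2:.3f}")  # prints "chi-square statistic = 0.762"
```

A p-value would then be obtained from the chi-square distribution with one degree of freedom; the same contingency-table approach extends to the sector and group comparisons mentioned above.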


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Project A:


Respondents will be sent invitations to participate in the survey through email. Participants may also request paper surveys if preferred, in order to increase accessibility where appropriate. To maximize the response rate, two reminder emails will be sent after the initial request, one per week, with each reminder going only to those who have not yet responded. Survey questions have also been designed to be simple and brief to make completion easier and improve the response rate. “Not Applicable” and “I don’t know” options will be provided where relevant to reduce non-response error associated with statements that do not apply to an individual. We will not assume a zero value for individuals who do not participate in the survey, and we will track the total number of requested surveys that are completed versus not completed.


Interviews will be requested through email and conducted in person or over video or voice calling platforms. We hope to foster higher response rates through our past good working relationships with project leads and our prior communications expressing interest in their work and continued collaboration. Where possible, project leads will help emphasize this work’s importance and relevance. In-person or phone interviews may help increase the response rate if respondents lack access to technology or the bandwidth needed to complete video calls. Interviews will also be conducted with a semi-structured approach to allow flexibility in exploring research topics specific to each project and individual’s experience. We will track whether any potential interviewees decline our request and, if shared, note their explanations for why they are unable to participate, in order to review the totality of any evidence of non-response bias.


Project B:


To maximize response rates and deal with non-response, we will 1) send the survey to both group emails and individuals in leadership roles at resource management agencies, 2) send reminder emails, 3) send invitations from different Pacific RISCC leadership team members over time to maximize network reach (e.g., an initial email from one member and a follow-up from another), 4) keep the survey open long enough to maximize responses, especially for individuals who are on leave during part of the open survey period, and 5) reach out to individuals in leadership roles via multiple means if needed (email first, followed by text/WhatsApp, phone calls, or other direct message (DM) systems).


To minimize the possibility of non-response bias, we will collect very basic information (e.g., profession, years working in the profession, sector, role, etc.), and we will offer a very short “opt-out” option for people who do not have time to fill out the survey. We will then be able to compare characteristics between respondents to the opt-out (short) version and the full version of the survey to see if and how they differ. In addition, a limited number of phone interviews will be offered in lieu of the online survey so that comparisons can be made to see if the groups differ. If differences are found between the groups (e.g., through chi-square tests, t-tests, or logistic regression), we will seek explanations for the differences in the literature or in the follow-up interviews.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Project A:


We consulted with colleagues who have experience drafting, administering, and analyzing similar surveys and interviews for other CASC programs. Prior to administering the survey and interviews to our potential universe, they will also be administered to a pilot group, which will include 2-3 of each of the following types of participants: project leads and collaborators; resource managers and end users; and students (totaling fewer than 10 respondents). This testing will help us further understand the appropriate time needed to complete interviews and surveys, whether the questions have issues related to clarity, redundancy, or flow, and whether any other challenges arise.




Project B:


Pre-tests are not planned for these surveys, as they would likely add to the participation burden. The past 2021 Pacific RISCC survey, along with surveys from the other RISCC groups (Northwest, Northeast, North-Central, and Southeast), will be used to create the most efficient survey questions, and members of the Pacific RISCC leadership team as well as the Cross-RISCC team will be asked to review the questions to ensure they are adequate and properly worded. We believe these past surveys, which are mostly from other regions but on similar topics, are an adequate foundation for our surveys in lieu of further tests.


5. Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Project A:


Consultation about the design of this project, including its survey and interview questions and approach, included conversations and feedback from Amanda Cravens, Alison Meadow, and Aparna Bamzai-Dodson.


Amanda Cravens

Research Social Scientist

Forest and Rangeland Ecosystem Science Center/Northwest CASC

Phone: (541) 243-2042

Email: [email protected]


Alison Meadow

Associate Research Professor

University of Arizona/Southwest CASC

Phone: (520) 626-0652

Email: [email protected]


Aparna Bamzai-Dodson

Assistant Regional Administrator

North Central CASC

Phone: (970) 889-1231

Email: [email protected]


The surveys and interviews will be administered and analyzed by PI-CASC team members, including Lindsey Ellett, an ORISE Fellow working with PI-CASC.


Lindsey Ellett

Co-Produced Climate Adaptation Science Impacts Fellow

ORISE, with PI-CASC

Email: [email protected]


Project B:


The Pacific RISCC Core Team is being consulted on the content and design of the surveys. This includes Chelsea Arnott, Andrea Blas, Laura Brewington, Jeff Burgett, Glenn Dulla, Bradley Eichelberger, Jacques Idechong, Heather Kerkering, Christy Martin, MJ Mazurek, Roland Quitugua, and Denis D.J. Sene Jr., as well as the Pacific RISCC Science Team: Casidhe Mahuka, Trey Dunn, Helen Sofaer, Daffodil Silver Wase, and Sasha Kyota.


The surveys and interviews will be administered and analyzed by Dr. Elliott Parsons, Pacific RISCC Specialist with the Pacific Islands Climate Adaptation Science Center at the University of Hawaiʻi at Mānoa ([email protected]), 808-494-0832. Dr. Parsons will also design any descriptive statistics or comparisons if needed for the project.
