
Supporting Statement B

for Paperwork Reduction Act Submission


Programmatic Clearance for

U.S. Fish and Wildlife Service Social Science Research

OMB Control Number 1018-New



Collections of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," the following documentation should be included in Supporting Statement B to the extent that it applies to the methods proposed:


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The potential respondent universe includes refuge visitors, potential visitors, on- and off-site program participants, including ‘virtual visitors’ who access content from a Fish and Wildlife Service (Service) website, local community members, educators, Government officials, landowners, partners and other stakeholders, volunteers, and tribal interests. All ICRs submitted will include: a description of a survey’s particular respondent universe; whether sampling methods will be used and how sampling strategies will be employed; descriptions of the sampling units; justifications for any stratified, quota, or convenience sampling methods; methods for soliciting participants; and expected response rates (referencing prior collections, if applicable).


We estimate that there will be approximately 20,333 onsite/mail/internet survey respondents, 784 non-response survey respondents, and 892 respondents participating by other means (e.g., telephone, focus groups, interviews). Based on results of previous information collections for the National Wildlife Refuge Visitor Survey (Visitor Survey), it is anticipated that response rates will be at or above levels needed to obtain statistically viable results. Guidance may be provided to Principal Investigators (PIs) on methods for maximizing response rates.
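The "statistically viable" threshold above is governed by the standard sample-size calculation for estimating a proportion at a given confidence level and margin of error. The sketch below uses Cochran's formula with a finite-population correction; the population size (10,000), margin of error, and 30% response rate are illustrative assumptions, not figures from this clearance:

```python
import math

def required_sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Cochran's sample-size formula for a proportion, with a
    finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# Hypothetical example: completed responses needed for a +/-5% margin
# of error at 95% confidence from a visitor population of 10,000.
n_needed = required_sample_size(10_000)

# The number of initial contacts must then be inflated by the expected
# response rate (30% here is an illustrative assumption):
contacts = math.ceil(n_needed / 0.30)
```

This is why maximizing response rates matters for burden: a higher expected response rate directly reduces the number of contacts, and therefore the total burden, needed to reach the same completed-sample target.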


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The Service’s Human Dimensions (HD) Branch expects a variety of methodologies to be submitted under this clearance, including, but not limited to, intercept surveys with on-site (paper or electronic) and in-home (mail and/or web) completion options, focus groups, individual interviews, mail and web surveys, and telephone surveys. All collections under this clearance must fully describe the survey methodology. The description must be specific and address each of the following:


  • respondent universe,

  • sampling methods and procedures (including discussion of sample size, sample unit, stratification or quota procedures, respondent selection, potential data problems, and cyclic collection plans),

  • how the instrument will be administered,

  • expected response rate and confidence intervals,

  • strategies for dealing with potential non-response bias,

  • any pre-testing and peer review data that have been collected,

  • an estimate of the respondent burden,

  • planned analysis, including identification of key variables, proposed statistical tests and statistical software, and

  • an information dissemination plan, with tentative timeline and outlets for release of data and/or results, including feedback for program managers.


All submissions under this program will be carefully evaluated to ensure consistency with the intent, requirements, and boundaries of this programmatic clearance. Program evaluations that involve pre- and post-test collections (non-probability samples) will be subject to the same scrutiny.
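As one concrete illustration of the stratification and sample-selection procedures an ICR might describe, the sketch below allocates a total sample across strata in proportion to stratum size (proportional allocation). The stratum names and counts are hypothetical, not figures from any approved collection:

```python
def proportional_allocation(strata_sizes, total_sample):
    """Allocate a total sample across strata in proportion to stratum
    size, with largest-remainder rounding so the allocations sum
    exactly to the total."""
    universe = sum(strata_sizes.values())
    raw = {s: total_sample * n / universe for s, n in strata_sizes.items()}
    alloc = {s: int(v) for s, v in raw.items()}           # round down first
    shortfall = total_sample - sum(alloc.values())
    # Give the leftover units to the strata with the largest remainders:
    for s in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:shortfall]:
        alloc[s] += 1
    return alloc

# Hypothetical visitor strata at a single refuge:
strata = {"anglers": 4_000, "birders": 2_500, "hikers": 1_500}
alloc = proportional_allocation(strata, total_sample=400)
```

An ICR using disproportionate (e.g., optimal) allocation instead would need to justify the departure and describe the corresponding weighting in its analysis plan.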


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Surveys will incorporate best practices to maximize initial and overall response rates. ICRs will detail strategies to maximize response rates, including, but not limited to: using the Tailored Design Method (Dillman, Smyth & Christian, 2014) for multiple contacts with mail and internet surveys, call-back protocols for telephone surveys, multiple response options (e.g., mail or web), interviewer training including respondent conversion techniques, minimizing completion burden, and the use of token incentives.


For surveys designed to infer from a sample to a population, specific strategies for reducing, detecting, and analyzing non-response bias will be required. These may include using survey logs to record observable characteristics of all initial on-site contacts, or asking a small subset of key substantive and demographic questions in a mail-back survey to serve as a non-response bias check. PIs will clearly state how they will statistically test for and address potential non-response bias.
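One common form of such a statistical test can be sketched as follows: a chi-square test comparing the distribution of an observable characteristic, recorded in the survey log for all initial contacts, between respondents and non-respondents. The counts and the local/non-local characteristic below are hypothetical, and the implementation uses only the standard formula for a 2x2 table:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]
    (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical survey-log counts of an observable characteristic:
#                   local  non-local
respondents     = (120,    80)
non_respondents = (30,     45)

stat = chi_square_2x2(*respondents, *non_respondents)

# 3.841 is the 0.05 critical value of chi-square with 1 degree of
# freedom; exceeding it suggests respondents differ from
# non-respondents on this characteristic, warranting weighting or
# further bias analysis.
differs = stat > 3.841
```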


The results of non-response bias analyses will be included in any reports or documents discussing the results of the collection. The likely effects of any bias on the interpretation of the data, and their implications, must be clearly described. If necessary, descriptions of standard practices for weighting data should be included.
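As a minimal sketch of one such standard weighting practice, post-stratification reweights each sample stratum so that its weighted share of the sample matches a known population share. The strata and figures below are hypothetical:

```python
def poststratification_weights(population_shares, sample_counts):
    """Weight each stratum so its weighted share of the sample matches
    the known population share (a standard non-response correction)."""
    n = sum(sample_counts.values())
    return {s: population_shares[s] / (sample_counts[s] / n)
            for s in sample_counts}

# Hypothetical: group A is 50% of the visitor population but only 40%
# of completed surveys, so its responses are weighted up (1.25) and
# group B's are weighted down (~0.83).
weights = poststratification_weights(
    {"group_a": 0.50, "group_b": 0.50},
    {"group_a": 40, "group_b": 60},
)
```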


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Peer review and pre-testing of methods, procedures, and data-collection instruments are required as a means to reduce respondent burden and maximize the usefulness and validity of surveys. As surveys, methods, and questions acquire a record of successful application in the field, pre-testing may be required only for the new methods or questions in an otherwise standard collection. For ICRs with new methods or untried questions, the PI will be required to provide documentation of a question’s previous use (from peer-reviewed literature) and/or the results of pre-testing on 9 or fewer respondents. Alternatively, a request for a more extensive pretest of a survey may be submitted as part of the ICR. A peer review might include comments on the overall structure, sequence, and clarity of questions, and might suggest an estimated completion time for calculating burden estimates. The discussion should include notes on respondent comprehension, identification of sources of measurement error, changes made to questions or methods based on reviewer/tester feedback, and whether hour-burden adjustments were needed.


5. Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The names and contact information of the responsible PIs who will collect and analyze the data will be included on all submission forms received under this programmatic clearance.


An HD Branch social scientist (information collection request coordinator) will review requests to ensure the survey instrument, methodology and analysis plan are designed in such a way as to provide scientifically and statistically valid data. If additional peer review is needed or if the ICR originates from within the HD Branch, the ICR Coordinator will assign and coordinate that review. The ICRs will include the names and contact information of persons consulted in the specific information collection requests submitted under this programmatic clearance.


Table 1: Individuals consulted on statistical aspects of the design of the Programmatic Approval Process


Name         | Affiliation                                                                                       | Contact Information
David Fulton | Adjunct Professor, Dept. of Fisheries, Wildlife and Conservation Biology, University of Minnesota | [email protected], (612) 625-5256
Peter Fix    | Natural Resources Management Department Chair, University of Alaska Fairbanks                     | [email protected], (907) 474-6926



