NAV Part B Final_9-21-2015 Reviewed


Instrument(s) for Navigation Improvement Survey(s)

OMB: 0710-0018


Supporting Statement B


Instrument(s) for Navigation Improvement Survey(s)

OMB Control Number XXXX-XXXX



Collections of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," the following documentation should be included in Supporting Statement B to the extent that it applies to the methods proposed:


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The potential respondent universe varies by survey. Grain shippers include over 4,000 elevators. Carriers on the inland waterways number about 2,000. Surveys for coastal ports will typically be limited to the firms calling at the port, where the top 20 carriers will handle well over 95 percent of the cargo. Commercial fishery surveys are conducted at the vessel level and, for an average harbor study, will number about 100 vessels. All study proposals must include a description of the survey's particular respondent universe.


Response rates vary by survey type. Efforts are made to work with industry organizations to announce an upcoming survey and to encourage participation. Surveys are done to support planning studies, and the number of surveys supporting a given study can vary from none to perhaps four. Based on experience with the existing Programmatic Clearance (0710-0001), we estimate a maximum of nine surveys will be conducted annually. These will include telephone, face-to-face, mail, and internet methods to ensure appropriate response rates. Respondent types will primarily include shippers, carriers, businesses that receive commodities by water, port facilities, and commercial fishermen.


The number of entities in the universe covered by the survey, and in any sample of that universe (including each stratum of samples which are stratified) will be provided in tabular form. Expected response rates for survey data collection will be indicated, based on past results of similar surveys and other factors such as the length and complexity of the survey questionnaire.



2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


All submissions will be carefully evaluated to ensure consistency with the intent, requirements, and boundaries of this programmatic clearance. Proposed collection instruments and procedures must comply with OMB guidance in “Guidance on Agency Survey and Statistical Information Collections” (January 20, 2006). The description of sampling methods and statistical reporting must specifically cover:


  • the sampling plan and sampling procedure (including stratification and selection methods for individual respondents);

  • how the instrument will be administered to respondents;

  • the planned analysis; and

  • desired confidence intervals and estimation procedures.


Districts submitting information collection requests under this programmatic clearance process are strongly encouraged to pretest any information collection instruments to be used. Pretests will normally emphasize intensive debriefing of fewer than 10 respondents to identify questionnaire problems. Further, we will strongly encourage use of the programmatic clearance to obtain approval for any pretesting that falls under the requirements of the Paperwork Reduction Act (i.e., more than nine individuals are being surveyed). Such approval will normally be required to pilot test survey implementation procedures. In these cases, requests for approval to pretest surveys will be subject to the same requirements (i.e., a supporting statement, a copy of the instrument, etc.) as a standard information collection.


The Corps of Engineers Centers of Expertise for inland and coastal navigation will conduct an administrative review of each request and oversee technical reviews of each request to ensure statistical validity and soundness. All information collection instruments will be designed and deployed based upon acceptable statistical practices and sampling methodologies, where appropriate, and will be used to obtain consistent, valid data that are representative of the target populations, account for non-response bias, and achieve response rates and sample sizes at or above levels needed to obtain statistically useful results.


All submissions under the program of expedited approval must fully describe the survey methodology. The description must be specific and address, as appropriate, each of the following: (a) the respondent universe; (b) the sampling plan and all sampling procedures, including how individual respondents will be selected (when appropriate, the sampling plan shall require either a simple random sample or a systematic sample with a random starting point, with the sampling interval for systematic samples determined from the desired sample size); (c) how the instrument will be administered; (d) the desired response rate and confidence level; and (e) strategies for dealing with potential non-response bias. A description of any pretesting and peer review of the methods and/or instrument is highly recommended. Further, all submissions under this clearance process will describe how data will be presented to managers and any others who will use the survey results, particularly in cases where response rates are lower than anticipated. In such cases, program managers must take steps to ensure that the results are not generalized beyond the population of interest, and explanations must accompany data presentations and reports so that users of the data understand any possible biases associated with them.
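The systematic-sample-with-random-start procedure referenced in (b) can be sketched as follows. This is a minimal illustration only, not part of any approved collection procedure; the function and data names are hypothetical:

```python
import random

def systematic_sample(universe, desired_n):
    """Draw a systematic sample with a random starting point.

    As the clearance requires, the sampling interval k is determined
    from the desired sample size: k = floor(N / n).
    """
    n_universe = len(universe)
    k = max(1, n_universe // desired_n)   # sampling interval
    start = random.randrange(k)           # random starting point in [0, k)
    return universe[start::k]             # every k-th unit from the start

# Example: select roughly 20 carriers from a (hypothetical) list of 100.
carriers = [f"carrier_{i:03d}" for i in range(100)]
sample = systematic_sample(carriers, 20)
```

Because the interval divides the universe evenly here, every random start yields exactly 20 units; with uneven division the realized sample size can differ from the target by one.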


In its technical and administrative review, the Corps will work with researchers to ensure that information-collection procedures are appropriate for the intended uses of the data, including selection of the appropriate unit of analysis.


Most surveys are conducted in response to a special study need and are, therefore, one-time requests, not requiring annual reporting.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


For surveys designed to infer from a sample to a population, the Corps requires that proposals address issues of potential non-response. Surveys must incorporate best practices to maximize initial response rates (e.g., multiple follow-ups or call-backs, minimal hour burden). Further, specific strategies for detecting and analyzing non-response bias are to be included in the submission form accompanying survey instruments. These may involve the use of publicly available respondent data and the use of abbreviated questionnaire forms to compare non-response and partial-response results to full survey results. Investigators conducting telephone surveys may use their most experienced interviewers to convert “soft refusals” to completed interviews in order to maximize response rates.
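The abbreviated-questionnaire comparison described above amounts to contrasting a key statistic between full-survey respondents and the follow-up group. A minimal sketch, with hypothetical function names and made-up values:

```python
def nonresponse_bias_gap(full_responses, short_form_responses):
    """Compare the mean of a key item between full-survey respondents
    and initial non-respondents reached with an abbreviated follow-up
    form. A large gap flags potential non-response bias.
    """
    mean_full = sum(full_responses) / len(full_responses)
    mean_short = sum(short_form_responses) / len(short_form_responses)
    return mean_full - mean_short

# Example: annual tonnage reported by full respondents vs. the
# abbreviated follow-up group (illustrative values only).
gap = nonresponse_bias_gap([120, 95, 110, 130], [80, 85, 90])
```

In practice the comparison would be run on several key items, with a formal test of the difference rather than a raw gap.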


The Corps requires that the results of non-response bias analyses be included in technical reports, and that the likely effects of this bias (if any) on the interpretation of data be made clear to managers. In some cases, it may be feasible to balance or post-weight a sample to align important sample statistics with known population parameters (e.g., publicly available user characteristics or zip-code characteristics). However, this does not guarantee the absence of non-response bias in attitude, knowledge, or belief variables.
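Post-weighting of the kind mentioned above can be sketched as post-stratification on one known categorical variable. This is an illustrative sketch only; the function name, strata, and shares are hypothetical:

```python
def post_stratification_weights(sample_strata, population_shares):
    """Compute post-stratification weights so that weighted sample
    shares match known population shares for one categorical variable
    (e.g., respondent type or zip code).
    """
    n = len(sample_strata)
    # Observed share of each stratum in the sample.
    sample_shares = {s: sample_strata.count(s) / n for s in set(sample_strata)}
    # Weight for each respondent = population share / sample share.
    return [population_shares[s] / sample_shares[s] for s in sample_strata]

# Example: shippers are over-represented in the sample (60% vs. a known
# 40% population share), so they are weighted down and carriers up.
strata = ["shipper"] * 6 + ["carrier"] * 4
weights = post_stratification_weights(strata, {"shipper": 0.4, "carrier": 0.6})
```

The weighted stratum shares then match the population shares exactly, though, as the text notes, this corrects only for the variables used in the weighting.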


Response rates are maximized through careful attention to detail in the development not only of the survey questionnaire but also of the entire survey implementation process. For example, for mail questionnaires, Dillman’s “Total Design Method” is typically used, including: 1) “multi-wave” mailings of the original questionnaire, postcard reminders, and follow-up mailings to those still not responding; 2) authoritative, informative, and persuasive cover letters; and 3) carefully worded and formatted questionnaires.


Based on our experience with the existing Programmatic Clearance, we anticipate response rates at or above the levels needed to obtain statistically viable results. Response rates for navigation surveys have typically ranged from 15 to 35 percent for shippers and carriers. Commercial fishing responses are usually around 40 percent. For face-to-face interviews, response rates are boosted by repeat visits to contact respondents on alternate weekdays, on weekends, and during evening as well as daytime hours.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Before surveys are conducted, the questionnaires and the survey process are carefully reviewed and pretested for simplicity and relevance. Pretests are primarily done on groups of fewer than ten respondents. Training for interviewers is usually held prior to the implementation of the survey and typically includes role-playing in an actual field setting under supervised conditions. Once the interview process has begun, field supervisors periodically debrief interviewers to identify any problems encountered, including any unnecessary burdens being placed on respondents.


Most of the survey questions in this package have been used in previous survey efforts, and many have been modified based on review of that previous survey experience. New surveys are subject to the same review, pretesting, interviewer training, and field debriefing procedures described above.


5. Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The commander of each Corps Division is ultimately responsible for approval of the sampling strategy, questionnaire, and analysis plan for surveys conducted in his or her division. Corps District staff will consult with experts from local universities and/or contractors in developing specific survey and analytical plans. The surveys will typically be administered by Corps district staff or by contractors under the supervision of Corps district staff. Past surveys and questions have been designed by the following experts:


Wesley W. Wilson

Department of Economics

University of Oregon

Eugene, Oregon 97403


Kenneth E. Train

University of California
Department of Economics
530 Evans Hall #3880
Berkeley CA 94720-3880


Mark Burton

Research Associate Professor
Department of Economics
The University of Tennessee
507 Stokely Management Center
Knoxville, Tennessee 37996-0550


Ken Casavant

Professor
SFTA Principal Investigator
School of Economic Sciences
Washington State University
PO Box 646210
Pullman, WA 99164-6210
