
Supporting Statement – Part A


REQUEST FOR GENERIC CLEARANCE OF SURVEY IMPROVEMENT PROJECTS

from the

NATIONAL AGRICULTURAL STATISTICS SERVICE (NASS)


OMB No. 0535 – 0248


The National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) requests renewal from the Office of Management and Budget (OMB) of the generic clearance that allows NASS to rigorously develop, test, and evaluate its survey instruments and methodologies. The primary objectives of the National Agricultural Statistics Service are to prepare and issue State and National estimates of crop production, livestock production, economic statistics, and environmental statistics related to agriculture, and to conduct the Census of Agriculture. This request is part of an ongoing initiative to improve NASS surveys as recommended by both its own guidelines and those of OMB.1


State-of-the-art techniques have been increasingly adopted by NASS and other Federal agencies, and are now routinely used to improve the quality and timeliness of survey data and analyses while simultaneously reducing respondents' cognitive workload and burden. The purpose of this generic clearance is to allow NASS to continue to evaluate, adopt, and use these state-of-the-art techniques to improve its current data collections on agriculture. This clearance will also be used to aid in the development of new surveys. Additionally, NASS anticipates that improved survey design will increase response rates and reduce non-response bias.


Prior to each survey improvement project, NASS will provide OMB with a copy of the questionnaire (if one is used), and all other materials describing the project. NASS will also describe the research question or goal of each survey improvement project, the method proposed to answer the question or address the proposed goal, the recruitment of participants, collection of information, and proposed analysis.


NASS envisions using a number of survey improvement techniques, as appropriate to the individual project under investigation. These include: focus groups; cognitive interviews, usability studies, and other field techniques; exploratory interviews; behavior coding; respondent debriefing; and split-panel tests.


  1. Focus Group Methodology is a qualitative method that brings together a small number of subjects to discuss pre-identified topics. A protocol containing questions or topics focused on a particular issue or issues is used to guide these sessions. The session is administered by a trained moderator. Focus groups are useful for exploring and bringing to the surface potential issues that may concern either respondents or stakeholders. Focus groups are a good choice during the development of a survey or survey topic, when a pre-existing questionnaire or survey questions on the topic do not yet exist. Focus groups may also be used to explore respondents’ general opinions about data collection technologies or survey materials other than questionnaires.


  2. Cognitive Interviews, Usability Studies, and other Field Techniques are qualitative methods that employ a set of tools to study and identify errors introduced during the survey process. These techniques are generally conducted one-on-one with respondents. Cognitive interviews are generally used to clarify the question-response process, whereas usability studies are generally used to understand the physical features of a survey and its supporting materials, for instance, display and navigational features. Interviews may be conducted with respondents providing concurrent verbal protocols as they think aloud while answering survey questions, or with retrospective protocols in which information is provided afterwards. Other techniques described in the survey research and survey methodology literature will be employed as appropriate. These include follow-up probing, memory cue tasks, paraphrasing, confidence rating, response latency measurements, free and dimensional sort classification tasks, and vignette classifications. The objective of all of these techniques is to aid in the development of surveys that work with respondents' thought processes, thus reducing response error and burden. These techniques have also proven useful for studying and revising pre-existing questionnaires.


  3. Exploratory Interviews may be conducted with groups of individuals to understand a topical area. For the most part, this will be used in the very early stages of developing a new survey. It may cover discussions related to administrative records, subject matter, definitions, etc. Exploratory interviews may also be used in exploring whether there are sufficient issues related to an existing data collection instrument to consider a redesign.


  4. Behavior Coding is a quantitative technique in which a standard set of codes is systematically applied to respondent/interviewer interactions in interviewer-administered surveys or respondent/questionnaire interactions in self-administered surveys. The advantage of this technique is that it can identify and quantify problems with the wording or ordering of questions; the disadvantage is that it does not necessarily illuminate the underlying causes of those problems. (An illustrative sketch of such a tally follows this list.)


  5. Respondent Debriefing is a quantitative technique in which the actual survey under investigation is augmented by a second set of questions. The purposes of these questions are to determine whether the original survey questions are understood as intended, to learn about respondents' form-filling behavior and record-keeping systems, and to elicit respondents' satisfaction with the survey. This information can then be used (especially if it is triangulated with other information) to aid in improving the survey.


  6. Split Panel Tests refer to controlled experimental testing of alternative hypotheses. They allow one to choose from among competing questions, questionnaires, definitions, error messages, or survey improvement methodologies with greater confidence than any of the other methods. Split panel tests conducted during the fielding of a survey are superior in that they can support both internal validity (controlled comparisons of the variable(s) under investigation) and external validity (representation of the population under study). Most of the previously mentioned survey improvement methods can be strengthened when paired with this method. (An illustrative sketch of such a comparison also follows this list.)
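
As noted in item 4, behavior coding works by systematically tallying the codes assigned to question-and-answer exchanges. The short sketch below (in Python) illustrates one way such a tally could be computed; the code scheme, counts, and flagging threshold are illustrative assumptions, not an actual NASS coding scheme.

    # Illustrative sketch only: a hypothetical behavior-code tally.
    # "AA" = adequate answer, "RC" = respondent requests clarification,
    # "IN" = interviewer misreads the question (codes are assumptions).
    from collections import Counter, defaultdict

    coded_interactions = [
        ("Q1", "AA"), ("Q1", "AA"), ("Q1", "AA"), ("Q1", "RC"),
        ("Q2", "RC"), ("Q2", "IN"), ("Q2", "RC"), ("Q2", "AA"),
    ]

    PROBLEM_CODES = {"RC", "IN"}   # codes that signal a potential wording problem
    FLAG_THRESHOLD = 0.30          # flag questions whose problem-code rate exceeds 30%

    tallies = defaultdict(Counter)
    for question, code in coded_interactions:
        tallies[question][code] += 1

    for question, counts in sorted(tallies.items()):
        n = sum(counts.values())
        problem_rate = sum(counts[c] for c in PROBLEM_CODES) / n
        status = "flag for review" if problem_rate > FLAG_THRESHOLD else "ok"
        print(f"{question}: {n} exchanges, {problem_rate:.0%} problem codes -> {status}")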

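As noted in item 6, a split panel test ultimately rests on a controlled statistical comparison between competing versions. The minimal sketch below assumes a simple two-proportion z-test on response rates to compare two questionnaire versions; the counts are hypothetical and are not NASS results.

    # Minimal sketch of comparing two split-panel versions on response rate.
    # The counts below are made up for illustration only.
    from math import sqrt

    def two_proportion_z(success_a, n_a, success_b, n_b):
        """Two-proportion z statistic for panel A versus panel B."""
        p_a, p_b = success_a / n_a, success_b / n_b
        p_pool = (success_a + success_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        return (p_a - p_b) / se

    # Hypothetical example: questionnaire version A versus version B.
    z = two_proportion_z(success_a=420, n_a=600, success_b=380, n_b=600)
    print(f"z = {z:.2f}")  # |z| > 1.96 would suggest a difference at the 5% level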

SECTION A. JUSTIFICATION

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The primary function of the National Agricultural Statistics Service is to prepare and issue State and National estimates which include crop and livestock production, economic and environmental inputs, whole farm characteristics and operator demographics (covered by the Census of Agriculture) under the general authority of Title 7 U.S.C. Sec. 2204.


NASS is requesting the renewal of a generic clearance in order to respond quickly to emerging issues and data collection needs. The agricultural economy continues to change, and NASS needs to continuously evaluate its surveys in light of these changes. Respondents continue to change (e.g., response rates decrease over time), technology continues to change (e.g., the Web quickly became a data collection option), and data needs continue to change. In addition, our understanding of how to improve surveys continues to evolve (e.g., the application of cognitive psychology to survey methodology has increased our understanding of surveys). The generic clearance structure allows NASS to meet these information needs by a means that minimizes respondent and administrative burden. Thus, NASS requests that an ongoing OMB clearance structure remain in place so that it can continue to improve the overall quality of its statistical surveys, lessen the burden placed on respondents, and shorten the time between changes that affect surveys and NASS' ability to formulate and update its surveys to address those changes.


Prior to each test, NASS will submit to OMB a mini-supporting statement that will describe the details of each specific test, along with a sample of the questions or questionnaire that will be tested.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The information obtained from these efforts will be used to develop new NASS surveys and improve current ones. Specifically, the information will be used to reduce respondent burden while simultaneously improving the quality of the data collected in these surveys. These objectives are met when respondents are presented with plain, coherent and unambiguous questionnaires that ask for data compatible with respondents’ memory and/or current reporting and record keeping practices. The purpose of the survey improvement projects will be to ensure that NASS surveys continuously attempt to meet these standards of excellence.


Improved NASS surveys will inform policy decisions on agriculture, as well as contribute to increased agency efficiency and reduced survey costs. In addition, methodological findings have broader implications for survey research and may be presented in technical papers at conferences or published in conference proceedings or journals.


The results of these tests will not be disseminated or used to inform policy, program, or budget decisions.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g. permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


NASS will employ information technology, as appropriate, to reduce the burden on respondents who agree to participate in its survey improvement projects. NASS may also explore the use of state-of-the-art technology (e.g., satellite TV or the Web) to conduct focus groups, as well as other appropriate, as-yet-unidentified uses of technology to reduce burden on respondents. Web surveys have the potential to facilitate respondents' data entry by performing automated tabulations and by providing feedback regarding errors in the reported data, as sketched below. These features potentially reduce the need for follow-up contact with respondents. However, the success of these features depends on their design. Thus, one of the major motivations behind NASS' request for generic clearance approval is to ensure that respondents to its Web surveys are presented with the most understandable and least burdensome instruments possible. In addition, NASS intends to explore the adoption of even more innovative methods to reduce respondent burden.
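
The following minimal sketch illustrates the kind of automated tabulation and error feedback a web instrument could perform; the field names, tolerance, and figures are illustrative assumptions, not features of any actual NASS instrument.

    # Illustrative sketch only: a hypothetical edit check a web survey might run.
    def check_acreage_total(item_acres, reported_total, tolerance=0.5):
        """Return edit messages that a web instrument might display to the respondent."""
        messages = []
        computed_total = sum(item_acres.values())   # automated tabulation of itemized entries
        if abs(computed_total - reported_total) > tolerance:
            messages.append(
                f"Reported total acres ({reported_total}) does not match the sum of the "
                f"individual items ({computed_total}). Please review your entries."
            )
        return messages

    # Hypothetical usage: the respondent reports 250 total acres, but the items sum to 240.
    print(check_acreage_total({"corn": 120, "soybeans": 80, "pasture": 40}, 250))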


4. Describe efforts to identify duplication.


Generally, survey improvement projects will be undertaken to answer questions that have not yet been addressed or answered in the literature (this includes circumstances in which NASS needs to ensure that questions or technologies that performed well for another survey work equally well in a NASS survey). Also, although business surveys are discussed in the survey literature, the target population of farmers is not usually addressed in the business survey literature.


5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.

One of the major goals behind NASS' push to improve its surveys and survey processes is to reduce the burden that farmers experience when they respond to NASS surveys. Thus, most survey improvement efforts may require a minimal amount of additional up-front burden, with the intent of producing much larger reductions in burden in future NASS surveys.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


There are consequences if NASS is not able to conduct the survey improvement projects requested in this document. The quality of the data collected in current surveys may decrease, and questionnaires and questions that become irrelevant due to changes in the farming industry will need to be updated. Over time, questionnaires that are well designed and well understood now may become obsolete and need revision. Without adequate testing, the data collected may be of poor quality, resulting in additional resources required to process the data or in negative impacts on survey estimates. New items would be implemented without adequate testing and refinement. Inroads into our understanding of how agricultural producers answer surveys and how NASS can better serve them would slow, and NASS' ability to develop timely, well-designed new surveys would be diminished.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner requiring respondents to report information to the agency more often than quarterly.


There are no special circumstances associated with this information collection.


8. Provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.


Comments on this data collection effort were solicited in the Federal Register, Vol. 83, No. 221, pages 57400-57401, published on November 15, 2018. NASS did not receive any public comments for this renewal request.


It is critical to the development or improvement process for any survey that NASS ensures that the proposed survey questions can be answered by the target population, and that the questions are asked such that they provide for the most uniform comprehension possible. Respondent involvement in questionnaire development serves to ensure that respondents understand and can answer the survey questions, thus reducing overall respondent burden and improving data quality.


9. Explain any decision to provide any payment or gift to respondents.


Generally, NASS will only compensate respondents for travel costs related to focus groups or other similar field testing activities. Focus group participants will be provided between $50 and $75 to cover travel costs and incidentals. Any higher amounts will be justified on a case-by-case basis. Periodically, non-monetary tokens of appreciation, such as baseball caps, pens, and other items, will be given to participants.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Questionnaires include a statement that individual reports are kept confidential. U.S. Code Title 18, Section 1905 and U.S. Code Title 7, Section 2276 provide for the confidentiality of reported information. All employees of NASS and all enumerators hired and supervised under a cooperative agreement with the National Association of State Departments of Agriculture (NASDA) must read these regulations and sign a statement of compliance.


Additionally, NASS and NASS contractors comply with OMB's Implementation Guidance for Title V of the E-Government Act, the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA) (Public Law 107-347). CIPSEA supports NASS' pledge of confidentiality to all respondents and facilitates the agency's efforts to reduce burden by supporting the statistical activities of collaborating agencies through the designation of NASS agents, subject to the limitations and penalties described in CIPSEA.


The following confidentiality pledge statement will appear on all NASS questionnaires.


The information you provide will be used for statistical purposes only. Your responses will be kept confidential and any person who willfully discloses ANY identifiable information about you or your operation is subject to a jail term, a fine, or both. This survey is conducted in accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and other applicable Federal laws. For more information on how we protect your information please visit: https://www.nass.usda.gov/confidentiality.


11. Provide additional justification for any questions of a sensitive nature.


No questions of a sensitive nature are anticipated in work conducted under this generic clearance.


12. Provide estimates of the hour burden of the collection of information. The statement should indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated.


NASS estimates that total burden will not exceed 30,000 hours for all testing of survey instruments and methodologies during the next three years. Burden will be used to test both sampled surveys and some of the Census of Agriculture follow-on surveys.

Normally, respondents are not contacted more than once in a calendar year for testing with any of the survey improvement techniques; therefore, the number of respondents equals the maximum number of contacts covered by this clearance request. Sampling in any of the projects related to this docket will be used to make methodological comparisons, not to generate substantive survey estimates.


Although many of these collections occur in alternating years, the total burden would not exceed the estimates provided in Table 1.


Table 1. Projected Total Response Burden for the next three years.


                                                 Number of       Average Hours      Total Burden
                                                 Respondents     per Respondent     Hours

Sample Surveys and Census Follow-on Surveys          25,000                0.6           15,000


Examples of sample surveys that could be tested under this clearance include the June Agricultural Survey, preliminary testing for the 2023 Census of Agriculture Survey, commodity surveys, stocks or prices surveys, the Agricultural Resource Management and Chemical Use Surveys (ARMS), the Cash Rent Survey, and some other NASS reimbursable surveys.


The total combined reporting cost to the public for all surveys is projected to be $549,900, calculated by multiplying 15,000 burden hours by $36.66 per hour.


NASS uses the Bureau of Labor Statistics' Occupational Employment Statistics (most recently published on March 30, 2018, for the previous May) to estimate an hourly wage for the burden cost. The May 2017 mean wage for bookkeepers was $19.76, the mean wage for farm managers was $38.62, and the mean wage for farm supervisors was $24.11; the mean of the three is approximately $27.50. To calculate the fully loaded wage rate (which includes allowances for Social Security, insurance, etc.), NASS adds an additional 33 percent, for a total of $36.66 per hour.
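
For reference, the arithmetic above can be reproduced directly. The short sketch below simply restates that calculation, treating the roughly one-third benefit load as the factor that yields $36.66, together with the hours from Table 1.

    # Reproduces the burden-cost arithmetic described in this section.
    bookkeeper, farm_manager, farm_supervisor = 19.76, 38.62, 24.11   # May 2017 mean hourly wages
    mean_wage = (bookkeeper + farm_manager + farm_supervisor) / 3     # approximately $27.50
    loaded_wage = round(mean_wage * 4 / 3, 2)                         # roughly one-third load -> $36.66
    total_hours = 25_000 * 0.6                                        # Table 1: respondents x hours each = 15,000
    print(f"${total_hours * loaded_wage:,.0f}")                       # approximately $549,900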


13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information.


There are no capital/startup or ongoing operation/maintenance costs associated with this information collection.


14. Provide estimates of annualized cost to the Federal government; provide a description of the method used to estimate cost which should include quantification of hours, operational expenses (equipment, overhead, printing, and staff), and any other expense that would not have been incurred without this collection of information.


The annual cost to the Federal government generated by the survey improvement projects is estimated to be approximately $1,000,000. The main components of these costs are staff time and travel expenses. There are no start-up, equipment, operation or maintenance costs.


15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I (reasons for changes in burden).


The respondent burden and number of responses have been kept the same as in the previous approval cycle.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Data will be collected to develop new surveys or improve the methodology of current surveys. Methodological findings may be published in the technical notes sections of the reports of survey data, in separate reports, in technical papers presented at conferences, in the proceedings of conferences, or in journals.


No substantive estimates will be published from these studies as findings will only be used for methodological comparisons.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


No approval is requested for non-display of the expiration date.


18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions” of OMB Form 83-I.


No exceptions to the Certification Statement should be required. If an exception becomes necessary, OMB approval will be requested in advance of conducting the survey.




February 2019

Revised April 2019

1 NASS Information Quality Guidelines are available at http://www.usda.gov/nass/nassinfo/infoguide.htm. OMB Information Quality Guidelines are available at http://www.whitehouse.gov/omb/inforeg/infopoltech.html.
