Supporting Statement – Part A
REQUEST FOR GENERIC CLEARANCE OF SURVEY IMPROVEMENT PROJECTS
from the
NATIONAL AGRICULTURAL STATISTICS SERVICE (NASS)
OMB No. 0535-0248
The National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) requests renewal from the Office of Management and Budget (OMB) of a generic clearance that will allow NASS to rigorously develop, test, and evaluate its survey instruments and methodologies. The primary objectives of the National Agricultural Statistics Service are to prepare and issue State and National estimates of crop production, livestock production, economic statistics, and environmental statistics related to agriculture, and also to conduct the Census of Agriculture. This request is part of an ongoing initiative to improve NASS surveys as recommended by both its own guidelines and those of OMB.1
In the last decade, state-of-the-art techniques have been increasingly instituted by NASS and other Federal agencies, and are now routinely used to improve the quality and timeliness of survey data and analyses, while simultaneously reducing respondents' cognitive workload and burden. The purpose of this generic clearance is to allow NASS to continue to evaluate, adopt, and use these state-of-the-art techniques to improve its current data collections on agriculture. This clearance will also be used to aid in the development of new surveys. Additionally, NASS anticipates that improved survey design will increase response rates and reduce non-response bias.
Prior to each survey improvement project, NASS will provide OMB with a copy of the questionnaire (if one is used), and all other materials describing the project. NASS will also describe the research question or goal of each survey improvement project, the method proposed to answer the question or address the proposed goal, the recruitment of participants, collection of information, and proposed analysis.
NASS envisions using a number of survey improvement techniques, as appropriate to the individual project under investigation. These include focus groups, cognitive and usability laboratory and field techniques, exploratory interviews, behavior coding, respondent debriefing, pilot surveys and split-panel tests.
Focus Group Methodology is a qualitative method that brings together a small number of subjects to discuss pre-identified topics. A protocol containing questions or topics focused on a particular issue or issues is used to guide these sessions. The session is administered by a trained moderator. Focus groups are useful for exploring and bringing to the surface potential issues that may concern either respondents or stakeholders. Focus groups are a good choice during the development of a survey or survey topic, when a pre-existing questionnaire or survey questions on the topic do not yet exist. Focus groups may also be used to explore respondents' general opinions about data collection technologies or survey material other than questionnaires.
Cognitive and Usability Laboratory and Field Techniques are other qualitative methods that refer to a set of tools employed to study and identify errors that are introduced during the survey process. These techniques are generally conducted one-on-one with respondents. Cognitive techniques are generally used to clarify the question-response process, whereas usability is generally used to understand the physical features of a survey, for instance, its display and navigational features. In concurrent interviews, respondents are asked to think aloud as they actually answer the survey. In retrospective interviews, respondents answer the survey as they would normally, then ‘think aloud’ afterwards. Other techniques described in Survey Research and Survey Methodology literature will be employed as appropriate. These include follow-up probing, memory cue tasks, paraphrasing, confidence rating, response latency measurements, free and dimensional sort classification tasks, and vignette classifications. The objective of all of these techniques is to aid in the development of surveys that work with respondents’ thought processes, thus reducing response error and burden. These techniques have also proven useful for studying and revising pre-existing questionnaires.
Exploratory Interviews may be conducted with groups of individuals to understand a topical area. For the most part, this will be used in the very early stages of developing a new survey. It may cover discussions related to administrative records, subject matter, definitions, etc. Exploratory interviews may also be used in exploring whether there are sufficient issues related to an existing data collection instrument to consider a redesign.
Behavior Coding is a quantitative technique in which a standard set of codes is systematically applied to respondent / interviewer interactions in interviewer-administered surveys or respondent / questionnaire interactions in self-administered surveys. The advantage of this technique is that it can identify and quantify problems with the wording or ordering of questions, but the disadvantage is that it does not necessarily illuminate the underlying causes.
Respondent Debriefing is a quantitative technique in which the actual survey under investigation is augmented by a second set of questions. The purposes of these questions are to determine whether the original survey questions are understood as intended, to learn about respondents’ form filling behavior and record keeping systems, and to elicit respondents’ satisfaction with the survey. This information can then be used (especially if it is triangulated with other information) to aid in improving the survey.
Pilot Surveys are generally smaller replications of an actual survey employing statistically representative samples. Pilot surveys duplicate all or most components of the methodological design, sampling procedures and questionnaires of the full-scale survey to the extent possible, on a smaller scale. Piloting a survey provides some measure of confidence that the survey is operationally sound before the actual fielding of the survey. Analysis of response rates, item non-response, and response distributions from pilot surveys can identify problems with the survey, and they may be suggestive regarding the solutions to a problem, but they do not allow one to choose from among alternative solutions to a problem with certainty.
Split Panel Tests refer to controlled experimental testing of alternative hypotheses. Thus, they allow one to choose from among competing questions, questionnaires, definitions, error messages or survey improvement methodologies with greater confidence than any of the other methods. Split panel tests conducted during the actual fielding of the survey are superior in that they support both internal validity (controlled comparisons of the variable(s) under investigation) and external validity (represent the population under study). Nearly any of the previously mentioned survey improvement methods can be strengthened when teamed with this method.
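As an illustration of the random assignment that gives a split panel test its internal validity, the sketch below (hypothetical code, not a NASS procedure; all names and the seed are illustrative) deals a drawn sample into panels of nearly equal size:

```python
import random

def assign_split_panels(sample_ids, n_panels=2, seed=12345):
    """Randomly assign sampled units to questionnaire panels.

    Random assignment is what supports internal validity: differences
    between panels can be attributed to the questionnaire version under
    test rather than to the respondents themselves.
    """
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(sample_ids)
    rng.shuffle(ids)
    # Deal units round-robin so panel sizes differ by at most one.
    return {p: ids[p::n_panels] for p in range(n_panels)}
```

Because the split is made within the fielded sample, each panel also retains the external validity of the underlying probability sample.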
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
The primary function of the National Agricultural Statistics Service is to prepare and issue State and National estimates which include crop and livestock production, economic and environmental inputs, whole farm characteristics and operator demographics (covered by the Census of Agriculture) under the general authority of Title 7 U.S.C. Sec. 2204.
NASS is requesting the renewal of a generic clearance in order to respond quickly to emerging issues and data collection needs. The agricultural economy continues to change, and NASS needs to continuously evaluate its surveys in light of these changes. Respondents continue to change (e.g., response rates decrease over time), technology continues to change (e.g., the Web quickly became a data collection option), and data needs continue to change. In addition, our understanding of how to improve surveys continues to evolve (e.g., the application of cognitive psychology to survey methodology has increased our understanding of surveys). The generic clearance structure allows NASS to meet these information needs using a means that minimizes respondent and administrative burden. Thus, NASS requests an ongoing OMB clearance structure to continue to improve the overall quality of its statistical surveys, to lessen the burden it places on respondents, and to shorten the time period between changes that affect surveys and NASS' ability to formulate and update its surveys to address those changes.
NASS will report to OMB on an annual basis a summary of the projects conducted under this clearance and how the information gained will be used to improve NASS survey activities.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
The information obtained from these efforts will be used to develop new NASS surveys and improve current ones. Specifically, the information will be used to reduce respondent burden while simultaneously improving the quality of the data collected in these surveys. These objectives are met when respondents are presented with plain, coherent and unambiguous questionnaires that ask for data compatible with respondents’ memory and/or current reporting and record keeping practices. The purpose of the survey improvement projects will be to ensure that NASS surveys continuously attempt to meet these standards of excellence.
Improved NASS surveys will inform policy decisions on agriculture, as well as contributing to increased agency efficiency and reduced survey costs. In addition, methodological findings have broader implications for survey research and may be presented in technical papers at conferences or published in the proceedings of conferences or in journals.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g. permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.
NASS will employ information technology, as appropriate, to reduce the burden of respondents who agree to participate in its survey improvement projects. NASS may also explore the use of state-of-the-art technology (e.g., satellite TV or the Web) to conduct focus groups and other appropriate uses of technology, as yet unknown, to reduce burden on respondents. Web surveys have the potential to facilitate respondents’ data entry by performing automated tabulations and by providing feedback regarding errors in the reported data. These features potentially reduce the need for follow-up contact with respondents. However, the success of these features is dependent upon their design. Thus, one of the major motivations behind NASS’ request for generic clearance approval is to ensure that respondents to its Web surveys are presented with the most understandable and least burdensome instruments possible. In addition, NASS intends to explore the adoption of even more innovative methods to reduce respondent burden.
4. Describe efforts to identify duplication.
Generally, survey improvement projects will be undertaken to answer questions that have not yet been addressed or answered in the literature (this would include the circumstances in which NASS needs to ensure that questions or technologies that performed well for another survey work equally well in a NASS survey). Also, the target population of farmers is not usually addressed in non-NASS survey literature.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

One of the major goals behind NASS' push to improve its surveys and survey processes is to reduce the burden that farmers experience when they respond to NASS surveys. Thus, most survey improvement efforts may require a minimal amount of additional up-front burden, with the intent of producing much larger reductions in burden in subsequent NASS surveys.
In the case of pilot surveys or split-panel tests, most survey samples will be designed with probabilities proportional to size to minimize burden on small farms/agribusinesses and to make sure that a large proportion of U.S. agriculture is captured. Thus, a larger farm/agribusiness will have a higher probability of being selected than a small farm/agribusiness.
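As a rough illustration of selection with probability proportional to size, the following sketch (hypothetical code, not NASS's actual sampling system; farm identifiers and sizes are invented) implements classic systematic PPS selection, in which larger operations fall under more selection points and therefore have higher selection probabilities:

```python
import random

def pps_systematic_sample(farms, n):
    """Systematic probability-proportional-to-size (PPS) sample.

    `farms` is a list of (id, size) pairs.  Selection points are spaced
    evenly through the cumulative size total, so a unit's chance of
    selection is proportional to its size measure.  A unit whose size
    exceeds the sampling interval can be selected more than once.
    """
    total = sum(size for _, size in farms)
    interval = total / n
    start = random.uniform(0, interval)
    targets = iter(start + k * interval for k in range(n))

    sample, cum = [], 0.0
    target = next(targets)
    for farm_id, size in farms:
        cum += size  # advance through the cumulative size total
        while target is not None and target <= cum:
            sample.append(farm_id)
            target = next(targets, None)
    return sample
```

Under this scheme a farm contributing 60 percent of the size total covers 60 percent of the cumulative range, which is what keeps the selected sample concentrated on the operations that account for most of U.S. agriculture.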
6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
There are consequences to NASS’ not being able to conduct the survey improvement projects requested in this document. The quality of the data collected in current surveys may decrease, questionnaires and questions that are presently obsolete and poorly designed will remain obsolete and poorly designed, and over time, even those questionnaires that are well-designed and understood now may become obsolete. New items will be implemented without adequate testing and refinement. Inroads into our understanding of how agricultural producers answer surveys and how NASS can better serve them will slow, and NASS’ ability to develop timely new well-designed surveys will be diminished.
Additionally, this docket would replace the need for submitting a separate request for testing for each of the surveys mentioned in item 12.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner requiring respondents to report information to the agency more often than quarterly;
There are no special circumstances associated with this information collection.
8. Provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.
Comments on this data collection effort were solicited in the Federal Register, Vol. 77, Number 244, pages 75,120 – 75,121, published on December 19, 2012. NASS received one public comment from Jean Public regarding this notice.
It is critical to the development or enhancement process for any survey that NASS ensures that the proposed survey questions can be answered by the target population, and that the questions are asked such that they provide for the most uniform comprehension possible. Respondent involvement in questionnaire development serves to ensure that respondents understand and can answer the survey questions, thus reducing overall respondent burden.
9. Explain any decision to provide any payment or gift to respondents.
Generally, NASS will only compensate respondents for travel costs related to focus groups or other similar field testing activities. Focus group participants will be provided between $50 and $75 to cover travel costs and incidentals. Any higher amounts will be justified on a case-by-case basis. Periodically, non-monetary tokens of appreciation such as baseball caps, pens, and other trinkets will be given to pre-testing participants.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
Respondents in the survey improvement projects will be advised that their participation is voluntary. Questionnaires include a statement that individual reports are kept confidential. U.S. Code Title 18, Section 1905 and U.S. Code Title 7, Section 2276 provide for the confidentiality of reported information. All employees of NASS and all enumerators hired and supervised under a cooperative agreement with the National Association of State Departments of Agriculture (NASDA) must read the regulations and sign a statement of compliance.
Additionally, NASS and NASS contractors comply with OMB Implementation Guidance, "Implementation Guidance for Title V of the E-Government Act, Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA)" (Public Law 107-347). CIPSEA supports NASS' pledge of confidentiality to all respondents and facilitates the agency's efforts to reduce burden by supporting statistical activities of collaborating agencies through the designation of NASS agents, subject to the limitations and penalties described in CIPSEA.
11. Provide additional justification for any questions of a sensitive nature.
No questions of a sensitive nature are anticipated in work conducted under this generic clearance.
12. Provide estimates of the hour burden of the collection of information. The statement should indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated.
NASS estimates that total burden will not exceed 15,000 hours for all testing and evaluation of survey instruments and methodologies during the next three years. Burden will be used to test both Sample Surveys and Census Follow-on Surveys.
No respondent would be contacted more than once in a calendar year for testing any of the survey improvement techniques; therefore, the number of respondents equals the maximum number of contacts covered by this clearance request. Sampling in any projects related to this docket will be done to make methodological comparisons, not to generate substantive survey estimates.
Although many of these collections occur in alternating years, the total burden would not exceed the estimates provided in Table 1.
Table 1. Projected Total Response Burden for the next three years.
| | Number of Respondents | Average Hours per Respondent | Total Burden Hours |
| Sample Surveys and Census Follow-on Surveys | 25,000 | 0.6 | 15,000 |
Examples of sample surveys that would be tested under this clearance include the June Area Survey, the Agricultural Resource Management and Chemical Use Surveys (ARMS), the Cash Rent Survey, and some other NASS reimbursable surveys. Following the 2012 Census of Agriculture, NASS will be conducting several follow-on surveys. In preparation for these follow-on surveys, NASS will need to conduct some testing. This additional testing is not covered by the Census Content Testing docket (0535-0243).
Total combined reporting cost to the public for all surveys is projected at $360,000, calculated by multiplying 15,000 burden hours by $24 per hour.
13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information.
There are no capital/startup or ongoing operation/maintenance costs associated with this information collection.
14. Provide estimates of annualized cost to the Federal government; provide a description of the method used to estimate cost which should include quantification of hours, operational expenses (equipment, overhead, printing, and staff), and any other expense that would not have been incurred without this collection of information.
The annual cost to the Federal government generated by the survey improvement projects is estimated to be approximately $1,000,000. The main components of these costs are staff time and travel expenses. There is no start-up, equipment, operations or maintenance cost.
15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I (reasons for changes in burden).
The increases in burden hours and in the number of respondents are both due to program changes. NASS plans to conduct many more tests in the upcoming years to find ways to improve questionnaires, along with ways to reduce sampling and potential non-sampling errors.
16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
Data will be collected to develop new surveys or improve the methodology of current surveys. Methodological findings may be published in the technical notes sections of the reports of survey data, in separate reports, in technical papers presented at conferences, in the proceedings of conferences, or in journals.
No substantive estimates will be published from these studies as findings will only be used for methodological comparisons.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
No approval is requested for non-display of the expiration date.
18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions” of OMB Form 83-I.
No exceptions to the Certification Statement are anticipated. Should an exception become necessary, OMB approval will be requested in advance of conducting the survey.
March 2013
Revised April 2013
1 NASS Information Quality Guidelines are available at http://www.usda.gov/nass/nassinfo/infoguide.htm. OMB Information Quality Guidelines are available at http://www.whitehouse.gov/omb/inforeg/infopoltech.html.