Supporting Statement A for
Paperwork Reduction Act Submission
DOI Programmatic Clearance for Customer Satisfaction Surveys
OMB Control Number 1040-0001
OMB Terms of Clearance: None
Introduction:
The Department of the Interior (DOI) is requesting a three-year extension of its Programmatic Clearance for Customer Satisfaction Surveys, originally approved by the Office of Management and Budget (OMB) in January 2002. The Programmatic Clearance enables Interior bureaus and offices to conduct customer research through external surveys, such as questionnaires and comment cards. We collect this information to improve the services and products that DOI provides to the public and thus better carry out part of our statutory mission.
The existing Programmatic Clearance covers all of the Interior bureaus and offices in DOI and provides centralized oversight for all customer satisfaction surveys within Interior. The DOI Office of Policy Analysis will continue to conduct the necessary quality control, including assuring that each survey instrument comports with the guidelines of the Programmatic Clearance; DOI will submit each particular survey instrument for expedited review to OMB as we are ready to deploy a specific information collection.
Who Will Be Impacted by This Renewal?
Under the proposed Programmatic Clearance renewal, Interior will seek satisfaction information from its customers. DOI defines the term “Customer” in the following manner:
Customer: Anyone who uses DOI resources, products, or services. This includes internal customers (anyone within DOI) as well as external customers (e.g., the American public, representatives of the private sector, Tribes, academia, and other government agencies). Depending upon their role in specific situations and interactions, we may also consider citizens and DOI stakeholders and partners to be customers.
We define “Stakeholder” to mean groups, individuals, or Tribes who have an expressed interest in and who seek to influence the present and future state of DOI’s resources, products, and services.
We define “Partner” to mean those groups, individuals, Tribes, and government agencies who are formally engaged in helping DOI accomplish its mission, or with whom DOI has a joint responsibility or mission.
Justification:
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.
Interior requests extension/renewal of its Programmatic Clearance for Customer Satisfaction Surveys so that we may better fulfill our Departmental or program-specific statutory mission as well as our government-wide responsibilities to provide excellence in government by proactively consulting with those we serve. Customer data are needed to:
Meet requirements of the Government Performance and Results Act,
Meet requirements of Executive Order 12862, and
Meet requirements of Executive Order 13571.
GPRA:
The Government Performance and Results Act (GPRA) of 1993 (Pub. L. No. 103-62) sets out to “improve Federal program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction” (section 2(b)(3)). In order to fulfill this responsibility, DOI’s bureaus and offices must collect data from their respective user groups to (1) better understand the needs and desires of the public and (2) respond to those needs and desires accordingly.
Executive Order 12862 -- “Setting Customer Service Standards”:
This Executive Order of September 11, 1993, is aimed at “ensuring the Federal Government provides the highest quality service possible to the American people.” The E.O. requires surveys as a means for determining the kind and quality of service desired by the Federal Government’s customers and their level of satisfaction with existing services.
Executive Order 13571 – “Streamlining Service Delivery and Improving Customer Service”:
This Executive Order of April 27, 2011, mandates “establishing mechanisms to solicit customer feedback on Government services and using such feedback regularly to make service improvements.”
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Be specific. If this collection is a form or a questionnaire, every question needs to be justified.
We collect these data to improve the services and products that the participating Interior bureaus and offices provide to the public, and thus better carry out part of our statutory mission. Interior managers use customer satisfaction data to support all aspects of planning, from buildings, roads, and interpretive exhibits to technical systems. In conducting their management, planning, and monitoring activities, managers also use the information to effectively allocate their limited personnel and financial resources to the highest-priority elements. Bureaus and offices will use Form DI-4010, “Justification for Submission Under the ‘DOI Programmatic Clearance for Customer Satisfaction Surveys,’” to submit requests under this generic clearance.
Information collected under the Programmatic Clearance has led to improvements in the way DOI carries out its mission and serves the public. For example, a survey of individual landowners who worked with the Partners for Fish and Wildlife Program in the past helped the U.S. Fish and Wildlife Service (USFWS) more effectively and efficiently manage the program and collaboratively meet the needs of private landowners. A pilot permitting program by the USFWS collected important feedback from affected stakeholders regarding the management and administration of the permitting process associated with conflicts related to an overabundant bird species. Another USFWS survey of participants in the Youth Crew Program provides Refuge managers and planners with scientifically sound data that can be used to identify and implement appropriate improvements to the design of the Program in order to increase student participation and engagement with events and programs offered at the Refuge. Customer feedback has helped the Bureau of Land Management (BLM) improve the timeliness of several of its permitting processes. A survey of visitors to USFWS refuges identified key issues affecting visitor satisfaction, including satisfaction with refuge law enforcement and differences in satisfaction among racial groups visiting refuge sites. Information collected by the U.S. Geological Survey (USGS) regarding existing and proposed transportation systems at Kilauea National Wildlife Refuge is being used to design alternative means for transporting visitors to the Refuge in response to visitor dissatisfaction with crowded parking areas. A survey conducted by the Bureau of Reclamation (BOR) has helped Reclamation managers identify satisfaction levels with water delivery systems and identify potential improvements.
DOI anticipates that the information obtained will continue to be used in revising certain bureau processes and policies, and developing guidance related to the bureau’s customer services. These changes will continue to improve the services DOI provides to the public and, in turn, allow DOI to carry out more effectively a part of its statutory mission.
The Programmatic Clearance is intended to collect customer satisfaction data only. Therefore, survey instruments that will be approved under the authority of the clearance must focus on customer satisfaction data. No information collection instruments seeking to collect information beyond the scope of customer satisfaction data, such as social-science or visitor-use information, will be considered under the scope of this Programmatic Clearance.
Interior’s Programmatic Clearance for Customer Satisfaction Surveys is further limited to non-controversial information collections that do not attract attention to significant, sensitive, or political issues. Examples of significant, sensitive, or political issues include seeking opinions regarding political figures and obtaining citizen feedback related to high-visibility or high-impact issues such as the reintroduction of wolves in Yellowstone National Park, the delisting of specific Endangered Species, or drilling in the Arctic National Wildlife Refuge.
All information collection instruments will be designed and deployed based upon acceptable statistical practices and sampling methodologies, and will be used to obtain consistent, valid data that are representative of the target populations and account for non-response bias, in conformance with OMB “Guidance on Agency Survey and Statistical Information Collections (January 20, 2006).”
Allowable Information Collection Methods:
In-person intercept surveys: In a face-to-face situation, the survey instrument is provided to a respondent who completes it while on site and then returns it. This may include oral administration or the use of electronic technology and kiosks. The survey proctor is prepared to answer any questions the respondent may have about how to fill out the instrument but does not interfere with or influence how respondents answer the questions.
Telephone interviews or questionnaires: Using existing databases, an interviewer will contact customers who have had a specific experience with the agency. The interviewer will call back until the customer has been reached. Once contacted, the survey respondent is given a brief introduction to the survey, including its importance and use. The interviewer will then expeditiously move through the survey questions.
Mail and e-mail surveys: Using existing lists of customer addresses, a multiple-contact approach based on Dillman’s “Tailored Design Method” will be employed. The first contact is a cover letter explaining that a survey is coming and why it is important to the agency. The second contact is the survey instrument itself, along with a postage-paid, addressed envelope for returning the survey. The third contact is a reminder postcard sent 10 days after the survey was sent. Finally, the respondents will receive a letter thanking them for their willingness to participate in the survey and reminding them to return it if they have not already done so. At each juncture, the respondents will be given multiple ways to contact someone with questions regarding the survey (including phone, fax, web, and email). If the survey has been lost, the respondent can request that another be sent. Electronic mail is sometimes used instead of postal mail to communicate with customers. Although this is a cost-effective mode for surveying a large group of people, it does not usually generate the best response rate. Telephone calls to non-respondents can be used to increase response rates.
Web-based: For products or services that are provided through electronic means, whether e-commerce or web-based information, a web or email survey may be most appropriate. During the course of their web interaction, users can volunteer to add their names to a list for future surveys. From this list, a respondent pool will be selected in accordance with the sampling procedures outlined above. An email will be sent to the selected respondents explaining the need for and importance of the survey, with a web link to the survey. Within 5 days, a follow-up email will be sent reminding them to complete the survey. Finally, the respondents will receive an email thanking them for their willingness to participate in the survey and reminding them to complete it if they have not already. The respondent will always have the option to submit the survey in paper form, should they elect to do so.
Focus groups: Some data and information are best collected through more subjective, conversational means. A focus group is an informal, small-group discussion designed to obtain in-depth qualitative information. Individuals are specifically invited to participate in the discussion, whether in person or through technologically enhanced means (e.g., video conferencing, on-line sessions). Participants are encouraged to talk with each other about their experiences, preferences, needs, observations, or perceptions. A moderator whose role is to foster interaction leads the conversation. The moderator makes sure that all participants are encouraged to contribute and that no individual dominates the conversation. Furthermore, the moderator manages the discussion to make sure it does not stray too far from the topic of interest. Focus groups are most useful in an exploratory stage or when the bureau/office wants to develop a deeper understanding of a program or service. Using the best in focus group research practices, groups will be constructed to include a cross-section of a given customer group.
Comment Cards: Comment cards, when provided to a customer at the time a product or service is provided, offer an excellent means for customers to give the bureaus and offices feedback. A comment card should have a limited number of questions and an opportunity to comment. These comment cards provide managers and service providers with direct and specific information from their customers that could not be obtained through any other means.
Electronic users may be offered the opportunity to complete a comment card via a pop-up window (or other web-enabled means that may be available). The pop-up window will not appear for every user; rather, the users will be selected randomly to receive the survey. This practice is widely used in private industry. In other instances, the electronic user may be offered the option to self-select in answering the electronic comment card.
Whether using paper or electronic comment cards, the intent is to provide a feedback mechanism. The data are not intended to be statistically representative. Although questions may include numeric scales, those data should be considered only in an anecdotal fashion and not reported as a statistically valid measure.
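To illustrate the random selection of electronic users for the pop-up comment card described above, the following minimal sketch shows one way a web application might decide whether to display the card to a given visitor. The sampling rate and function name are hypothetical and shown only as an example of the general technique; actual implementations will vary by bureau and platform.

```python
import random

# Hypothetical sampling rate: present the comment card to roughly 1 in 20 visitors.
# The actual rate would be set by the bureau based on expected traffic and the
# number of responses needed.
POPUP_SAMPLING_RATE = 0.05

def should_show_comment_card(rate: float = POPUP_SAMPLING_RATE) -> bool:
    """Return True if this visitor is randomly selected to receive the pop-up."""
    return random.random() < rate

# Example: decide for a single page view.
if should_show_comment_card():
    print("Display pop-up comment card to this visitor.")
else:
    print("Do not display the comment card.")
```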
Types of Questions Asked:
There are six topic areas in which the participating bureaus and offices would obtain voluntary information from their customers. It is not expected that any one survey will cover all the topic areas; rather, these topic areas serve as a “guideline menu” from which the agencies will develop their questions. Under the information collection process, the Department, its offices, and bureaus will develop questions that fit within the generally understood confines of the topic area. Questions may be asked in languages other than English (e.g., Spanish), where appropriate.
(1) Delivery, quality and value of products, information, and services:
The range of questions envisioned for this topic area will focus on customer satisfaction with aspects of information, products, and related services offered by DOI. Information, products, and services include written reports, press releases, computer modules, workshops and seminars, or technical assistance. Respondents may be asked for feedback regarding the following attributes of the information, service, and products provided:
Timeliness (Was the information, service, or product provided to you in a timely manner? Was the information itself timely?)
Consistency (Was the quality of the service consistent with your expectations?)
Accuracy (Were the data provided accurate?)
Ease of Use and Usefulness (Was the product easy to use? Was the information useful to you?)
Ease of Information Access (Were you able to find the information you needed easily?)
Helpfulness (Was the information helpful?)
Quality (Was the information of high quality?)
Value for fee paid for information/product/service (Was the cost of the product, information, or service appropriate for the value?)
(2) Administrative practices:
This area covers questions relating to how well customers are satisfied with Interior administrative practices and processes, what improvements they might make to specific processes, and whether or not they feel specific issues were addressed and reconciled in a timely, courteous, responsive manner. Questions within this area may involve feedback regarding how well Interior engaged customers on a specific topic. They may also seek opinions from customers regarding how well Interior programs are administering specific processes (for example, the Bureau of Land Management may ask customers how well it is administering its permitting processes.)
(3) Mission management:
Questions will ask customers to provide satisfaction data related to Interior’s ability to protect, conserve, provide access to, provide scientific data about, and preserve natural, cultural, and recreational resources that we manage, and how well we are carrying out our trust responsibilities to American Indians, Native Alaskans, and Insular Areas. Questions will specifically ask customers to provide satisfaction data related to each of its four mission areas as described in its GPRA Strategic Plan: Resource Use, Resource Protection, Serving Communities, and Recreation.
(4) Rules, regulations, policies:
This area focuses on obtaining feedback from customers regarding fairness, adequacy, and consistency in enforcing rules, regulations, and policies for which Interior is responsible. It will also help us understand public awareness of rules and regulations and whether or not they are explained in a clear and understandable manner. It will not seek opinions from customers regarding the appropriateness of regulatory rulings themselves.
(5) Interactions with DOI Personnel and Contractors:
Questions developed under this topic area will focus on obtaining customer feedback regarding attributes of interactions with Interior office and bureau employees, as well as Interior contractors. Attribute questions will range from timeliness and quality of interactions to skill level of staff providing the assistance, as well as their courtesy and responsiveness during the interaction.
(6) General demographics:
Some general demographics may be gathered to augment satisfaction questions in order to better understand the customer so that we can improve how we serve that customer. Demographic data will range from asking customers how many times they have used an Interior service or visited an Interior facility within a specific timeframe, to their ethnic group and race. Sensitivity and prudence will be used in developing and deploying questions under this topic area so that the customer does not perceive an intrusion upon his/her privacy. Additionally, these questions will only be asked as long as the data are critical to understanding customer satisfaction and the character of the customer base. Demographics may also be used as part of a non-response bias strategy to ensure responses are representative of the contact universe.
How are data used?
Managers and program specialists use these data to identify:
Service needs of customers
Strengths and weaknesses of services
Ideas or suggestions for improvement of services from our customers
Barriers to achieving customer service standards
Changes to customer service standards
Baselines to measure change in improving service delivery over time
Ways to improve public trust in government.
All submissions under this generic clearance will include a completed Form DI-4010, Justification for Submission under the “DOI Programmatic Clearance for Customer Satisfaction Surveys”, as well as copies of survey instruments, email messages to respondents, instructions, and other supporting documents appropriate for the submission.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.
When possible, information technology will be used to reduce the burden on the public and to comply with requirements of the Government Paperwork Elimination Act (GPEA). Email messages are used to introduce and distribute information collection instruments to a sample of customers. In some cases, the instruments may be web-enabled so that respondents can complete them online, enabling the response analysis to be automated. We estimate that over 75% of our responses will be collected electronically. In all cases, we will use appropriate non-response bias strategies to ensure that responses are representative of the contact universe. We will not use electronic media to select sampling frames for surveys conducted under the Programmatic Clearance.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
This effort does not duplicate any other survey conducted by other Federal agencies. Other Federal agencies are conducting user surveys but are not soliciting comments on the delivery of DOI or DOI bureau/office products and services.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
One of the main purposes of this effort is to gather the information needed without putting a significant additional burden on small entities. We minimize sample sizes and the number of questions on the surveys as much as possible. Use of electronic means of surveying also has the potential to reduce the burden on small entities.
6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
Without this information collection, DOI will not be able to determine the kind and quality of service customers want, their level of satisfaction, or ways to improve customer service in a timely manner. The Programmatic Clearance for Customer Satisfaction Surveys enables bureaus and offices within the Department of the Interior to collect customer satisfaction data in an expedited manner. The generic clearance also enables us to respond to customer concerns in a manner that better meets customer expectations.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
* requiring respondents to report information to the agency more often than quarterly;
* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
* requiring respondents to submit more than an original and two copies of any document;
* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
* in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;
* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
* requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
This renewal request involves none of the special circumstances described in 5 CFR 1320.5(d)(2), with the exception of paragraph (d)(2)(ii): we may ask respondents to return their responses in fewer than 30 days after receipt of the survey. On these types of surveys, respondents who intend to respond normally do so rather quickly. The first survey (for each bureau’s customer line) will provide our baseline data, and we will consider all responses received up until the time we prepare the final baseline report. These are voluntary surveys, and respondents, of course, are not obligated to respond.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
On April 4, 2024, we published in the Federal Register (89 FR 23607) a notice of our intent to request that OMB approve this information collection. In that notice, we solicited comments for 60 days, ending on June 3, 2024. We received no comments in response to that notice.
We surveyed the DOI bureau information collection clearance officers about the importance to their bureaus of continuation of this OMB programmatic approval. They all expressed strong support for its extension.
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
Incentives, remuneration, and gifts are generally deemed inappropriate as part of plans for information collections conducted within the scope of the Programmatic Clearance for Customer Satisfaction Surveys. In some cases, the provision of gifts and incentives to customer satisfaction survey respondents may appear to be a conflict of interest. However, there may be extraordinary circumstances under which remuneration may be appropriate within the scope of this program. In the cases of information collections that seek to use incentives under extraordinary circumstances, Interior program managers must describe the proposed incentive, how it will be offered to respondents, and justify its use on the completed Form DI-4010, which is required as part of each information collection request under this package.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
Confidentiality issues in each proposed information collection instrument will be assessed and addressed in accordance with OMB guidance as provided in “Guidance on Agency Survey and Statistical Information Collections (January 20, 2006).”
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
The questions used on these surveys will not be of a sensitive nature.
12. Provide estimates of the hour burden of the collection of information. The statement should:
* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.
* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here.
Based on historical usage of this Programmatic Clearance, we estimate there will be approximately 65,000 annual respondents and that completing each response will average approximately 10 minutes, for a total of 10,833 annual burden hours. We estimate the annualized cost of this collection to respondents to be $499,835 (rounded) (10,833 hours x $46.14).
We used Bureau of Labor Statistics (BLS) News Release USDL-24-1172, June 18, 2024, Employer Costs for Employee Compensation—March 2024, to calculate the cost of the total annual burden hours. Table 1 of the News Release lists the hourly rate for all civilian workers as $46.14, including benefits.
Estimated Annual Reporting Burden

Type of Collection | No. of Respondents | Annual Frequency per Response | Total Annual Responses | Time per Response | Total Annual Hours
DI-4010, “Justification for Submission Under the ‘DOI Programmatic Clearance for Customer Satisfaction Surveys’” | 65,000 | 1 | 65,000 | 10 minutes | 10,833
Totals | 65,000 | | 65,000 | | 10,833
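For reference, the arithmetic behind the respondent burden and cost figures above can be reproduced as follows. This is a simple worked calculation using only the figures already stated in this item, not an official estimating tool.

```python
# Worked calculation of the respondent burden estimates stated in item 12.
respondents = 65_000            # estimated annual respondents
minutes_per_response = 10       # average completion time per response
hourly_rate = 46.14             # BLS all-civilian-workers rate, including benefits

burden_hours = respondents * minutes_per_response / 60     # = 10,833.33..., reported as 10,833
annual_cost = round(burden_hours) * hourly_rate            # uses the rounded 10,833 hours

print(f"Annual burden hours: {burden_hours:,.0f}")              # ~10,833
print(f"Annualized respondent cost: ${annual_cost:,.2f}")       # $499,834.62, reported as $499,835 (rounded)
```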
13. Provide an estimate of the total annual non-hour cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected in item 12.)
* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information (including filing fees paid for form processing). Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
We have identified no reporting and recordkeeping “non-hour cost” burdens associated with this proposed collection of information.
14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.
We estimate the annualized cost to the Federal government to be $2,818,638 (rounded), based on 10,833 respondent burden hours x 3 Federal work hours per respondent burden hour x $86.73 per hour.
The bureaus and offices will determine, on a case-by-case basis, whether it is more efficient and cost effective to develop, distribute, collect, and analyze these surveys in-house or to turn to private or other non-government entities to provide that service. However, in order to provide a reasoned estimate, we have assumed a ratio of three hours of Federal work for each hour of survey time, with the Federal work at an average pay level of GS-12 step 5, which is $53.87 per hour based on OPM Salary Table 2024-DCB. In accordance with BLS News Release USDL-24-1172, we multiplied the hourly rate by 1.61 to calculate the fully burdened hourly rate of $86.73.
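The Federal cost figure above can likewise be reproduced with the following worked calculation, using only the hourly rate, benefits multiplier, and Federal-to-respondent hour ratio stated in this item.

```python
# Worked calculation of the annualized cost to the Federal government stated in item 14.
respondent_burden_hours = 10_833       # from item 12
federal_hours_per_respondent_hour = 3  # assumed ratio of Federal work to survey time
gs12_step5_hourly = 53.87              # OPM Salary Table 2024-DCB
benefits_multiplier = 1.61             # BLS News Release USDL-24-1172

fully_burdened_rate = round(gs12_step5_hourly * benefits_multiplier, 2)  # = 86.73
federal_hours = respondent_burden_hours * federal_hours_per_respondent_hour
annual_federal_cost = federal_hours * fully_burdened_rate

print(f"Fully burdened hourly rate: ${fully_burdened_rate:.2f}")              # $86.73
print(f"Annual cost to the Federal government: ${annual_federal_cost:,.2f}")  # $2,818,638.27, reported as $2,818,638 (rounded)
```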
15. Explain the reasons for any program changes or adjustments in hour or cost burden.
We are not reporting any program changes or adjustments in hour or cost burden.
16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
Each information collection considered under the Programmatic Clearance will describe how the data will be used and provide to OMB the specific tabulation methods to be used to synthesize, analyze, and aggregate the data collected. The data will be gathered primarily for internal DOI use, so it is not expected that such data will be published. However, if the results of a particular survey are to be published or otherwise made public, that fact will be disclosed in the completed Form DI-4010 for that survey.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
We will display OMB’s expiration date on the information collection instruments.
18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions."
We are requesting no exceptions to the certification statement.