Supporting Statement


Industry Partnership Survey

OMB: 0702-0122


Supporting Statement for Voluntary Industry Partnership Survey to Implement Executive Order 12862 and 5 U.S.C. 305 for the Surface Deployment and Distribution Command (SDDC)

A. JUSTIFICATION

1. Need for the Information Collection

To comply with Executive Order 12862, Setting Customer Service Standards (the EO), the Surface Deployment and Distribution Command (SDDC) systematically surveys its contractors to better understand how they feel about our acquisition processes, and to improve the way we conduct business with them.

In addition to the EO, 5 U.S.C. 305(b), “Systematic agency review of operations” states “each agency shall review systematically the operations of each of its activities, functions, or organizational units, on a continuing basis.” The purposes of the reviews “include determining the degree of efficiency and economy in the operation of the agency’s activities, functions, or organization units.” These surveys form a part of that review for acquisition activities within SDDC.


SDDC provides global surface deployment and distribution operations to meet the nation’s objectives. The command is working to be the Warfighter’s single surface deployment and distribution provider for adaptive and flexible solutions, delivering capability and sustainment on time.


To carry out this mission and vision, SDDC works with industry partners in several program areas:

  • Global Domestic Distribution Program

  • Freight Global Distribution Program

  • Personal Property Traffic Management Program

  • Transportation Engineering Agency

  • Arms, Ammunition & Explosives

  • Deployment Support Teams

  • Global Container Management


Most industry partners provide services in only one or two of the program areas, so the survey design transparently routes each respondent to only the sections that are relevant to them. In the fifth round of surveys, the core questions (about 95 percent of the total) will be identical, while the current topical questions (the remaining 5 percent) may be replaced by topical questions that are more relevant 12 months later, when the next survey round is conducted.




To make performance improvements in the operations of these program areas, and to comply with the EO, SDDC plans to undertake voluntary surveys of our “partners” in industry for three years from the approval/renewal date.

These voluntary partnership surveys will continue to be a collaborative effort, with SDDC conducting Web-based surveys of its partners to obtain feedback for improving our business processes. The contractors to be surveyed are sufficiently familiar with SDDC to make this feedback extremely useful. These surveys will give us an opportunity to better understand and evaluate our support to the warfighter’s needs, as well as to incorporate best industry or public sector standards into our practices.

The SDDC goal is to promote this survey effort as a useful self-assessment, self-improvement, and benchmarking tool, while ensuring that data reliability is maintained.

2. Use of the Information

The information gained from the results and analysis of this survey will be used in conjunction with the data obtained from an internal customer survey. These surveys will help us identify actionable items to be used to improve services to the warfighter and further the objective of our customer relationship management (CRM) program and strategic plan. By using skip patterns in a Web-based survey, each industry partner respondent will only be asked questions that are relevant to the services that it provides.
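The skip-pattern routing described above can be sketched as a simple mapping from a respondent’s declared program areas to the survey sections shown. The section and program names below are illustrative assumptions, not the actual SDDC instrument:

```python
# Illustrative sketch of survey skip logic: each respondent sees only the
# sections for the program areas it serves. Section and program names are
# hypothetical, not taken from the actual SDDC survey.

CORE_SECTIONS = ["background", "overall_satisfaction"]

PROGRAM_SECTIONS = {
    "freight": ["freight_distribution"],
    "personal_property": ["personal_property_traffic"],
    "containers": ["global_container_management"],
}

def sections_for(respondent_programs):
    """Return the ordered list of sections a respondent should see."""
    sections = list(CORE_SECTIONS)
    for program in respondent_programs:
        sections.extend(PROGRAM_SECTIONS.get(program, []))
    return sections

# A carrier serving only freight skips the other program-area sections.
print(sections_for(["freight"]))
# ['background', 'overall_satisfaction', 'freight_distribution']
```

Because the routing is driven by the respondent’s own background answers, irrelevant sections never appear, which is what keeps the per-response burden near 15 minutes.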

Specifically, SDDC considers the following three criteria in selecting improvement areas: (1) low performance scores (i.e., performance gap analysis); (2) high importance to management; and (3) likely success (e.g., “low-hanging fruit”). Moreover, we consider the following four factors in developing and implementing our organizational improvement efforts: (1) linking survey results to CRM and the strategic plan; (2) being consistent with resource availability; (3) forming project improvement teams; and (4) recognizing and rewarding improvement efforts.

The application of the above criteria and factors will result in the identification of qualitative performance gaps, and the selection of appropriate and realistic improvement targets for each program area. This analytical approach will culminate in concrete improvements to our acquisition systems. The same methodology will be used in the future to help build organizational momentum for continuous improvement. Further, we will continue to properly use ordinal data and frequency distributions to target improvement efforts and gauge performance trends over time.

The SDDC will use the survey information to improve the efficiency, quality, and timeliness of its processes, as well as to strengthen its partnership with industry. Although the survey instruments are brief—with only basic information requested to measure satisfaction and to obtain feedback on areas that may require improvement—we expect the data, comments, and suggestions offered by our respondents to help improve the performance of our systems and contain costs. Because the surveys ask about the roles of our employees, the responses will also help improve our exercise of project oversight responsibilities. Finally, these surveys will help SDDC comply with the EO and 5 U.S.C. 305.

Each respondent will initially enter survey data electronically into a CSV (comma-separated values) file, which is then loaded into a Microsoft Access database and further analyzed using SPSS or SAS software for managing the database, tabulating the data, and developing special tables and charts to present the data. Given the limitations of ordinal data, frequency distributions of the survey responses will be the primary analytical approach. We will highlight those sub-elements showing lower levels of satisfaction for management review as prime candidates for our continuous improvement process.
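The tabulation step can be sketched directly against the exported CSV (the column names and rating scale below are assumptions for illustration; the actual analysis uses Access with SPSS or SAS):

```python
import csv
import io
from collections import Counter

# Hypothetical excerpt of an exported response file; the column names
# and 1-5 rating scale are assumptions, not the actual SDDC layout.
raw = """respondent_id,program_area,satisfaction
1,freight,4
2,freight,2
3,personal_property,4
4,freight,4
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Frequency distribution of the ordinal satisfaction ratings --
# appropriate for ordinal data, where means can be misleading.
freq = Counter(row["satisfaction"] for row in rows)
print(dict(freq))  # {'4': 3, '2': 1}
```

Items whose distributions concentrate at the low end of the scale would be flagged for management review, per the selection criteria above.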

In addition, we will do an analysis of whether differences in categories of respondents’ backgrounds help explain differences in responses to individual questions. For example, we could test the data to determine if industry partner satisfaction varies by type of carrier (e.g., air, truck, or rail). This information will help our offices to better target opportunities for performance improvements.
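The subgroup comparison described above amounts to cross-tabulating ratings by respondent category. A minimal sketch, using hypothetical carrier types and ratings rather than actual survey data:

```python
from collections import defaultdict

# Hypothetical (carrier_type, satisfaction) responses; the categories
# and rating scale are illustrative, not actual SDDC survey data.
responses = [
    ("air", 4), ("air", 5), ("truck", 2),
    ("truck", 3), ("rail", 4), ("truck", 2),
]

# Cross-tabulate ratings by carrier type so satisfaction differences
# between respondent categories can be compared side by side.
crosstab = defaultdict(lambda: defaultdict(int))
for carrier, rating in responses:
    crosstab[carrier][rating] += 1

for carrier in sorted(crosstab):
    print(carrier, dict(crosstab[carrier]))
```

In practice a formal test of association (e.g., a chi-square test in SPSS or SAS) would accompany such a table before concluding that satisfaction varies by carrier type.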

In addition, aggregate survey information will be used to help implement critical SDDC management objectives as well as to develop improvement targets that are linked to our CRM goals and strategic plan.

3. Use of Information Technology

We will conduct our proposed survey as a Web-based instrument; this approach is now the norm. The Web makes it easier for our contractors to complete the surveys and reduces survey administration burdens and costs. Web-based surveys also provide easy and transparent skip patterns that reduce the burden on respondents. SDDC establishes a unique Internet address for each survey effort.

We believe that our electronic survey technique represents a productive use of improved technology, which will benefit performance management at SDDC. In addition, we will use standard computer programs to help us process the collected information.

For those industry partners who are unable to access the Internet, we make the survey available in paper form and provide a data input service.

4. Non-Duplication

We do not have or collect similar information from other sources. We will be vigilant in maintaining survey administration quality and avoiding duplication of effort.

5. Burden on Small Businesses

The survey instruments are brief, with only very basic information requested to measure satisfaction and to obtain feedback on areas that may require improvement. To minimize Web-based or mail respondent burdens, we have formulated questions simply and directly, used close-ended (not open-ended) questions, made the questionnaire answerable within 15 minutes, grouped questions into categories for ease of response, and pretested the questionnaire to ensure minimal burden. While small entities continue to be important industry partners for us, we do not foresee any significant economic impact on them from conducting these surveys.

6. Less Frequent Collection

The internal plan for industry partnership surveys calls for them to be completed every 12 months. This schedule allows improvement efforts to be identified and incorporated into our strategic management planning cycle.

Without collecting this information on a regular basis, we will have limited knowledge of areas that may need improvement and of the priorities our industry partners place on potential improvement efforts. In addition, absent this data, we will be unable to fully develop appropriate service improvement plans for raising satisfaction levels. Only by collecting this critical performance information will we have the opportunity to: identify performance gaps; select improvement targets; convert survey results into organizational improvements; assess improvement efforts; track performance progress and industry partner satisfaction over time; establish performance benchmarks; and meet key CRM goals and strategic initiatives. Lastly, this information is necessary for compliance with the EO and 5 U.S.C. 305.

7. Paperwork Reduction Act Guidelines

We are using a voluntary survey and requesting timely responses, and there are no special circumstances requiring additional justification.

8. Consultation and Public Comment

a. The 60-day notice was published on May 5, 2008 (73 FR 24575). No comments were received.

b. In addition, SDDC remains in close contact with industry to obtain their views on the availability, disclosure, and reporting of survey information. In addition, we have thoroughly pretested the survey instrument with outside potential respondents to ensure that it is clear, reasonable, and free of undue burdens.

9. Gift or Payment

Respondents receive no remuneration for completing surveys.

10. Confidentiality

Through the SDDC-developed survey cover letter and the face page of the survey instrument, the contracting office will assure survey respondents that their individual responses will not be reported—thus, helping to achieve a high response rate. Response aggregates are adequate for complying with the EO and 5 U.S.C. 305. The surveys are voluntary, and an independent third party, such as LMI, a not-for-profit government consulting research institute, administers all Web-based surveys. For current or prospective industry partners, background information is limited to such items as the SDDC office the industry partner performs business with, business category, the type of service offered by the industry partner, and the number of years of doing business with us.

11. Sensitive Questions

No sensitive information is requested.

12. Respondent Burden, and its Labor Costs

a. Estimation of Respondent Burden: SDDC expects to conduct 3 Web-based industry-partner surveys on a 12-month cycle over the next 3 years.

Number of Respondents: 1,371

Responses Per Respondent: 1

Annual Responses: 457

Average Burden Per Response: 15 minutes

Annual Burden Hours: 114.25 (approximately 343 hours over the three-year period)

b. Labor Cost of Respondent Burden: In addition, we expect each respondent to incur an average cost of less than $5.00. The costs to the agency and to the public are considered very low.

13. Respondent Costs Other Than Burden Hour Costs

None. Since this is an attitudinal survey on partner satisfaction, existing reporting and record keeping practices are more than sufficient. There are no additional records required for this survey, and no record retention effects.

14. Cost to the Federal Government

We estimate that ¼ of an existing FTE will be used, per survey cycle, to conduct each survey effort (including identifying the universe of potential survey recipients, compiling and analyzing data, reporting on results, etc.). Because we plan to use primarily a Web-based survey approach, we will not incur significant printing and mailing costs for each survey effort.

In addition, SDDC expects to incur an average of about $57,500 per survey effort for survey consultant support (survey conduct, report, and analysis both overall and individually for the seven program areas and support linking identified improvement areas to SDDC strategic objectives) from LMI or similar provider—to help us improve the electronic processing and analysis of performance data.

15. Reason for Change in Burden

This is an extension of a previously approved collection.

Other changes:

SDDC is changing the collection period from a 14-month cycle to an annual cycle. This schedule is critical for aligning analysis and improvement initiatives with our strategic management planning cycle.

Response projections have been reduced from 80% to 65%. SDDC is committed to obtaining high response rates, designating a period of seven weeks for its survey conduct and using Dillman’s Tailored Design Method for conducting Web-based surveys to balance costs against the need for adequate response rates. Using this method, SDDC contacts survey participants several times to ensure high response rates, targeted at 80 percent, and tracks this information electronically throughout the conduct period.

Post-survey analysis shows that responses for the target population have increased progressively over the past 3 cycles, to the 2007 rate of 54% (685/1264). Due to the transient and commercial nature of this group, the 80% target is difficult to achieve. The major obstacle to achieving this rate is the SDDC sourcing of email distribution lists that include high percentages of outdated, inaccurate, and invalid commercial email addresses. SDDC is working to improve the industry list generation process through identification of more current sources for this data and improved scrubbing procedures to remove non-valid addresses. Using these procedures, we estimate we can improve rates to about 65% over the next few survey cycles.

16. Publication of Results

The results of collecting this information will not be published.

17. Non-Display of OMB Expiration Date

The approval number and expiration date will be placed at the front of the electronic survey.

18. Exceptions to “Certification for Paperwork Reduction Submissions”

No exceptions are taken to the certification statement.



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

This effort will not employ sampling as the population is relatively small and we need sufficient responses to analyze the data by program area.

File Type: application/msword
File Title: SUPPORTING STATEMENT
Author: LMI
Last Modified By: Patricia Toppings
File Modified: 2008-07-31
File Created: 2004-08-24
