Supporting Statement 03-24-08 Part B

Chemical Stockpile Emergency Preparedness Program (CSEPP) Evaluation and Customer Satisfaction Survey

OMB: 1660-0057

B. Collections of Information Employing Statistical Methods.



When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:

  1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.

Open-ended Questionnaire: (State and local officials)

There are 170 State and local officials in emergency management-related positions whose input is relevant to this information collection. These positions have been categorized as follows:

Table 1: State and Local Officials Universe (Open-ended Questionnaire)

Respondents' Position/Organization | State | County | City/Town | Total
Emergency Management | 9 | 38 | -- | 47
Health Departments | 9 | -- | -- | 9
Commissioners | -- | 38 | -- | 38
Responder Organizations | -- | 38 | 38 | 76
Total | 18 | 114 | 38 | 170

(The State, County, and City/Town columns give counts by CSEPP jurisdiction.)

Site Surveys: (Residents)

There are approximately 368,787 persons 18 years old or older residing inside the Emergency Planning Zones (EPZs) at the Pine Bluff Arsenal and the Anniston Army Depot, the CSEPP-designated geographic regions located in the immediate and surrounding areas of these sites. Each EPZ is further subdivided into an Immediate Response Zone (IRZ) and a Protective Action Zone (PAZ), each carrying specific levels of protective actions in the event of a disaster.

Table 2 shows the collection’s universe, sample size, margin of error, and sampling techniques for each of the two site surveys.

Table 2: Universe Populations, Sample Size, Margin of Error, and Sampling Technique (Site Surveys)

CSEPP Site Name | Total Sample or Name of Strata | Geographic Region | Universe: Population over 18 inside the region | Margin of Error | Sample Size | Sampling Technique
Pine Bluff Arsenal | Total sample | Depot, IRZ, PAZ | 139,878 | 3.4% | 1,093 | Stratified Sample
Pine Bluff Arsenal | Grant IRZ | Depot, IRZ, PAZ | 1,219 |  | 55 | Stratified Sample
Pine Bluff Arsenal | Jefferson IRZ | Depot, IRZ, PAZ | 17,353 |  | 300 | Stratified Sample
Pine Bluff Arsenal | Arkansas PAZ | Depot, IRZ, PAZ | 751 |  | 30 | Stratified Sample
Pine Bluff Arsenal | Cleveland PAZ | Depot, IRZ, PAZ | 3,707 |  | 31 | Stratified Sample
Pine Bluff Arsenal | Dallas PAZ | Depot, IRZ, PAZ | 525 |  | 30 | Stratified Sample
Pine Bluff Arsenal | Grant PAZ | Depot, IRZ, PAZ | 10,977 |  | 86 | Stratified Sample
Pine Bluff Arsenal | Jefferson PAZ | Depot, IRZ, PAZ | 44,676 |  | 218 | Stratified Sample
Pine Bluff Arsenal | Lincoln PAZ | Depot, IRZ, PAZ | 3,171 |  | 31 | Stratified Sample
Pine Bluff Arsenal | Lonoke PAZ | Depot, IRZ, PAZ | 6,642 |  | 52 | Stratified Sample
Pine Bluff Arsenal | Prairie PAZ | Depot, IRZ, PAZ | 29 |  | 0 | Stratified Sample
Pine Bluff Arsenal | Pulaski PAZ | Depot, IRZ, PAZ | 12,878 |  | 95 | Stratified Sample
Pine Bluff Arsenal | Saline PAZ | Depot, IRZ, PAZ | 37,950 |  | 165 | Stratified Sample
Anniston Army Depot | Total sample | Depot, IRZ, PAZ | 228,909 | 3.4% | 961 | Stratified Sample with Oversampling
Anniston Army Depot | Calhoun | Depot, IRZ, PAZ | 85,790 |  | 310 | Stratified Sample with Oversampling
Anniston Army Depot | Clay | Depot, IRZ, PAZ | 482 |  | 50 | Stratified Sample with Oversampling
Anniston Army Depot | Cleburne | Depot, IRZ, PAZ | 4,136 |  | 100 | Stratified Sample with Oversampling
Anniston Army Depot | Etowah | Depot, IRZ, PAZ | 61,048 |  | 221 | Stratified Sample with Oversampling
Anniston Army Depot | St. Clair | Depot, IRZ, PAZ | 48,327 |  | 175 | Stratified Sample with Oversampling
Anniston Army Depot | Talladega | Depot, IRZ, PAZ | 29,126 |  | 105 | Stratified Sample with Oversampling
TOTAL |  |  | 368,787 |  | 2,054 |

Actual Response Rates:

Open-ended Questionnaire: The historical response rate for this survey has been high, at 67 percent, and it is expected to remain at that level. Major variances in response have not been observed among the different CSEPP sites.

Site Surveys: Response rates are calculated using final disposition codes and response rate formulas published by the American Association for Public Opinion Research (AAPOR). [1] The last information collections for participating sites within 2006-2007 were the 2007 Anniston, 2007 Pine Bluff, 2006 Pueblo, and 2006 Umatilla site surveys. The lowest-bound (RR1) average response rate for these surveys was approximately 17%, and the average RR5 response rate (which excludes households with unknown eligibility) was 27%.
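
For reference, the two AAPOR rates can be computed from final disposition counts as in the minimal sketch below; the counts shown are hypothetical placeholders, not actual survey dispositions:

def aapor_rr1(i, p, r, nc, o, uh, uo):
    """AAPOR RR1: completes over all eligible plus unknown-eligibility cases."""
    return i / ((i + p) + (r + nc + o) + (uh + uo))

def aapor_rr5(i, p, r, nc, o):
    """AAPOR RR5: unknown-eligibility cases assumed ineligible (e = 0)."""
    return i / ((i + p) + (r + nc + o))

# Hypothetical final disposition counts: I=completes, P=partials, R=refusals,
# NC=non-contacts, O=other non-response, UH/UO=unknown eligibility.
i, p, r, nc, o, uh, uo = 500, 40, 700, 900, 60, 600, 200
print(f"RR1 = {aapor_rr1(i, p, r, nc, o, uh, uo):.1%}")  # 16.7%
print(f"RR5 = {aapor_rr5(i, p, r, nc, o):.1%}")          # 22.7%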

A significant portion (57.42%) of non-response is due to non-contacts and households with unknown eligibility. Due to technological advances (caller ID, answering machines, mobile phones), households are often hard to contact by landline telephone. As of the end of 2006, government statistics indicated that 12.8% of U.S. households, representing 11.8% of all adults, could be reached only by cell phone. Also, there has been a recent push for portability of telephone numbers, which allows residents to keep their phone number as they move to another provider, landline or wireless, in the same general region. In an effort to increase response rates, available advanced screening methodologies will be employed for future surveys to increase the productivity of the sample telephone numbers.

In the last submission, OMB suggested piloting a mail survey to potentially increase response rates. A mail-in questionnaire was used at the Newport Chemical Depot. For the Newport mail survey, previous research in the geographical area suggested a response rate of approximately 10% for mail-in surveys; however, the response rate for the Newport site turned out to be approximately 7%. Per the terms of clearance, the final report from the Newport mail survey has been included in the package. We have also shortened and clarified several survey questions to increase the likelihood that respondents will answer.

The survey campaign has been conducted for several years. Because of the repetitive nature of this survey research, there are some communities where nearly every resident has participated in the survey at one time or another, which can have an adverse effect on participation in subsequent surveys. Thus, in addition to the considerable rates of non-contact and unknown eligibility common in Random Digit Dialing (RDD), the survey effort is also experiencing increasing refusal rates at some sites.

When conducting the CSEPP telephone surveys, we make the assumption that non-response is independent of answers to questions on the questionnaires; essentially, we assume non-response is missing at random. We have checked this assumption by comparing the demographic percentages in the survey against U.S. Census data and past survey results. However, the decline in response rate requires attention and needs to be addressed appropriately. The following techniques will be adopted in conducting future surveys to improve the response rate:

  1. Send a Pre-Notification Letter – This method has already been adopted by some sites, such as Umatilla, and a difference in the response pattern has been observed for Umatilla compared to the other sites.

  2. Increase the Number of Call Attempts – CR Dynamics’ Computer Assisted Telephone Interviewing (CATI) system allows callbacks to be scheduled either automatically or for a time determined by the interviewer’s contacts with a survey respondent. Telephone numbers are generally attempted up to three times; this number can be increased as refusal rates rise.

  2. Describe the procedures for the collection of information including:

    a. Statistical methodology for stratification and sample selection,

There are a total of seven CSEPP sites in the United States. Of these, six sites (AL, UT, IN, AR, CO, and OR) have conducted a residential survey within the past five years. At this time, only Pine Bluff, AR, and Anniston, AL, are interested in conducting future surveys.

  • Survey Respondents:

    • State and Local Officials (Open-ended Questionnaire): The universe consists of 170 State and local officials in tactical and decision-making capacities related to emergency management in eight CSEPP states. Due to its small size, no sampling is conducted; rather, a census of this population is taken.

    • Residents (Site Surveys): The total sample consists of 2,054 residents meeting the following selection criteria:

      • Live inside the Emergency Planning Zone (EPZ) “footprint” of a participating CSEPP site. Emergency Planning Zones are further subdivided into an Immediate Response Zone (IRZ) and a Protective Action Zone (PAZ).

      • 18 years of age or older

Sampling Design and Methodology

The participating CSEPP sites (Anniston and Pine Bluff) have elected to use Simple Random Sampling (SRS) and Stratified Sampling. Samples have been stratified by State, county, and/or EPZ boundaries. Statisticians at IEM help CSEPP sites choose a sampling design that will most effectively represent their population and accomplish the specific goals of their survey.

For zones, States, and/or counties with relatively small populations within the geographical region defined to be sampled, CSEPP sites have elected to use the technique of over-sampling to collect a large enough sub-sample to make valid statistical comparisons with other sub-groups. [2] When over-sampling is employed, the results of the over-sampled regions are weighted back to match their respective population proportion.

Sample sizes for each site have been calculated with standard statistical formulas.
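
As an illustration, the sketch below applies the standard sample size formula for a proportion, using the conservative p = 0.5 and a finite population correction; the function is our own illustration, not the contractor’s software:

import math

def srs_sample_size(population, margin_of_error, z=1.96, p=0.5):
    """Sample size so that z * SE <= margin_of_error under an SRS,
    using the conservative p = 0.5 and a finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Pine Bluff EPZ universe with a +/-3.4 percentage point margin of error:
print(srs_sample_size(139_878, 0.034))  # ~826 under a pure SRS

# The stratified designs in Table 2 field more interviews than this SRS
# minimum (1,093 at Pine Bluff) because each stratum needs at least 30
# respondents and the stratified design effect is slightly above 1.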

Sampling Frame: The sample frame includes only those residents located inside the EPZ footprint for each CSEPP site. RDD is used to include both listed and unlisted numbers, reducing the bias of traditional telephone directory sampling. To ensure that residents outside of the EPZ are not included in the sampling frame, Genesys purges the listed phone numbers geographically. This means that, for the portion of the RDD sample representing listed households or businesses, addresses are matched against site geographic boundaries to delete sample households located outside the EPZ. If a sub-region of the sample area has a very high incidence of residents living outside the EPZ footprint, only listed households will be sampled from that sub-region to ensure that residents outside the EPZ are not called.
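
A minimal sketch of such a geographic purge is shown below, using a generic ray-casting point-in-polygon test; the boundary coordinates and record layout are hypothetical, and Genesys’ actual proprietary process is not represented here:

def inside_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical EPZ boundary (lon, lat) and listed sample records:
epz_boundary = [(-92.2, 34.1), (-91.8, 34.1), (-91.8, 34.4), (-92.2, 34.4)]
listed_sample = [
    {"phone": "870-555-0101", "lon": -92.0, "lat": 34.2},  # inside the EPZ
    {"phone": "870-555-0102", "lon": -91.5, "lat": 34.2},  # outside the EPZ
]
purged = [r for r in listed_sample if inside_polygon(r["lon"], r["lat"], epz_boundary)]
print([r["phone"] for r in purged])  # only the in-EPZ record remains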

Length of Field Period

The length of time a CSEPP telephone survey remains active in the field depends on a number of factors, including the sample design, sample size, and preferences of the CSEPP site. Some CSEPP sites prefer the field period to be as short as possible in order to collect information directly after an outreach campaign. While some surveys have been fielded in as little as four days and others in as long as eleven days, the average CSEPP telephone survey remains in the field for seven days.

    b. Estimation procedure

For a general SRS, the sample mean or proportion is used to estimate the population mean or proportion. If it becomes apparent that certain sub-regions of the SRS are either over-represented or under-represented in the survey results, survey weights may be applied so that the sample proportions are consistent with the true population proportions. In these cases, a weighted sample mean or proportion is used to estimate the population mean or proportion.

For stratified samples, a sampling weight as defined in current statistical literature is applied to the sample. [3] The weighted sample mean or proportion is used to estimate the population mean or proportion.
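
A minimal sketch of these estimators follows; the strata, proportions, and weights are hypothetical placeholders:

def weighted_proportion(responses, weights):
    """Weighted sample proportion: sum(w_i * y_i) / sum(w_i), with y_i in {0, 1}."""
    return sum(w * y for w, y in zip(weights, responses)) / sum(weights)

def stratified_proportion(strata):
    """Stratified estimate: sum over strata of (N_i / N) * p_i."""
    total_pop = sum(s["N"] for s in strata)
    return sum(s["N"] / total_pop * s["p"] for s in strata)

# Hypothetical strata with population size N_i and sample proportion p_i:
strata = [
    {"name": "County A IRZ", "N": 17_353, "p": 0.62},
    {"name": "County A PAZ", "N": 44_676, "p": 0.55},
    {"name": "County B PAZ", "N": 37_950, "p": 0.48},
]
print(f"stratified estimate: {stratified_proportion(strata):.3f}")  # 0.536

# Hypothetical post-hoc weighting of an over-represented sub-region:
responses = [1, 0, 1, 1]          # yes/no answers
weights = [0.8, 0.8, 1.3, 1.3]    # down-weight the over-sampled cases
print(f"weighted proportion: {weighted_proportion(responses, weights):.3f}")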

If a sampling design other than an SRS or stratified sample is deemed to be appropriate for a certain CSEPP site, similar valid statistical procedures will be incorporated to estimate the population means and proportions.

    c. Degree of accuracy needed for the purpose described in the justification

CSEPP site-specific residential surveys employ random sample selection based on a 95 percent confidence level and a margin of error of ±3.4 percentage points.

The National Public Affairs Integrated Process Team (IPT) has been agreeable to CSEPP sites electing to use sample designs such as an SRS, stratified sample, cluster sample, or other more complex designs. However, participating sites have elected to use either an SRS or a stratified sample.

The region surrounding each CSEPP site is divided into emergency planning zones (EPZs). EPZs are divided into the Immediate Response Zone (IRZ), which is the area closest to the Army installation, and the Protective Action Zone (PAZ), which is the area surrounding the IRZ. For example, a map of the Anniston Army Depot with surrounding EPZs is shown in Figure 1: the green zones are the IRZ and the orange zones are the PAZ. In some CSEPP sites, EPZs cross State and/or county boundaries. CSEPP sites are often interested in summarizing and comparing the survey results by IRZ/PAZ, State, and/or county. Some of these regions of interest have very small populations, resulting in few, if any, completed surveys from that region when employing an SRS. In order to make comparisons among regions or subgroups of the survey results, the Public Affairs IPT has required a minimum of 30 survey respondents per region/subgroup. Therefore, some sites have elected to use a stratified sample as opposed to an SRS.

Figure 1: Map of Anniston Army Depot Emergency Planning Zones

CSEPP sites have elected to stratify their EPZs by State, county, and IRZ/PAZ. A description of the strata for each site that has elected to use a stratified sample is shown below:

  • Pine Bluff has elected to stratify by county and by IRZ/PAZ. For example, there is one stratum for Jefferson County IRZ and one for Jefferson County PAZ.

  • Anniston has elected to stratify by county.

The simplest form of stratified sampling has been implemented, in which an SRS is taken from each stratum. The variance estimate for each stratum is calculated using the standard statistical formulas shown below.

The estimated variance for \hat{p}_i, an estimate of a population proportion in stratum i, where i = 1, 2, ..., n, is given by [4]

\hat{V}(\hat{p}_i) = \left(1 - \frac{n_i}{N_i}\right) \frac{\hat{p}_i \hat{q}_i}{n_i - 1}    (1)

where N_i and n_i are the population size and sample size in stratum i and \hat{q}_i = 1 - \hat{p}_i.

The estimated variance for the estimated population proportion of a stratified sample, \hat{p}_{st}, can be calculated by [5]

\hat{V}(\hat{p}_{st}) = \sum_{i=1}^{n} \left(\frac{N_i}{N}\right)^2 \hat{V}(\hat{p}_i),  where  N = \sum_{i=1}^{n} N_i    (2)

The estimated variance for the estimated population proportion of an SRS, \hat{p}_{srs}, can be calculated by [6]

\hat{V}(\hat{p}_{srs}) = \left(1 - \frac{n}{N}\right) \frac{\hat{p}\,\hat{q}}{n - 1}    (3)

The design effect, deff, of a stratified sample relative to an SRS can be shown by [7]

deff = \frac{\hat{V}(\hat{p}_{st})}{\hat{V}(\hat{p}_{srs})}    (4)
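
As a check on these formulas, the sketch below (our own illustration, not part of the survey software) applies equations (1) through (4), with the conservative p = q = 0.5 discussed below, to the Pine Bluff strata from Table 2; it approximately reproduces the Table 3 totals:

def stratum_var(N_i, n_i, p=0.5):
    """Eq. (1): (1 - n_i/N_i) * p*q / (n_i - 1)."""
    return (1 - n_i / N_i) * p * (1 - p) / (n_i - 1)

def stratified_var(strata):
    """Eq. (2): sum over strata of (N_i/N)^2 * Vhat(p_i)."""
    N = sum(N_i for N_i, _ in strata)
    return sum((N_i / N) ** 2 * stratum_var(N_i, n_i)
               for N_i, n_i in strata if n_i > 1)

def srs_var(N, n, p=0.5):
    """Eq. (3): (1 - n/N) * p*q / (n - 1)."""
    return (1 - n / N) * p * (1 - p) / (n - 1)

# (universe, sample size) pairs for the Pine Bluff strata in Table 2:
pine_bluff = [(1_219, 55), (17_353, 300), (751, 30), (3_707, 31),
              (525, 30), (10_977, 86), (44_676, 218), (3_171, 31),
              (6_642, 52), (29, 0), (12_878, 95), (37_950, 165)]

v_st = stratified_var(pine_bluff)                      # ~0.000303 (Table 3)
v_srs = srs_var(sum(N for N, _ in pine_bluff), 1_093)  # ~0.000227 (Table 3)
print(f"V(stratified) = {v_st:.6f}")
print(f"MOE = {1.96 * v_st ** 0.5:.1%}")               # ~3.4%
print(f"deff = {v_st / v_srs:.3f}")                    # Eq. (4): ~1.335 in Table 3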

For each CSEPP site that has elected to use a stratified sample, Table 3 provides the following information:

  • Name of each stratum

  • Universe for each stratum

  • Sample size for each stratum

  • Estimated variance, standard error, and margin of error for each stratum

  • Estimated variance for each stratified sample

  • Estimated variance for an SRS with the same number of observations as the stratified sample

  • Estimated design effect for each stratified sample

The variances in Table 3 were calculated with the statistical formulas shown above. In the CSEPP surveys the proportion, p, will be different for each question on the questionnaire. Therefore, in the calculations in Table 3 we used the conservative estimates \hat{p}_i = 0.5 and \hat{q}_i = 0.5. The standard error, SE = \sqrt{\hat{V}}, is calculated as the square root of the variance, and the margin of error, MOE = 1.96 \times SE, is calculated as the standard error multiplied by 1.96.

The design effects for Anniston and Pine Bluff are slightly larger than one, indicating that an SRS may provide slightly more precision per observational unit. However, as a conservative measure we estimated each \hat{p}_i to be the same proportion (0.5), which yields an upper bound for the design effect. We would expect the \hat{p}_i’s to be different proportions for at least some questions on the questionnaire, which would yield lower design effects for those particular questions.

If a CSEPP site were to elect to use a complex survey design other than a stratified sample, the standard statistical formulas for that specific sampling design would be used to calculate the sample size, variance, standard error, and margin of error.

Table 3: Estimated Variance, Standard Error, Margin of Error, and Design Effect for Stratified Samples

CSEPP Site Name | Total Sample or Name of Strata | Universe: Population over 18 | Sample Size | Estimated Variance | Standard Error | Margin of Error | Estimated SRS Variance | Design Effect
Pine Bluff Arsenal | Total sample | 139,878 | 1,093 | 0.000303 | 0.017410 | 3.4% | 0.000227 | 1.335
Pine Bluff Arsenal | Grant County IRZ | 1,219 | 55 | 0.004401 | 0.066337 | 13.0% |  |
Pine Bluff Arsenal | Jefferson County IRZ | 17,353 | 300 | 0.000823 | 0.028681 | 5.6% |  |
Pine Bluff Arsenal | Arkansas County PAZ | 751 | 30 | 0.008298 | 0.091094 | 17.9% |  |
Pine Bluff Arsenal | Cleveland County PAZ | 3,707 | 31 | 0.008309 | 0.091154 | 17.9% |  |
Pine Bluff Arsenal | Dallas County PAZ | 525 | 30 | 0.008026 | 0.089590 | 17.6% |  |
Pine Bluff Arsenal | Grant County PAZ | 10,977 | 86 | 0.002925 | 0.054081 | 10.6% |  |
Pine Bluff Arsenal | Jefferson County PAZ | 44,676 | 218 | 0.001146 | 0.033857 | 6.6% |  |
Pine Bluff Arsenal | Lincoln County PAZ | 3,171 | 31 | 0.008307 | 0.091143 | 17.9% |  |
Pine Bluff Arsenal | Lonoke County PAZ | 6,642 | 52 | 0.004871 | 0.069793 | 13.7% |  |
Pine Bluff Arsenal | Prairie County PAZ | 29 | 0 | --- | --- | --- |  |
Pine Bluff Arsenal | Pulaski County PAZ | 12,878 | 95 | 0.002634 | 0.051325 | 10.1% |  |
Pine Bluff Arsenal | Saline County PAZ | 37,950 | 165 | 0.001514 | 0.038905 | 7.6% |  |
Anniston Army Depot | Total sample | 228,909 | 961 | 0.000297 | 0.017238 | 3.4% | 0.000259 | 1.146
Anniston Army Depot | Calhoun County | 85,790 | 310 | 0.000806 | 0.028382 | 5.6% |  |
Anniston Army Depot | Clay County | 482 | 50 | 0.004573 | 0.067622 | 13.3% |  |
Anniston Army Depot | Cleburne County | 4,136 | 100 | 0.002464 | 0.049641 | 9.7% |  |
Anniston Army Depot | Etowah County | 61,048 | 221 | 0.001133 | 0.033667 | 6.6% |  |
Anniston Army Depot | St. Clair County | 48,327 | 175 | 0.001434 | 0.037862 | 7.4% |  |
Anniston Army Depot | Talladega County | 29,126 | 105 | 0.002388 | 0.048864 | 9.6% |  |

(The Estimated SRS Variance and Design Effect are reported for each total sample only.)

Quality Assurance Plan:

To ensure that the highest quality of work is performed, a quality assurance plan is implemented in every survey process. The plan defines the relationships between IEM and the subcontractors and details the quality assurance activities used throughout the CSEPP survey effort.

Over the course of the survey work, CSEPP sites have chosen to modify and/or remove some of the core questions from their site questionnaire in order to produce a more customized survey instrument.

Prior independent research work on CSEPP sites conducted by the University of Arizona [8] and by IEM has been reviewed in order to replicate successes and avoid shortcomings. Before each survey is conducted, IEM and the site’s Public Affairs Officer (PAO) carefully examine the questionnaire, review previous survey results, review the site’s outreach campaigns and objectives, and make modifications where necessary. Questions found to yield inaccurate and/or unreliable results are eliminated or modified.

Core survey questions are reviewed for accuracy and reliability within each site and across all participating sites and labeled as either “Optional” or “Essential”. CSEPP sites are also able to add site-specific questions to the questionnaire. These questions must go through a review process before they are incorporated into the survey. Site-specific questions are provided to IEM project personnel who review them for validity, reliability, clarity of content, and question bias. Questions are modified as necessary, and final versions of the site-specific questions are approved by the appropriate site and then incorporated into the survey.

    d. Unusual problems requiring specialized sampling procedures, and

For surveys with particularly low response rates and substantial suspicion of non-response bias, it may be necessary to collect an additional sub-sample of completed surveys from non-respondents in order to confirm whether non-response bias is present in the sample and to make adjustments if appropriate.

    e. Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

Program and survey objectives require annual collections.

  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

Open-Ended Questionnaire: To maximize response rates, respondents will be allowed one month to fill out and return the survey instrument, which is mailed with a postage pre-paid envelope for convenience. Reminders will be sent during the second and third weeks after the initial mailing if a response target of at least 50 percent has not been met. This approach has proved effective in stimulating a high response rate.

Site Surveys: Several measures, listed below, have been taken in order to maximize response rates for the telephone surveys:

  • At the beginning of the telephone call, respondents are told about the survey’s purpose, estimated response time, and who is sponsoring and conducting the survey.

  • The length of the survey phone interviews is kept to 15 minutes or less.

  • The survey interviewers are thoroughly trained.

  • The data collection company, CR Dynamics, is able to partially complete phone interviews and then call back and finish the interview at a different time.

  • The telephone database provider company, Genesys, will use the most comprehensive screening tool available to increase the productivity of the sample numbers.

Control for Non-Response Bias: To avoid the possibility of under-representing a certain subgroup of the population, many CSEPP sites have chosen to use a stratified sample and/or over-sample a certain subgroup of individuals. If there is substantial suspicion of a non-response bias in the survey results, a sub-sample of the nonrespondents will be collected and analyzed. If a nonresponse bias is found in the analysis, we would use weights to adjust the data for nonresponse. From the analysis of the respondents and nonrespondents, we would determine the probability of responding to the survey for each person, which we will call A_i for person i. We would then calculate the probability that person i is measured in the survey, P(unit i selected in the sample and responds) = A_i B_i, where B_i is the probability that person i will be selected in the sample. The final weight for each person i will be

w_i = \frac{1}{A_i B_i}. [9]
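
A minimal sketch of this weighting step follows; the response and selection probabilities are hypothetical placeholders:

def final_weight(a_i, b_i):
    """w_i = 1 / (A_i * B_i): inverse probability of selection and response."""
    return 1.0 / (a_i * b_i)

# Hypothetical persons: (estimated response probability A_i,
#                        selection probability B_i)
sample = [(0.40, 1 / 5_000), (0.25, 1 / 5_000), (0.60, 1 / 12_000)]
weights = [final_weight(a, b) for a, b in sample]
print([round(w) for w in weights])  # harder-to-reach cases get larger weights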

  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

Pilot Test

At the beginning of each site survey collection period, a pilot test is conducted to discover any potential problems with the survey instrument or process. To ensure that the call takers have received adequate training on the survey script, a minimum of one complete call per call taker is conducted on the first day. For quality assurance purposes, CR Dynamics provides IEM the ability to monitor the live calls. Data from the first night are reviewed by IEM, and improvements are made to the survey process as deemed necessary.

  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Innovative Emergency Management, Inc.

Jack Long, Project Manager
2014 Tollgate Road, Suite 208
Bel Air, MD 21015
410-569-8191
410-569-9553 (fax)
[email protected]

Denise Moore, Technical Point of Contact
8550 United Plaza Boulevard
Suite 501
Baton Rouge, LA 70809-0200
225-526-8852
225-526-8920(fax)
[email protected]

Sangeeta Singh, Technical Point of Contact
8550 United Plaza Boulevard
Suite 501
Baton Rouge, LA 70809-0200
225-368-6765
225-526-8920(fax)
[email protected]

CR Dynamics & Associates, Inc.

Patty Ramos, VP of Operations
7 East Redwood Street, 6th Floor
Baltimore, MD 21202
410-347-5600, ext 203
410-347-5603 (fax)
[email protected]

James Harris, MIS Director
7 East Redwood Street, 6th Floor
Baltimore, MD 21202
410-347-5600, ext 204
410-347-5603 (fax)
[email protected]

Genesys Sampling

Ashley Hyon
Marketing Systems Group
565 Virginia Drive
Fort Washington, PA 19034
215-653-7100
215-653-7114 (fax)
[email protected]

FEMA-Information Resources Management Branch, IC-Records Management

Nicole Bouchet
Statistician, Records Management Division
Office of Management
Federal Emergency Management Agency
Attention: OM-RM
500 C Street, SW
Washington, DC 20472
Tele: (202) 646-2814
Fax: (202) 646-3347

Government Performance and Results Act of 1993: Public Law 103-62 [10]


SEC. 4. ANNUAL PERFORMANCE PLANS AND REPORTS.

(a) BUDGET CONTENTS AND SUBMISSION TO CONGRESS- Section 1105(a) of title 31, United States Code, is amended by adding at the end thereof the following new paragraph:

(29) beginning with fiscal year 1999, a Federal Government performance plan for the overall budget as provided for under section 1115.’.

(b) PERFORMANCE PLANS AND REPORTS- Chapter 11 of title 31, United States Code, is amended by adding after section 1114 the following new sections:

Sec. 1115. Performance plans

(a) In carrying out the provisions of section 1105(a)(29), the Director of the Office of Management and Budget shall require each agency to prepare an annual performance plan covering each program activity set forth in the budget of such agency. Such plan shall--

(1) establish performance goals to define the level of performance to be achieved by a program activity;

(2) express such goals in an objective, quantifiable, and measurable form unless authorized to be in an alternative form under subsection (b);

(3) briefly describe the operational processes, skills and technology, and the human, capital, information, or other resources required to meet the performance goals;

(4) establish performance indicators to be used in measuring or assessing the relevant outputs, service levels, and outcomes of each program activity;

(5) provide a basis for comparing actual program results with the established performance goals; and

(6) describe the means to be used to verify and validate measured values.

EXECUTIVE ORDER 12862 [11]



Title: Executive Order 12862: Setting Customer Service Stds.

Author: The White House

Date: 11 Sept 1993

Subject: Executive Order of Sept 11, 1993 Setting Customer Service Stds.

--------------------------------------------------------------------------------


THE WHITE HOUSE

Office of the Press Secretary



--------------------------------------------------------------------------------

For Immediate Release

September 11, 1993

EXECUTIVE ORDER

SETTING CUSTOMER SERVICE STANDARDS

Putting people first means ensuring that the Federal Government provides the highest quality service possible to the American people. Public officials must embark upon a revolution within the Federal Government to change the way it does business. This will require continual reform of the executive branch’s management practices and operations to provide service to the public that matches or exceeds the best service available in the private sector.

NOW, THEREFORE, to establish and implement customer service standards to guide the operations of the executive branch, and by the authority vested in me as President by the Constitution and the laws of the United States, it is hereby ordered:



Section 1. Customer Service Standards.

In order to carry out the principles of the National Performance Review, the Federal Government must be customer-driven. The standard of quality for services provided to the public shall be: Customer service equal to the best in business. For the purposes of this order, "customer" shall mean an individual or entity who is directly served by a department or agency. "Best in business" shall mean the highest quality of service delivered to customers by private organizations providing a comparable or analogous service.

All executive departments and agencies (hereinafter referred to collectively as "agency" or "agencies") that provide significant services directly to the public shall provide those services in a manner that seeks to meet the customer service standard established herein and shall take the following actions:


  • identify the customers who are, or should be, served by the agency;

  • survey customers to determine the kind and quality of services they want and their level of satisfaction with existing services;

  • post service standards and measure results against them;

  • benchmark customer service performance against the best in business;

  • survey front-line employees on barriers to, and ideas for, matching the best in business;

  • provide customers with choices in both the sources of service and the means of delivery;

  • make information, services, and complaint systems easily accessible; and

  • provide means to address customer complaints.


Sec. 2. Report on Customer Service Surveys.

By March 8, 1994, each agency subject to this order shall report on its customer surveys to the President. As information about customer satisfaction becomes available, each agency shall use that information in judging the performance of agency management and in making resource allocations.


Sec. 3. Customer Service Plans.

By September 8, 1994, each agency subject to this order shall publish a customer service plan that can be readily understood by its customers. The plan shall include customer service standards and describe future plans for customer surveys. It also shall identify the private and public sector standards that the agency used to benchmark its performance against the best in business. In connection with the plan, each agency is encouraged to provide training resources for programs needed by employees who directly serve customers and by managers making use of customer survey information to promote the principles and objectives contained herein.


Sec. 4. Independent Agencies.

Independent agencies are requested to adhere to this order.


Sec. 5. Judicial Review.

This order is for the internal management of the executive branch and does not create any right or benefit, substantive or procedural, enforceable by a party against the United States, its agencies or instrumentalities, its officers or employees, or any other person.


[2] In order to make valid comparisons between sub-groups, the PA IPT has required that each sub-group have a minimum of 30 survey respondents.

[3] Lohr, Sharon L. (1999). Sampling: Design and Analysis. Brooks/Cole Publishing Company, p. 103.

[4] Scheaffer, Richard L., Mendenhall, William, and Ott, Lyman (1979). Elementary Survey Sampling, 2nd Edition. Boston, MA: Duxbury Press, p. 78.

[5] Ibid.

[6] Lohr, Sharon L. (1999). Sampling: Design and Analysis. Brooks/Cole Publishing Company, p. 35.

[7] Lohr, Sharon L. (1999). Sampling: Design and Analysis. Brooks/Cole Publishing Company, p. 239.

[8] Williams, B., et al. (2000). Characteristics of Perceived Emergency Preparedness among Residents Living near the U.S. Army’s Chemical Weapons Stockpile Sites: A Hierarchical Linear Model.

[9] Lohr, Sharon L. (1999). Sampling: Design and Analysis. Brooks/Cole Publishing Company, pp. 265-266.

[10] http://www.whitehouse.gov/omb/mgmt-gpra/gplaw2m.html

[11] http://govinfo.library.unt.edu/npr/library/direct/orders/2222.html

