
Supporting Statement A


Comprehensive Survey of the American Public – (CSAP4)
OMB Control Number: 1024-0254



Terms of Clearance: None


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.

This request is to reinstate OMB Control Number 1024-0254 for the NPS to conduct the fourth iteration of the Comprehensive Survey of the American Public (CSAP), previously named the "National Park Service Centennial National Household Survey." The name change for this collection is included in this reinstatement and will apply to future iterations as well.

The first Comprehensive Survey of the American Public was conducted in 2000 (CSAP1), the second in 2006 (CSAP2), and the third in 2018 (CSAP3). CSAP1 and CSAP2 addressed visitor and non-visitor perceptions, behaviors, and knowledge related to the services and opportunities offered by the NPS, as well as the demographic characteristics of visitors and non-visitors. CSAP3 continued to measure the same metrics; however, that renewal added questions to understand the relevancy of the NPS after its 100th anniversary in 2016.

This request builds on the previous versions by identifying any changes and trends in the data, as well as including questions related to current legislation to reduce the maintenance backlog, protect critical resources, expand recreational opportunities, and focus on long-term sustainable operations for the next century (see: Great American Outdoors Act (GAOA) P.L. 116-152).

NPS policy mandates that social science research be used to provide an understanding of park visitors and the non-visiting public. To support this effort there is a need for a comprehensive survey of the American public to understand how the public views the NPS, including understanding park visitation, expectations of visits, and barriers to visitation. Furthermore, social science research in support of park planning and management is mandated in the NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”). This policy includes social science studies as a means to support the NPS mission to protect resources and enhance the enjoyment of present and future generations (National Park Service Act of 1916, 38 Stat 535, 16 USC 1, et seq.).

Legal Authorities:

  • 54 USC 100701 National Park Service and Related Programs: Protection, interpretation, and research in System (formerly The National Park Service Act of 1916)

  • 54 USC 100702 National Park Service and Related Programs: Research Mandate

  • NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”)

  • Great American Outdoors Act 2020, P.L. 116-152



2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

Data from CSAP1, CSAP2, and CSAP3 have been used by NPS managers at the national and park-unit levels to monitor trends across the American public regarding their receptiveness to and interest in leisure pursuits, services offered, learning opportunities, and the identity and symbolism of parks as an integral part of U.S. history. Additionally, secondary analyses explored diversity, equity, and inclusion issues, providing insight into the representation of domestic publics in park visitor use and serving as the benchmark to gauge changes in NPS visitor demographics. Lastly, questions in this iteration that have remained consistent across previous iterations identify barriers and constraints to experiencing and visiting national park units for the American public.

The combined results contributed to various education and information campaigns to increase accessibility and awareness for disengaged and underserved populations.

The primary purposes of this request are to:

(1) generate longitudinal data comparable with the findings from CSAP1, CSAP2, and CSAP3 for both visitors and non-visitors;

(2) measure public expectations of NPS facilities and services that were not included in previous iterations of CSAP; and

(3) understand the level of public support for infrastructure investment.

How, by whom, and for what purposes the collected information has been and will be used is described below for each of these three purposes.

The specific purpose of each survey section/question group is summarized in Table 2.1 below.

  1. Longitudinal Data: The questions in Sections 1-3, 7, and 8 are used to identify trends in visitation, access, and barriers.

  2. Public Attitudes and Expectations: The questions in this section will reach both visitors and non-visitors to identify the broader public’s level of expectation for services (e.g., Wi-Fi, cellular service, transportation options, or infrastructure).

  3. Infrastructure Investment: The questions in this section are intended to understand public support and how any potential investments may influence both visitation volume and the visitor experience.

The tables below display the sections of the survey instrument and the justification for each section. For new sections and new questions, we have provided justifications in Table 2.1 and designated them with a NEW label to call attention to the modification to the survey instrument. Justifications for all questions are also presented in the accompanying survey instrument.



Table 2.1. Summary and Purpose of Survey Sections and Question Groups

Section 1: Introductory Section

Questions in this section will be used to determine whether a respondent is eligible and willing/able to participate in a safe manner.

D3 (NEW): What is your ZIP code at this residence? ________ (enter zip code)

Residency location information is important because the survey is stratified by NPS region. Further, a more precise zip code location permits a finer geographic understanding of variation within a region.


Section 2: Park Visitation (PV)

Questions in this section will be used to inform the types of programming and experiences offered on-site. By understanding visitor characteristics and motivations to visit, NPS unit managers can help plan for infrastructure improvements and new activities based on trends over time.


Section 3: Non-Visitation (NV)
(applies only to respondents who do not self-identify as a recent park visitor)

The purpose of this section is to explore constraints or barriers that non-visitors may have to visiting areas managed by the NPS. National, regional, and individual unit managers need this information to develop and improve programs that increase opportunities for those who currently self-identify as non-visitors, including underserved populations.


Section 4 (NEW): Expectations (EX)

Questions in this section will be used to understand public expectations for infrastructure, including technology, visitor centers, and pathways, at the national level. Managers can use this data to explore whether adding more technological connectivity (e.g., Wi-Fi, cellular service), transportation options, or infrastructure is warranted.

EX1. The National Park Service manages a variety of different types of units. What types of national parks would you be most interested in visiting?

Allows segmentation for park managers based on park type (e.g., Historic, Recreation, or Nature).

EX2-14. Think about the type of national park you just identified as being most interested in visiting. How important are or would the following amenities, facilities, and services be to your decision to visit this type of national park?

These questions assess the level of importance of a variety of amenities, facilities, and services expressed by the public. NPS managers can use this information to understand if these features should be added to their park or improved within their park.

EX 15-27. Please tell us whether you strongly agree, somewhat agree, neither agree nor disagree, somewhat disagree, or strongly disagree with the following statements:

These questions will assess the expectations of the public as to the availability of a variety of amenities, facilities, and services in NPS units. In tandem with the previous set of questions, NPS managers can use this information to understand if these features should be added to their park or improved within their park.


Section 5 (NEW): Great American Outdoors Act Support (GAOA)

This is a new section. The questions will be used to understand respondents’ awareness of and support for the Legacy Restoration Fund (LRF), as well as the Land and Water Conservation Fund (LWCF).

GAOA 1: Please rate your level of support for repairing and maintaining park facilities, including campgrounds, picnic areas, roads, trails, electrical systems, water and wastewater systems, and other critical infrastructure.


GAOA 2-9: Please indicate how important repairing and maintaining the following park facilities are to your willingness to visit National Park units in the future.

Currently, in-park unit surveys do not routinely inquire about LRF-type projects. The questions in this section will be used to generalize the level of support for actions associated with the LRF. This information is needed to better understand the potential benefits of the program and increase opportunities for metrics for comparative evaluation and prioritization of proposed LRF-type projects.


Section 6 (NEW): Management (MAN)

This is a new section. The questions will be used to understand public concerns and opinions regarding the management of national parks and other federally managed public lands.


MAN 1: What do you think is the most important issue facing our federally managed public lands? 

These questions will be used to identify areas of concern by the public. This information will help inform future planning efforts and expected public perceptions of management actions.

MAN 2-6. How concerned are you about the following conditions at national parks?

MAN 7-12. Please rate your level of support or opposition for the following management options for managing visitor use in national parks with high visitation.

These questions will help identify opinions and importance around potential management actions and gauge interest using possible (hypothetical) scenarios. This information will help inform future planning efforts, as well as inform general public perceptions of management actions.



Section 7: Leisure (LEIS)

Questions in this section will be used to understand the variations between visitors and non-visitors in their broader leisure activities and to improve and expand the activities offered at NPS units.



Section 8: Demographics (D)

Questions in this section will be used to understand the diversity of park visitors and non-visitors. Demographic variables such as age, education, and knowledge are often good predictors of demand and visitation behavior.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.

This collection of information will be 100% electronic. All information will be collected using the Computer Assisted Telephone Interviewing (CATI) system. This system logs interviewer activity, schedules repeat calls, selects interviewees randomly, removes numbers from the call queue, reassigns calls to bi-lingual interviewers as needed, and produces operational reports. CATI permits direct electronic data entry (reducing processing, data entry error, time, and costs) thereby offering quick data turnaround.

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

There is no duplication of efforts or information. Although the NPS conducts dozens of visitor use information collections per year, this is the only NPS-funded national survey that reaches a larger, generalizable sample including both visitors and non-visitors.

Other federal recreation surveys, such as the National Survey of Hunting, Fishing, and Wildlife-associated Recreation (U.S. Fish and Wildlife Service) and the National Survey of Recreation and the Environment (U.S. Forest Service), provide information on outdoor recreation participation in general but do not provide information that can be used to understand the issues of relevancy and the public’s perception of the NPS.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

This collection will have no impact on small businesses or other small entities. The survey will only target members of the general public.

6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

Not collecting this information would limit the NPS’ ability to make effective management and planning decisions, which require empirical data that provide an understanding of park visitors, the non-visiting public, gateway communities, and regions. Without CSAP, it would be impossible for the NPS to fulfill its mandate to monitor trends over time. Longitudinal data from previous iterations of CSAP have been used to identify trends in visitation and access barriers. This information has allowed NPS managers to refine strategies for virtual visitation, generate greater awareness of the possibilities for park visitation, and understand potential visitors’ expectations.

Further, Executive Order 13985 requires agencies to advance equity across the federal government; not collecting rigorous, generalizable data on visitors and non-visitors would leave the NPS unable to track demographic trends and, consequently, unable to meet the requirements outlined in E.O. 13985.

Further, the 2016 NPS Centennial Initiative called for an agency-wide commitment to reaching new audiences. The consequences of not collecting this information will be three-fold: 1) the NPS will continue to rely on outdated and anecdotal information to address the issues of non-visitation and under-representation of diverse groups; 2) the NPS will lack the reliable information needed to represent the views and opinions of a new generation of visitors and to assist in planning efforts; and 3) the NPS will be forced to rely on comprehensive information about the national public that is more than 5 years old (i.e., CSAP3) to evaluate its relevancy among visitors and non-visitors.

Understanding the views of a representative sample of the American public, both those who visit parks and those who do not, is key for the agency to determine whether it is meeting the needs of the people it serves. By collecting data only at individual parks, an entire demographic of taxpayers is not heard regarding their satisfaction with, and thoughts on, the services and lands the agency provides and manages. The agency needs data collected from non-users to remain relevant and accountable for how it is performing over time.



7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

    • requiring respondents to report information to the agency more often than quarterly;

    • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

    • requiring respondents to submit more than an original and two copies of any document;

    • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

    • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

    • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

    • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

    • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

This request contains no special circumstances.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

A Federal Register Notice published on November 10, 2022 (87 FR 67960) solicited public comment. No comments were received. In addition to the Federal Register Notice, seven individuals were asked to review the survey and provide feedback (Table 8.1).

Table 8.1: Reviewers

Position | Affiliation
1. Executive Vice President | Survey Collection Firm/Call Center
2. Student (1) | Montana State University
3. Economist | Private Consulting, with survey expertise
4. Social Scientist | University of Montana
5. Research Associate | University of Montana
6. Student (2) | Montana State University
7. Teacher | Monroe County Public Schools, FL

“Whether or not the collection of information is necessary, including whether or not the information will have practical utility; whether there are any questions they felt were unnecessary.”


Commenter #3. After reviewing the various reports from previous CSAP iterations and thinking about the potential for variation between Urban and Rural residents, it would be useful to capture respondent zip code or city to better capture geographic breaks.

NPS Response: Added a zip code question (D2) to increase the opportunity for more detailed geographic analyses. An additional reviewer, reviewing the instrument after this addition, identified it as a valuable addition.

Commenter #4. The findings relative to diversity, equity, and inclusion (DEI) outlined in previous CSAP reporting should be followed up to continue to monitor variation in participation by traditionally underrepresented groups. Additionally, the questions that had a split sample in CSAP 3 (PV59-63) appear to be minimally burdensome and are valuable. These should be asked of all visitors.

NPS Response: Maintained all potential DEI-related questions (and added the zip code question). PV59-63 will be asked of all those qualifying as visitors.


“What is your estimate of the amount of time it takes to complete each form in order to verify the accuracy of our estimate of the burden for this collection of information?”


Commenter #1: Given our history in collections, we would program this in and assume approximately 27 minutes to complete for the typical park visitor respondent and 22 minutes for non-visitors. Variation may occur based on the specific answers given by respondents.

Commenter #6 (general summary of burden times): Those reviewers/pre-testers who received the survey in a mock phone interview ranged in time from 23 to 28 minutes.

NPS Response: We estimate the burden will be approximately 25 minutes. 


“Do you have any suggestions for us on ways to enhance the quality, utility, and clarity of the information to be collected?”

General comments from multiple commenters: The survey seemed to flow well. Multiple reviewers commented that additional introductory text describing what is about to be asked in each section would help respondents understand the context of the questions. Three reviewers noted this specifically for the GAOA questions (Section 5).

NPS Response: Additional text was added to ensure understanding of the sub-section about to be asked, including any needed definitions. For example, under Section 4: Expectations, the following was added prior to the first question: “The National Park Service is interested in understanding your expectations for facilities and services at national parks, even if you haven’t visited a park before.”

General comments from multiple commenters: Scale switching created some confusion; for example, some questions related to importance level and some to support level.

NPS Response: Additional emphasis on preparing the respondent for the scales to be used has been added. For example, for GAOA 2-9, respondents are provided the following prompt:

“Please indicate how important repairing and maintaining the following park facilities are to your willingness to visit National Park units in the future. Please rate each facility as Very important, Important, Not so important, or Not important at all.”



“Any ideas you might suggest which would minimize the burden of the collection of information on respondents?”

Commenter #1: I was reading this instrument on paper, and the skips were sometimes difficult to follow. Careful review of the skip patterns will keep responses limited to those respondents for whom each question is intended.

NPS Response: Respondents will not need to navigate the skips themselves. The automated survey builds in the skip patterns, and the interviewer will direct the interview.

Commenter #2: The series of questions regarding whether I used a program or service and then followed by how important it is when visiting, felt redundant and confusing.

NPS Response: While this is a hazard of this question series (PV20-PV41), it has been found to be useful and important to first identify use and then importance. In addition, this maintains alignment with previous iterations.

9. Explain any decisions to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.

There are no payments or gifts associated with this collection.

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

The degree of anonymity of responses will be described to respondents in the initial telephone script. The evaluation and statistical analysis of collected information will be conducted independently of personally identifiable information, and respondents’ names or contact information will never be connected to their reported responses. Personally identifiable information will only be accessible to the study team, except as required by law. The only personally identifiable information utilized will be phone numbers. These will be stored separately from the recorded responses of each respondent.

Data collected by the CATI system are stored only within the call center’s secured network and offices. The call center maintains stringent protocols covering the backup of all survey data, sample files, and other study-secured files. Data are not stored off-site or by any outside vendor. The call center’s corporate offices and in-house calling center are located in a “Class A” building in the heart of the financial district of Honolulu. The building is monitored by professional, on-site security 24/7.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.

No questions of a sensitive nature will be asked as part of this collection.


12. Provide estimates of the hour burden of the collection of information. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices. * If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under item 13.

The combined respondent burden for this collection is 1,930 hours. The combined total number of responses is expected to be 10,578 (Table 12.1).

  • Household Survey: This will be a one-time telephone survey. Based on previous iterations of CSAP, we assume a 9% response rate. To obtain 3,500 completed surveys we will use a random sample of 38,892 telephone numbers (a combination of cell and landlines). Based on previous iterations and our current pretesting with 9 individuals, the overall time to complete the survey process will be about 25 minutes. This includes time to request participation, introduce the survey, read instructions, and complete the survey. The total burden for the telephone survey is 1,458 hours.

  • Non-response Bias Survey: During the initial phone contact, the interviewer will ask each respondent refusing to complete the full survey to complete 5 questions taken directly from the survey that will be used to measure non-response bias. The non-response bias questions are identical to previous iterations of the CSAP.

Based on previous CSAP iterations, we assume that of the 35,392 individuals who refuse to complete the survey, 20% (n=7,078) will agree to complete the non-response survey. It is estimated that the non-response surveys will take 4 minutes each (472 hours total). The burden for the individuals completely refusing to participate is not calculated here.

We estimate the total dollar value of the annual burden hours to be $84,785 (Table 12.1). This figure is based on the National Compensation Survey: Occupational Wages in the United States published by the Bureau of Labor Statistics (BLS news release USDL-23-2567,¹ Employer Costs for Employee Compensation—September 2023, released December 15, 2023). The value used was $43.93/hour, which includes the multiplier for benefits for individual households.
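The burden and cost totals above follow directly from the figures stated in this section; a worked check of the arithmetic (rounded to whole hours and dollars) is sketched below.

\begin{align*}
\text{Response rate} &\approx 3{,}500 \div 38{,}892 \approx 9\% \\
\text{Household survey burden} &= 3{,}500 \times 25 \div 60 \approx 1{,}458 \text{ hours} \\
\text{Refusals} &= 38{,}892 - 3{,}500 = 35{,}392; \qquad 35{,}392 \times 20\% \approx 7{,}078 \\
\text{Non-response bias survey burden} &= 7{,}078 \times 4 \div 60 \approx 472 \text{ hours} \\
\text{Total burden} &= 1{,}458 + 472 = 1{,}930 \text{ hours} \\
\text{Dollar value of burden} &= 1{,}930 \times \$43.93 \approx \$84{,}785
\end{align*}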

Table 12.1. Estimated annual respondent burden

Activity | No. of Completed Responses | Completion Time (minutes) | Burden (hours)* | Hourly Rate Including Benefits | $ Value of Annual Burden Hours*
Household Survey | 3,500 | 25 | 1,458 | $43.93 | $64,050
Non-response Bias Survey | 7,078 | 4 | 472 | $43.93 | $20,735
Total | 10,578 | | 1,930 | | $84,785



13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information.
There is no non-hour cost burden to either respondents or record keepers nor are there any fees associated with the collection of this information.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.

The total annual cost to the Federal Government is $368,084. This includes $11,760 for the salaries and benefits of the Federal employees administering this information collection (Table 14.1) and $356,324 in operational expenses (Table 14.2). We used the Office of Personnel Management Salary Table 2024-DEN² to determine the hourly wages for the Federal employees associated with this collection. We multiplied the hourly wage by 1.6 to account for benefits, following the same Bureau of Labor Statistics news release referenced above.
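A worked check of these totals, using only the figures reported in Tables 14.1 and 14.2, is sketched below.

\begin{align*}
\text{Hourly rate with benefits} &= \$52.50 \times 1.6 = \$84 \\
\text{Salaries and benefits} &= (80 + 40 + 20) \text{ hours} \times \$84 = 140 \times \$84 = \$11{,}760 \\
\text{Total annual cost} &= \$11{,}760 + \$356{,}324 = \$368{,}084
\end{align*}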

Table 14.1: Annualized Federal Employee Salaries and Benefits

Position | Grade/Step | Hourly Rate | Hourly Rate incl. Benefits (1.6 x hourly rate) | Estimated Time per Task (hours) | Annual Cost
Project Manager | 12/5 | $52.50 | $84 | 80 | $6,720
Project Advisor | 12/5 | $52.50 | $84 | 40 | $3,360
Project Advisor | 12/5 | $52.50 | $84 | 20 | $1,680
Totals | | | | 140 | $11,760


Table 14.2: Operational Expenses

Operational Expenses | Estimated Cost
Task 1: Study Design and Methods Development (organize and schedule project scoping calls; review background information; develop study instruments) | $37,685
Task 2: Data Collection and Management Plan | $96,558
Task 3: Survey Administration | $174,852
Task 4: Data Analysis | $19,710
Task 5: Reporting | $18,035
ODC: Direct Costs and Expenses | $9,484
TOTAL COST | $356,324


15. Explain the reasons for any program changes or adjustments reported.

This reinstatement of a previously approved collection will result in a combined net increase of 2,135 completed responses and 550 burden hours. The adjustment includes:

  • a net decrease of 165 responses and 11 hours associated with removing the Youth Engagement survey conducted as part of CSAP3;

  • a net increase of 408 hours for the household survey due to an increase in survey completion time from 18 minutes to 25 minutes per completed response; and

  • a net increase of 2,300 non-response survey respondents, resulting in 153 additional burden hours (see the worked totals below and Table 15.1).
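The changes above can be reproduced from the per-response completion times; a worked check (rounded to whole hours) is sketched below.

\begin{align*}
\text{Household survey} &: 3{,}500 \times 25 \div 60 - 3{,}500 \times 18 \div 60 \approx 1{,}458 - 1{,}050 = +408 \text{ hours} \\
\text{Youth Engagement Survey} &: -165 \text{ responses}, \; -11 \text{ hours} \\
\text{Non-response survey} &: +2{,}300 \text{ responses}; \; 7{,}078 \times 4 \div 60 - 4{,}778 \times 4 \div 60 \approx 472 - 319 = +153 \text{ hours} \\
\text{Net change} &: -165 + 2{,}300 = +2{,}135 \text{ responses}; \; 408 - 11 + 153 = +550 \text{ hours}
\end{align*}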



Table 15.1. Summary of Program Change


Activity | Completed Surveys: Previously Approved | Completed Surveys: Currently Requested | Change | Burden Hours: Previously Approved | Burden Hours: Currently Requested | Change
Household survey | 3,500 | 3,500 | 0 | 1,050 | 1,458 | +408
Youth Engagement Survey | 165 | 0 | -165 | 11 | 0 | -11
Non-Response Survey | 4,778 | 7,078 | +2,300 | 319 | 472 | +153
TOTAL | 8,443 | 10,578 | +2,135 | 1,380 | 1,930 | +550



16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Data analysis and reporting will be developed and conducted at the national level, with cross tabulations generated to identify variation in responses between NPS regions. These reports will be published in accordance with the Natural Resource Report Series template publishing standards and will include descriptive statistics (frequencies, percentages, and averages) for appropriate questions. Analyses of Variance (ANOVAs), t-tests, and chi-square tests will be used to test for differences between groups and for non-response bias, as appropriate.
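As an illustration only (not part of the approved analysis plan), the sketch below shows how the group comparisons named above might be run in Python with SciPy; the variable names and example values are hypothetical.

import numpy as np
from scipy import stats

# Hypothetical example: importance ratings (1-5) for the same item in three NPS regions.
region_a = np.array([4, 5, 3, 4, 5, 4])
region_b = np.array([3, 3, 4, 2, 3, 4])
region_c = np.array([5, 4, 4, 5, 5, 3])

# One-way ANOVA testing for differences among regions.
f_stat, p_anova = stats.f_oneway(region_a, region_b, region_c)

# t-test comparing visitors and non-visitors on the same rating (hypothetical arrays).
visitors = np.array([4, 5, 4, 4, 3, 5])
non_visitors = np.array([3, 2, 4, 3, 3, 2])
t_stat, p_ttest = stats.ttest_ind(visitors, non_visitors, equal_var=False)

# Chi-square test for a non-response bias check on a categorical item
# (rows: full-survey respondents vs. non-response sample; columns: answer categories).
contingency = np.array([[120, 80, 50],
                        [40, 30, 20]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(contingency)

print(f"ANOVA p={p_anova:.3f}, t-test p={p_ttest:.3f}, chi-square p={p_chi2:.3f}")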

The collection will commence upon OMB approval. The implementation (e.g., data collection, cleaning, and analysis) is expected to last no more than 120 days. Draft and final reports will follow. The final publication timeline depends on the NPS publication office’s queue at the time of submittal. Results will also be presented to NPS staff after the final report is written.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

We will display the OMB control number and expiration date on the information collection instruments.

18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”

There are no exceptions to the certification statement.

1 https://www.bls.gov/news.release/ecec.nr0.htm

2 https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2024/DEN_h.pdf


