



U.S. Department of the Interior

Office of Policy Analysis





Justification for a Survey under the Department of the Interior’s Programmatic Clearance for Customer Satisfaction Surveys





Revised March 2012

Instructions for Completing Justification for DOI Programmatic Clearance Submission, OMB Control Number 1040-0001



  1. Survey Title/Date Submitted to the Office of Policy Analysis (PPA): Insert title for the proposed survey. Insert date that the expedited approval package will be submitted to PPA. Reminder: Please submit the package through your bureau/office Information Collection Clearance Officer.

  2. Bureau/Office: Insert the name of the bureau/office conducting the survey.

  3. Abstract: Summarize the proposed study with an abstract not to exceed 150 words.

  4. Bureau/Office Point of Contact Information: Complete the bureau/office contact information. PPA will communicate with the point of contact listed here throughout the entire approval process.

  5. Principal Investigator (PI) Conducting the Survey: Complete information about the PI who will be conducting the survey, if different than Point of Contact listed in #4. Otherwise note: Same as #4.

  6. Name of Program Office Conducting Survey: Provide the name of the bureau program, office, or organizational unit conducting the survey.

  7. Description of Customers and Services Provided: Provide a brief description of the customers who will be surveyed, the services provided by the program conducting the survey, and how these services are provided to customers.

  8. Survey Dates: List the time period in which the survey will be conducted, including specific starting and ending dates. The starting date should be at least 45 days after the submission date. The request for expedited approval, and submission of a complete and accurate approval package, must be made at least 45 calendar days prior to the first day the PI wishes to administer the survey instrument to the public.

  9. Type of Information Collection Instrument: Check the type(s) of information collection instrument(s) that will be used. If other, please explain.

  10. Survey Development: Explain how the survey was developed. With whom did you consult during the development of the survey on content? On statistics? Did you pretest the survey? What actions did you take to improve the survey? What suggestions did you receive for improving the survey? Which of the six topic areas will be addressed? (Note: A description of any pre-testing and peer review of the methods and/or instrument is highly recommended.)

  11. Survey Methodology: Explain how the survey will be conducted. Provide a description of the survey methodology including: (a) How will the customers be sampled (if fewer than all customers will be surveyed)? (b) What percentage of customers asked to take the survey will respond? and (c) What actions are planned to increase the response rate? If statistics are generated, this description must be specific and include each of the following:

- The respondent universe,

- The sampling plan and all sampling procedures;

- How the instrument will be administered;

- Expected response rate and confidence levels; and

- Strategies for dealing with potential non-response bias.

Note: Web-based surveys are not an acceptable method of sampling a broad population. Web-based surveys must be limited to services provided by the web site.

12. Total Number of Initial Contacts and Expected Number of Respondents: Provide an estimated total number of initial contacts and the total number of expected respondents.

13. Estimated Time to Complete Initial Contact and Time to Complete Survey Instrument: Estimate the time to complete the initial contact and the time to complete the survey instrument (in minutes).

14. Total Burden Hours: Provide the total number of burden hours. The total burden hours should account for the amount of time required to instruct the respondents in completing the survey, and the amount of time required for the respondent to complete the survey.

15. Reporting Plan: Provide a brief description of the reporting plan for the data being collected.

  16. Justification, Purpose and Use: Provide a brief justification for the survey, its purpose, goals, and utility to managers. Specifically, describe how data will be tabulated and what statistical techniques will be used to generalize the results to the entire customer population. Describe how data from the survey will be used. Describe how you will acknowledge any limitations related to the data, particularly in cases where you obtain a lower than anticipated response rate. Note whether or not the survey is intended to measure a Government Performance and Results Act (GPRA) performance measure.


Justification for Submission under DOI Programmatic Clearance for Customer Satisfaction Surveys (OMB Control Number 1040-0001)


U.S. Department of the Interior

Office of Policy Analysis (PPA)

PPA Tracking Number (for PPA use only): CSS-12

Date Submitted to PPA: April 16, 2012

1. Survey Title: USGS National Atlas User Survey

2. Bureau: U.S. Geological Survey



3. Abstract (not to exceed 150 words):


The National Atlas is an effort by the Federal Government to make authoritative maps and geographic information easier to find, get, and use. Its products and services are delivered online via nationalatlas.gov. The three primary customer segments for National Atlas products and services are professional users, educators, and public users. National Atlas management would like to conduct three research efforts, one for each customer segment. The National Atlas staff seeks to survey users to understand what information they seek, how they use Atlas maps and geographic information, and how the National Atlas should be improved. Topics include:


  • User profile information

  • Data needs

  • Perception of and satisfaction with National Atlas Web Map services

  • Use and activity with base maps

  • Suggestions for improving National Atlas products and services

  • Preferred methods for receiving nationalatlas.gov information


Research results will help National Atlas with development of future products and services.









4. Bureau/Office Point of Contact Information

First Name: Jay
Last Name: Donnelly
Title: Editor and Manager, National Atlas
Bureau/Office: U.S. Geological Survey, National Atlas of the United States
Street Address: 511 National Center, 12201 Sunrise Valley Drive, Room 2D200
City: Reston
State: VA
Zip code: 20192
Phone: 703.648.5395
Fax: 703.648.4165
Email: [email protected]

5. Principal Investigator (PI) Information

Same as #4.

6. Name of Program or Office Conducting Survey: National Atlas of the United States



7. Description of Customers and Services Provided:

The customers are users of the National Atlas website, http://nationalatlas.gov.








8. Survey Dates: 07/23/2012 to 07/23/2015


9. Type of Information Collection Instrument (Check ALL that Apply)

_Y_ Intercept
___ Telephone
___ Mail
_Y_ Web-based
_Y_ Focus Groups
___ Comment Cards
___ Other (Explain: )



10. Survey Development:

(With whom did you consult on survey content and statistics? Was the survey pretested? How were improvements integrated? Which of the six topic areas will be addressed?)


A contractor, Pacific Consulting Group, developed the research instruments in collaboration with National Atlas staff. Questions focus on obtaining customer profile information; information regarding current experience with National Atlas products and services; communication preferences; and suggestions for improving National Atlas products and services. No questions of a sensitive nature are asked. The primary purpose of this collection is internal management; there are no plans to publish or otherwise release this information. Prior to survey launch, the Contractor will internally test that the web survey program functions as intended, verifying the accuracy of skip-logic patterns and confirming that the forward, back, and submit buttons work properly. The Contractor will also test the survey exactly as respondents will view it, on multiple browsers. Upon launch, the Contractor will monitor the number of completed web surveys daily.


Names and contact information of individuals consulted in developing the questions, including the appropriate statistical analysis:


Jay Donnelly, Editor and Manager

National Atlas of the United States

511 National Center

12201 Sunrise Valley Drive

Reston, VA 20192

703.648.5395

[email protected]


Stephen R. Gillespie

U.S. Geological Survey

105 National Center

12201 Sunrise Valley Drive

Reston, VA 20192

(703) 648-5705

[email protected]


Ellen McNeil and Elaine Chan

Pacific Consulting Group

200 S. California Ave #200

Palo Alto, CA 94305

650.327.8108

[email protected]

[email protected]


This research comprises three related survey instruments tailored to key customer segments of nationalatlas.gov: a web survey for public users, a web survey (with a paper option) for professional users, and a focus group moderator's guide for educators. All three instruments have been tested by a team of six Geographic Information Systems (GIS) staff and five non-GIS staff (through the contractor, Pacific Consulting Group). The professional user survey was also tested by 125 federal employees at the Environmental Systems Research Institute (ESRI) Federal GIS Conference. In response to their feedback, we modified the instructions and phrasing of some questions to improve clarity. Respondents who are interested in participating in future research are asked to provide an email address; this information will facilitate more robust sampling and more efficient survey administration for future research efforts.


Topic areas addressed in the survey:

1. User profile - Demographic questions are included in order to create a profile of users.

2. Delivery, quality and value of products, information, and services - The products include raw data, data documentation, and web map services provided by the National Atlas. Most of the questions included in the survey fall under this topic area.

3. Preferred communication methods

4. Suggestions for improvement

11. Survey Methodology:

(Use as much space as needed; if necessary include additional explanation on separate page).

Respondent Universe

The survey methodology varies by customer segment.


Professional Users

The respondent universe for this collection will be attendees of an ESRI-sponsored conference taking place on July 23-July 26, 2012. The estimated number of attendees at this conference is 14,000 mapping professionals. Approximately 40% of these attendees (5,600) will be invited to complete the survey.


Public Users

The respondent universe for this research will be visitors to the National Atlas website. The estimated annual number of visitors is 2.3 million. The website survey will be conducted continuously over three years.

Educators

The respondent universe for this research consists of 7th to 12th grade educators who attend geographic information system conferences. Based on prior conference attendance, the estimated total conference attendee population is 620.


Sampling Plan/Procedure

Professional Users

At the ESRI conference, professional users will be invited by National Atlas staff, using a convenience sampling method, to participate in the online survey at the National Atlas exhibit. They will have the option of completing the survey on their own computers or at computers made available by National Atlas staff in select areas of the conference venue. Approximately 40% of conference attendees are expected to visit the exhibit and receive the invitation to complete the survey. This figure is based on the proportion of Federal users who completed a similar survey at a GIS user conference in February 2012. The first few times we offer the survey, we wish to obtain responses from every willing visitor to maximize the number of survey responses. Offering a truly randomized survey (to every nth conference attendee) requires staffing resources that are currently unavailable, and offering the survey only to every nth USGS booth visitor would limit the number of possible responses. After we establish a baseline, however, we will transition to a sampling method.



Public Users

A short web-intercept survey will be offered to every 20th visitor to nationalatlas.gov through pre-programmed software. The method is to select a random start and then apply a fixed sampling interval; because the order in which visitors reach the site is presumably random, this systematic sample approximates a truly random sample. Visitors who agree to participate will be offered the survey upon exiting the website. Website cookies will record which users have been invited to participate, and the software will not select anyone who has completed a survey within the past 90 days. A minimal sketch of this selection logic appears below.
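
The sketch below is illustrative only, not the production implementation: the function and parameter names are hypothetical, and in practice the visitor counter and cookie handling would live in the website's server software.

    import random

    SAMPLING_INTERVAL = 20    # every 20th visitor
    SUPPRESSION_DAYS = 90     # do not re-invite within 90 days

    # Systematic sampling: pick a random start within the first interval,
    # then select every 20th arrival after that.
    RANDOM_START = random.randint(0, SAMPLING_INTERVAL - 1)

    def should_invite(visitor_index, days_since_last_invite=None):
        """Decide whether a visitor receives the survey invitation.

        visitor_index: 0-based position in the arrival order.
        days_since_last_invite: read from the suppression cookie,
        or None if the visitor has never been invited.
        """
        # Cookie-based suppression for the 90-day window.
        if days_since_last_invite is not None and days_since_last_invite < SUPPRESSION_DAYS:
            return False
        # Random start plus fixed interval (systematic selection).
        return visitor_index % SAMPLING_INTERVAL == RANDOM_START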


Educators

The research among educators will be exploratory in nature. A series of focus groups will be held to discuss topics outlined in the attached moderator's guide. Focus group participants will be selected based on criteria set forth in the recruitment guide (7th to 12th grade educators who teach geography and have used nationalatlas.gov in the last six months). For the first administration of this research, focus group participants will be selected from those who attend the ESRI educator's conference. Participants will be recruited either by professional recruiters at focus group facilities or through a listserv group of GIS educators. A total of eight focus groups, each with up to 10 participants, will be held in four locations across the United States. A full group size of eight to 10 participants is a general best practice in conducting focus groups; groups with more than 10 participants are usually more difficult for moderators to manage.

Instrument Administration

This annual data collection will provide trending information to detect changes over time and a statistically valid approach to understanding customer use of National Atlas products and services.


Professional Users

This survey will be offered initially at an ESRI-sponsored professional user conference in July and repeated twice more, for a total of three administrations a year, at venues to be determined. The information collection will be multimode: online and on paper. Participants who choose to complete the survey online will be directed to a URL to access the questionnaire, either on their own computing devices or at laptops available at the USGS exhibit booth. Those who choose to complete the survey on paper will be asked to drop off their completed questionnaires at the USGS exhibit booth.


Introductory text will describe the survey and request voluntary participation. The survey will be available for the duration of the conference. Respondents will be given a printed map of the United States as a token of appreciation for their participation.


Public Users

Data will be collected through an online survey that will be presented to every 20th nationalatlas.gov website visitor. The survey will be offered to nationalatlas.gov visitors continuously over a three-year period. Visitors who are selected to participate will receive an invitation upon entering nationalatlas.gov. After agreeing to participate, users will receive the survey questions upon exiting nationalatlas.gov. The survey is estimated to take five minutes to complete.


Educators

This qualitative research will be conducted twice a year at select events or conferences attended by the target audience, or at focus group facilities. The focus groups will be administered in person and held in dedicated meeting rooms available at the conference venues. Each group discussion will last approximately 90 minutes. As the discussion guide is intended to be exploratory, most questions are open-ended.


No stipend will be offered to participants of focus groups held at conference or event venues. However, stipends will be offered to participants who are asked to travel to professional focus group facilities. The purpose of the stipend is to thank participants for sharing their time and contributing to the discussion. The stipend amount of $75 per participant is consistent with current market research industry practice for focus group participants. Offering a stipend significantly improves participation rates, which in turn minimizes non-response bias and improves reliability beyond what is possible through other means. Not offering a stipend for this type of research in a professional focus group facility setting is unlikely to produce the level of reliability necessary to meet the research objectives.


Expected Response Rate and Confidence Levels

Response rates will vary by survey:

  • For the professional survey, we anticipate a 40% response rate based on a similar survey distributed to Federal Employees at a GIS conference in February 2012.

  • The response rate for the public user survey is expected to be very low, about 1%, which is typical of online pop-up surveys in industry.

  • A very high response rate, about 80%, is expected for the educator focus groups due to the in-person recruiting method. This rate is based on previous focus group research.

Recognizing that response rates are likely to be low for the professional and public user surveys, the information collected is for internal use only, is exploratory in nature, and will not form the basis of any policy decisions.

Strategies for dealing with potential non-response bias

To maximize the response rate, the surveys are kept short, taking between five and 10 minutes to complete. Questions are brief and easy to answer. To further improve response rates and decrease non-response error, the professional user survey will be offered in two data collection modes (paper and online). Offering the survey in multiple modes improves sampling coverage.


For the public website visitor survey, the questionnaire is presented only to a small sample of the visitor universe yet provides statistically valid information. Software will prevent users who have completed a survey within the last 90 days from receiving another survey invitation.


For the educator research, response rate maximization techniques include emphasizing the value of their feedback to the National Atlas and conveying benefits to educators. As with the professional and public user surveys, no personally identifiable information will be attributed to participants and their responses will remain anonymous.






Description of any pre-testing and peer review of the methods and/or instrument (recommended)

Professional Users

Many of the questions were tested by 125 federal employees who participated in a similar conference, the ESRI Federal GIS Conference. In addition, six internal National Atlas employees and five non-employees tested the survey for clarity of questions and survey length. We incorporated their suggestions, edits, and comments into the final survey. The burden time of 10 minutes is based on the time federal employee respondents took to complete the survey.


Public Users

This survey was tested by six USGS employees and five non-employees who visited nationalatlas.gov.


Educators

These questions were tested by six USGS GIS staff and five educators.



12. Total Number of Initial Contacts and Expected Number of Respondents

Initial Contacts:

  • Professional Users: 5,600 (14,000 people attend the conference but only 5,600 will receive an invitation)

  • Public Users: 115,000 (every 20th user based on annual volume of users)

  • Educators: 104 (13 people per group x 8 groups)

Expected Number of Respondents:

  • Professional Users: 2,240

  • Public Users: 1,150

  • Educators: 83

13. Estimated Time to Complete Initial Contact and Time to Complete Instrument

The total respondent burden on the public for the three related research instruments during this three-year approval period is estimated to be 3,665 hours per year; an arithmetic check of these figures follows the lists below.


Professional Users (1,398 hours per year)

  • 1 minute for initial contact (5,600 x 1 minute = 93 hours)

  • 10 minutes for questionnaire (2,240 x 10 minutes = 373 hours)

  • Subtotal = 466 hours per survey administration

  • Survey administered three times per year = 1,398 hours per year

Public Users (2,013 hours per year)

  • 1 minute for initial contact (115,000 x 1 minute = 1,917 hours)

  • 5 minutes for questionnaire (1,150 x 5 minutes = 96 hours)

  • Subtotal = 2,013 hours per year

Educators (254 hours per year)

  • 1 minute for initial contact (104 x 1 minute = 1.7 hours)

  • 90 minutes participation (83 x 90 minutes = 125 hours)

  • Subtotal = 127 hours per survey administration

  • Survey administered twice per year = 254 hours per year
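
As a cross-check on these figures, the annual total can be recomputed from the raw contact counts and per-response minutes. A minimal Python sketch follows; the document rounds each subtotal to whole hours, so intermediate figures may differ slightly, but the annual total matches.

    # Recompute the item 13 burden estimate in minutes, then convert to hours.
    professional = (5600 * 1 + 2240 * 10) * 3   # three administrations per year
    public = 115000 * 1 + 1150 * 5              # continuous over the year
    educators = (104 * 1 + 83 * 90) * 2         # two administrations per year

    total_hours = (professional + public + educators) / 60
    print(round(total_hours))                   # 3665, matching the stated annual total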


14. Total Burden Hours

Contacts: 2,200 hours per year
Respondents: 1,465 hours per year
Total: 3,665 hours per year

15. Reporting Plan: A written report will be produced quarterly that summarizes the research findings for each of the three survey vehicles. The report will include descriptive frequencies and response rates. The report will also include detailed findings, an executive summary, and content analysis of the open-ended comments and recommendations for improvement priorities.



16. Justification, Purpose, and Use:

Survey Justification and Purpose

In January 2009, the Obama Administration released a memorandum to all Executive Departments and Agencies calling for the creation of a more transparent, participatory, and collaborative Government. The memorandum notes that Executive departments and agencies should solicit public feedback to identify information of greatest use to the public, determine how to best increase and improve opportunities for public participation in Government, and assess and improve the level of collaboration and cooperation between Government and nonprofit organizations, businesses, and individuals in the private sector. The Chief Technology Officer, General Services Administration, and OMB were charged with coordinating efforts to produce an Open Government Directive that would address specific actions for implementing the principles of a transparent and open Government. Use of American Customer Satisfaction Index (ACSI) surveys to provide reliable and statistically sound information directly supports improved performance, enhanced citizen involvement, openness, and accountability.


USGS has not previously obtained user feedback on the usefulness and value of National Atlas products and services. The professional user survey, public user survey, and educator research will be the first studies of their kind administered by the National Atlas. This research effort provides important information about National Atlas user profiles and needs, with the objective of identifying how to prioritize strategies for improving National Atlas products and services. The National Atlas will be unable to develop the most valuable set of products and services without user feedback from these key customer segments.


Survey Goals

The goals of the three related National Atlas surveys are:


  1. Identify how customers currently use National Atlas products and services

  2. Identify ways for improving and communicating National Atlas products and services


Utility to Managers

Results of this analysis will be useful to National Atlas product managers in making decisions about future products and services and content delivery.

How will the results of the survey be analyzed and used?

The data collected from the two web-administered surveys will be coded directly into a computerized database and analyzed using statistical analysis software. Data analysis will include several phases. The first will consist of frequency distributions of responses to each question. Additional techniques include cross-tabulation and analysis of variance to compare responses across key customer segments (such as respondent job function), analysis of verbatim comments from the open-ended questions, and the use of individual verbatim comments to illustrate key findings.


Data from the qualitative educator research will be summarized in text form with some descriptive statistics and verbatim comments.



How will the data be tabulated? What Statistical Techniques will be used to generalize the results to the entire customer population? How will limitations on use of data be handled? If the survey results in a lower than anticipated response rate, how will you address this when reporting the results? (Use as much space as needed; if necessary include additional explanation on separate page).


Data Tabulation Techniques: The data will be tabulated internally at National Atlas using statistical software (SPSS or SAS). Statistical techniques to be employed will include frequency distributions, mean score calculations, cross-tabulation, analysis of open-ended comments, and significance testing (ANOVA, t-test, chi-square test, etc.). An illustrative sketch of these tabulation steps follows.
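
The document specifies SPSS or SAS; purely for illustration, the same steps in Python (pandas for frequencies and cross-tabulation, scipy for a chi-square test) might look like the sketch below, with hypothetical column names and placeholder data.

    import pandas as pd
    from scipy import stats

    # Hypothetical survey extract: one row per respondent.
    df = pd.DataFrame({
        "job_function": ["GIS analyst", "educator", "GIS analyst", "planner",
                         "educator", "planner", "GIS analyst", "educator"],
        "satisfaction": ["high", "medium", "high", "low",
                         "medium", "medium", "high", "low"],
    })

    # Frequency distribution for a single question.
    print(df["satisfaction"].value_counts())

    # Cross-tabulation of satisfaction by customer segment.
    table = pd.crosstab(df["job_function"], df["satisfaction"])
    print(table)

    # Chi-square test of independence across segments.
    chi2, p_value, dof, expected = stats.chi2_contingency(table)
    print(chi2, p_value)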


Limitations: The professional survey will be limited to those attending professional conferences. The public survey will be limited to those who agree to participate in the survey. The educator research, being qualitative, will not produce statistically reliable findings due to small sample sizes and the qualitative nature of the discussions. Additional research on the National Atlas customer base will be recommended where greater statistical rigor is needed.


Non Response Bias Check:


Checking for non-response bias is most relevant for the professional and public user surveys; it is less relevant for the educator research, which is qualitative in nature, uses very small sample sizes, and is expected to have a very high response rate (and therefore low non-response bias). For the professional survey, we intend to assess non-response bias by collecting names of initial non-responders (individuals who are offered the survey but do not complete it at the conference), conducting follow-up research encouraging these individuals to complete the survey, and then comparing responses between the two samples, as sketched below.
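
The following is a minimal sketch of that comparison, assuming a numeric satisfaction score as the measure of interest; the data are placeholders, and scipy's two-sample t-test stands in for whatever significance test is ultimately chosen.

    from scipy import stats

    # Placeholder mean-score data: on-site respondents vs. follow-up
    # respondents (initial non-responders contacted after the conference).
    onsite_scores = [8, 7, 9, 6, 8, 7]
    followup_scores = [7, 6, 8, 7, 5, 6]

    # Welch's two-sample t-test; a non-significant difference suggests
    # limited non-response bias on this measure.
    t_stat, p_value = stats.ttest_ind(onsite_scores, followup_scores, equal_var=False)
    print(t_stat, p_value)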



Is this survey intended to measure a Government Performance and Results Act (GPRA) performance measure? If so, please include an excerpt from the appropriate document. (Use as much space as needed; if necessary include additional explanation on separate page).


All USGS evaluation activities, including this study, are included under GPRA Section 4. GPRA explicitly reinforces the use of program evaluations to obtain objective measures of program results. This assessment serves to evaluate the performance of this program in order to provide quantifiable and measurable explanations for observed performance outcomes. Performance data provide useful and valuable information to program managers to improve program administration. Evaluations, frequently taking a broader or more in-depth approach to program structure and results, yield information that can lead to program improvement strategies and can support administrative changes on methodologically sound grounds.



Checklist for Submitting a Request to Use DOI Programmatic Clearance for Customer Satisfaction Surveys


X All questions in the survey instrument are within the scope of one of the DOI Programmatic Clearance for Customer Satisfaction Surveys topic areas.



X The approval package is being submitted to the Office of Policy Analysis at least 45 days prior to the first day the PI wishes to administer the survey to the public.



X A qualified statistician has reviewed and approved your request.



X Your bureau/office Information Collection Clearance Officer has reviewed and approved the approval package.



The approval package includes:


X A completed Justification

X A signed Certification Form

X A copy of the survey instrument

X Other supporting materials, such as:

    • Cover letters to accompany mail-back questionnaires

    • Introductory scripts for initial contact of respondents

    • Necessary Paperwork Reduction Act compliance language

    • Follow-up letters/reminders sent to respondents



The survey methodology presented in the Justification includes a specific description of:

X The respondent universe

X The sampling plan and all sampling procedures, including how respondents will be selected

X How the instrument will be administered

X Expected response rate and confidence levels

X Strategies for dealing with potential non-response bias

X A description of any pre-testing and peer review of the methods and/or the instrument is highly recommended.




X The burden hours reported in the Justification include the number of burden hours associated with the initial contact of all individuals in the sample (i.e., including refusals), if applicable, and the number of burden hours associated with individuals expected to complete the survey instrument.


X The package is properly formatted (Word) and submitted to the Office of Policy Analysis electronically. 





CERTIFICATION FORM FOR SUBMISSION UNDER OMB CONTROL NUMBER 1040-0001


This form should only be used if you are submitting a collection of information for approval under the DOI Programmatic Clearance for Customer Satisfaction Surveys.

If the collection does not satisfy the requirements of the Programmatic Clearance, you should follow the regular PRA clearance procedures described in 5 CFR 1320.

1. Bureau/Office Subgroup or Program:
National Atlas of the United States
U.S. Geological Survey

2. Title (Please be specific):
USGS National Atlas User Survey

3. Estimated Number of Contacts/Respondents:

Professional Users (three times a year):
Contacts: 5,600
Respondents: 2,240

Public Users:
Contacts: 115,000
Respondents: 1,150

Educators (twice a year):
Contacts: 104
Respondents: 83


Time per Response:

Professional Users:
Contacts: 1 minute
Respondents: 10 minutes

Public Users:
Contacts: 1 minute
Respondents: 5 minutes

Educators:
Contacts: 1 minute
Respondents: 90 minutes

Total Burden Hours:

Contacts: 2,200 hours
Respondents: 1,465 hours
Total: 3,665 hours

4. Bureau/Office Contact (who can best answer questions about content of the submission):
Name: Jay Donnelly
Phone: 703.648.5395

5. Certification: The collection of information requested by this submission meets the requirements of OMB Control Number 1040-0001.

Bureau/Office Qualified Statistician: Stephen R. Gillespie, Economist, Reston, VA
DATE: 03/22/2012

Bureau/Office Information Collection Clearance Officer: Shari Baloch
DATE: 04/16/2012

Office of Policy Analysis: Donald J. Bieniewicz
DATE: 07/06/2012

OMB, Office of Information and Regulatory Affairs (OIRA):
DATE:







