B. Collections of Information Employing Statistical Methods
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The respondent universe is all existing VIPR vendor users. All users in this universe will be invited to respond; we do not anticipate using sampling. All responses received will be used in the data analysis. The Forest Service anticipates an 85% response rate. We will use e-mail reminders, notices posted to websites, and user forums to promote participation in this survey.
Responses received as a result of this survey will be summarized on bar charts and/or graphs (see A.16 above for an example of a bar chart) that would show the number of responses received for each question. Because of their nature, expanded comments will not be disseminated outside of the VIPR core team. Any expanded comments received will be gathered into a summary document and then reviewed and analyzed by VIPR management to determine if the nature of the comments requires the allocation of VIPR resources to resolve them.
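For illustration, a minimal sketch of how the per-question response counts might be tallied and charted is shown below. This is an assumption-laden example rather than the actual analysis procedure: the file name vipr_survey_export.csv and the column layout of the Survey Monkey export are placeholders.

```python
# Illustrative sketch only: tally responses per question from a hypothetical
# Survey Monkey CSV export and render a simple bar chart of response counts.
# The file name and column layout below are assumptions, not the real export.
import pandas as pd
import matplotlib.pyplot as plt

responses = pd.read_csv("vipr_survey_export.csv")   # one row per respondent, one column per question

# Count the non-blank answers received for each question column.
counts = responses.notna().sum()

counts.plot(kind="bar")
plt.ylabel("Number of responses")
plt.title("Responses received per survey question")
plt.tight_layout()
plt.savefig("vipr_survey_response_counts.png")
```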
This data analysis will identify the issues most important to our vendor community and will drive future VIPR enhancement decisions. For example, if the survey results indicate that the user guides are not useful, resources would be allocated to revise the user guides to make them more useful. If survey results indicate that the Helpdesk has not been able to resolve issues, then VIPR resources would be charged with providing additional training and oversight of the Helpdesk. In addition, if the survey results indicate that there may be performance issues with the application itself, then developer time will be allocated to resolve those issues. All decisions regarding the level of attention to be paid to each issue depend on the results.
COMMUNICATION PLAN
The communications planned for this survey are as follows:
Pre-Deployment Communication – Invitation to Participate in VIPR Survey. A sample survey introduction e-mail is as follows:
In 2009, the Forest Service deployed the VIPR system to support the new business process for managing pre-season incident agreements. The use of VIPR is critical to the agency's ability to suppress wildfires efficiently and effectively while managing incident support costs. We request that you participate in this survey to identify ways in which we can improve the VIPR system and the support the Forest Service provides to the vendor community that uses the system. Your feedback will help us better meet agency needs in serving the Forest Service mission.

During the week of (date TBD), you will receive an e-mail requesting that you complete a 20-minute anonymous Survey Monkey survey, along with a hyperlink to the survey website. The Forest Service is asking the vendors who use VIPR to complete the VIPR Survey by midnight on (date TBD). The Forest Service takes all feedback seriously, and the results of this survey will aid in our overall support of our vendor community and the VIPR program. We value your input.

Thank you in advance for taking the time to participate in this survey.
Deployment Communication – VIPR Survey Open / Survey Link. Sample Survey link e-mail is as follows:
The Forest Service is seeking your input to identify ways in which we can build on the success of the VIPR Program. Your feedback will help us identify ways in which we can improve the VIPR system and the support the Forest Service provides to the vendors who use the system.

The Forest Service takes all feedback seriously. Your input will aid the Forest Service in our overall continuous improvement process. This survey is anonymous and takes less than 20 minutes to complete. You have until midnight (date TBD) to complete the survey.

If you are having technical difficulties, please contact the VIPR team by sending an e-mail to [email protected].

The survey is by invitation only, so please do not forward this e-mail to anyone else. Thank you in advance for taking the time to participate in this process.
Operational Communication – Survey Instructions. Sample instructions are as follows:
Header: Thank you for taking the VIPR Survey.
Body: As you move through the questions, please note the following:
This survey is anonymous
Asterisks mark required questions
The progress bar at the top of each page shows the percentage of the survey you will have completed once you answer the questions on the current page.
While not required, the Forest Service asks that you take the extra time to provide the reasons behind your responses in the comment boxes at the end of each section.
We look forward to receiving your feedback!
Operational Communication – Reminder E-mails. The reminder e-mail will come from Survey Monkey and will be sent only to participants who have not yet responded to the survey. Respondents will have at least 4 weeks to respond to the survey. A reminder message will be sent at the 2-week mark and at the 3-week mark. If we need to boost participation in the survey, we will send a third reminder 2 days before the survey closes. A sample reminder e-mail is as follows:
We have not yet heard from you! The Forest Service takes customer feedback very seriously and is using the results of this survey to improve our support to you. Now is the time to share your thoughts.
The deadline for completing the survey is midnight on (date TBD). The survey takes less than 20 minutes to complete. Thank you for taking the time to help us improve the VIPR system.
As shown in B.1 above, the responses received will be used for management decision-making. However, should the overall response rate remain below 50% at the 3-week mark, we will attempt to contact a random sample of non-responding vendors by telephone to determine the reason for non-response and to encourage participation.
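For illustration, the sketch below shows one way a random telephone follow-up sample could be drawn from the list of non-respondents. It is a minimal example under stated assumptions, not the actual procedure; the vendor list, response tracking, and follow-up sample size shown are placeholders.

```python
# Illustrative sketch only: draw a random sample of non-responding vendors for
# telephone follow-up when the response rate is below 50% at the 3-week mark.
# The vendor list, response tracking, and sample size below are placeholders.
import random

invited = ["Vendor A", "Vendor B", "Vendor C"]    # all invited VIPR vendor users (placeholder)
responded = {"Vendor B"}                          # vendors who have completed the survey (placeholder)

response_rate = len(responded) / len(invited)
if response_rate < 0.50:
    non_respondents = [v for v in invited if v not in responded]
    sample_size = min(10, len(non_respondents))   # assumed number of follow-up calls
    to_call = random.sample(non_respondents, sample_size)
    print("Vendors to contact by telephone:", to_call)
```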
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
There is no down-selection or sampling for this information collection. All existing VIPR vendor users will be invited to participate.
Estimation procedure,
We use metrics from the existing VIPR vendor user base to estimate the number of participants for this information collection.
Degree of accuracy needed for the purpose described in the justification,
The closer we are to achieving an 85% response rate, the more information we will have to ensure that future system enhancements and program support improvements are beneficial to the users.
Unusual problems requiring specialized sampling procedures, and
No unusual problems are anticipated that would require specialized sampling procedures.
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
We will be using an annual collection cycle and anticipate this will be the best option to meet the objectives of the information collection and reduce public burden.
Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
There is no down-selection or sampling for this information collection. The respondent universe is all existing VIPR vendor users, and all users in this universe will be invited to respond. All responses received will be used in the data analysis. We will use e-mail reminders, notices posted to websites, and user forums to promote participation in this survey.
Participants will not be required to submit a response to this information collection. The Forest Service will use the data collected for:
System and program management and support evaluation,
To obtain feedback on system and program support successes,
To obtain feedback on potential system and program support improvements
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The FS conducted a test survey to minimize burden on the public, improve the utility of the information collection, and ensure the survey questions are clear and concise. Nine VIPR vendors were randomly selected to participate in the test survey using the Survey Monkey tool; they were given two weeks in which to respond. The following nine vendors were asked to take a survey similar to the one that will be approved under this information collection.
Vendor Name | E-mail address
Timmermann Wildland Fire Services, LLC |
Northwest Timber Fallers |
Tracy Porter |
Steve Paillon Water Truck Service |
Drew Hendrickson |
Northern Columbia Reforestation, LLC |
Contract Water Wagons |
Curfman Showers & Potable Water LLC |
Joe Vicini, Inc. |
We received 4 responses from the nine vendors, a 44.44% response rate; it was discovered after the test survey period that one vendor's e-mail address was incorrect, so the response rate among the eight vendors actually reached was 50%. We believe the timing of the test contributed to the low response rate: the test was conducted at a time when vendors are gearing up for the coming fire season and have little time to do anything else. In addition, the test survey involved less than 0.5% of the total audience who will be invited to participate in the formal survey; given that our vendor pool extends nationwide, it was extremely difficult to randomly choose nine vendors for the test survey with whom we had some confidence of a response. The average time to complete the survey was 16.75 minutes. There were no comments regarding the clarity of the questions. Two comments in total were received regarding the length and content of the survey; both related to the VIPR system itself and not the actual survey.
In response to the question ‘Do you have any comments about the length of this survey?’, the only comment received was: “Think you are barking up the wrong tree. The problems with Vipr have not been with the support system other than the tutorial or self help system sucks. You have to know something to really get any use from these helps. The problems have been getting connected and staying in the system.”
In response to the question ‘Do you have any comments about the questions or content of this survey?’, the only comment received was: “As stated earlier, I have used VIPR only twice. I have forgotten some of the steps in the system. The guides have answered any questions/problems I have had.”
VIPR management collaborated for a few weeks to develop and review the questions that were ultimately used in the test survey. The questions were designed to be reusable in an annual survey while still providing the vendor feedback needed for future VIPR development plans. Because of this, and because the test produced no relevant comments regarding the content or length of the survey itself, we determined there was no need to change the survey strategy or the final survey. If the formal survey yields disappointingly few comments or responses, VIPR management may then review the survey strategy to determine whether there is a better way to conduct these annual surveys.
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Name | Agency/Unit | Phone Number
Ron Wester | US Forest Service | 703-605-4665
Larry Bowser | US Forest Service | 970-295-5808
Cheryl Emch | US Forest Service | 541-902-3157
Ivory Carr | US Forest Service | 208-765-7272
Terry Kiele | US Forest Service | 970-295-5820
Juli Dixon | Booz Allen Hamilton | 703-377-0329
David Hancock | USDA - NASS | 202-690-2388