
Supporting Statement:

U.S. Department of Energy

Clean Cities Plug-In Vehicle Community Readiness Scorecard

OMB Control Number 1910-XXXX


This supporting statement provides additional information regarding the U.S. Department of Energy’s (DOE) request for processing of the proposed information collection, Clean Cities Plug-In Vehicle Community Readiness Scorecard. The numbered questions correspond to the order shown in the Office of Management and Budget (OMB) “Instructions for Completing OMB Form 83-I.”


  A. Justification


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the information collection.


The Energy Policy Act of 1992 (EPAct) authorized the U.S. Department of Energy’s (DOE) creation of the Clean Cities program, through which DOE advances the nation’s economic, environmental, and energy security by supporting local actions to reduce petroleum consumption in transportation (42 U.S.C. § 13255; EPAct Section 505). The Clean Cities program operates within DOE’s Office of Energy Efficiency and Renewable Energy. As a national network of nearly 100 Clean Cities coalitions, the program brings together stakeholders in the public and private sectors to deploy alternative and renewable fuels, idle-reduction measures, fuel economy improvements, and emerging technologies. Under 42 U.S.C. § 13233 (EPAct Section 407), DOE is authorized to establish a data collection program for the purpose of collecting data useful to persons seeking to manufacture, convert, sell, own, or operate alternative fueled vehicles or alternative fueling facilities. The Clean Cities program uses this kind of information to facilitate the deployment of alternative fueled vehicles and fueling infrastructure. The program has been in place for over 17 years and has displaced over 3.7 billion gallons of petroleum fuel in the U.S. transportation sector.


The current administration has established a goal of having one million plug-in electric vehicles (PEVs) on the road by 2015. To achieve this goal, PEVs must provide a practical substitute for conventional vehicles, which in turn requires that communities provide access to infrastructure, the proper regulatory environment, and necessary support services. The Clean Cities program disseminates information communities need to enable greater adoption of electric vehicles.


This information collection request (ICR) is critical to the Clean Cities program’s capacity to facilitate market transformation in the electric vehicle sector. DOE has developed a “scorecard” tool to facilitate the flow of PEV readiness information to communities. Employing a password-protected, web-based scorecard of multiple-choice questions accessed through DOE’s existing Alternative Fuels and Advanced Vehicles Data Center (AFDC), the tool will gather the information communities submit and:

    • Assess progress towards a community’s readiness to host PEVs conveniently and efficiently; and

    • Facilitate further progress for participating communities, PEV owners, and electric vehicle supply equipment (EVSE) users/managers.


On a voluntary basis, respondents, who DOE expects will be city/county/regional sustainability or energy coordinators, will supply information through a user-friendly online interface, answering only the questions they choose to address. The online tool will translate the readiness measures across several weighted categories into numeric data to generate a “readiness score” depicted on a colored spectrum. The tool will allow users to track progress over time. Communities will see their own rating and may be compared to other cities for ranking purposes only.


  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


DOE’s Clean Cities initiative has developed a voluntary scorecard to assist its coalitions and stakeholders in assessing their communities’ readiness for plug-in electric vehicles (PEVs). The principal objective of the scorecard is to provide respondents with an objective assessment of their community’s readiness for PEV deployment, as well as an understanding of the community’s commitment to deploying these vehicles successfully. DOE intends the scorecard to be completed by a city/county/regional sustainability or energy coordinator. Because the intended respondent may not be aware of every aspect of local or regional PEV readiness, coordination among local stakeholders to gather the appropriate information may be necessary.


The scorecard assessment effort will rely on responses to the questions the respondent chooses to answer. The multiple-choice questions address the following topic areas: (1) the EVSE permitting and inspection process; (2) PEV and EVSE availability and numbers; (3) laws, incentives, and financing; (4) education and outreach; (5) utility interaction; and (6) vehicle and infrastructure planning. Respondents will provide answers through a user-friendly online interface. A simple algorithm will then apply quantitative criteria to the answers, translating the readiness measures across several weighted categories into numeric data. Using a numberless color spectrum, a community will be rated against itself, with the colored-spectrum results made available only to the respondent community. The total rankings will be normalized into a “score,” and communities will see their own rating and may be compared to other cities for ranking purposes only.
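
To illustrate the kind of translation described above, the following is a minimal sketch of a weighted, normalized scoring scheme mapped onto a color band. The category weights, point values, and color thresholds shown are hypothetical assumptions for illustration only; they are not DOE’s actual algorithm, which this statement does not detail.

    # Illustrative sketch only: the category weights, point values, and
    # color thresholds below are hypothetical, not DOE's actual algorithm.

    CATEGORY_WEIGHTS = {
        "permitting_and_inspection": 0.20,
        "pev_evse_availability": 0.20,
        "laws_incentives_financing": 0.15,
        "education_and_outreach": 0.15,
        "utility_interaction": 0.15,
        "vehicle_infrastructure_planning": 0.15,
    }

    def readiness_score(earned_points, max_points):
        """Normalize each category's points to 0-1, apply its weight, and sum to a 0-100 score."""
        total = 0.0
        for category, weight in CATEGORY_WEIGHTS.items():
            total += weight * (earned_points.get(category, 0) / max_points[category])
        return round(100 * total, 1)

    def color_band(score):
        """Map the numeric score onto a simple, numberless color spectrum."""
        if score >= 80:
            return "green"
        if score >= 50:
            return "yellow"
        return "red"

    # Hypothetical usage: a community earning 8 of 10 possible points in every category.
    earned = {category: 8 for category in CATEGORY_WEIGHTS}
    possible = {category: 10 for category in CATEGORY_WEIGHTS}
    score = readiness_score(earned, possible)
    print(score, color_band(score))  # 80.0 green

Because each category is normalized before weighting, a score from one year can be compared with a community’s own earlier scores, consistent with the tracking of progress over time described above.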





DOE will review the collected information to ensure the accuracy of what is reported. If this information is not collected, DOE will have no way of determining a community’s readiness for PEV adoption. That would leave the community and its many stakeholders (vehicle manufacturers, purchasers, businesses, and fleets, among others) without certainty as to the status of PEV readiness or whether the community has sufficient electric vehicle supply equipment infrastructure to be considered ready for the deployment of PEVs. Moreover, the information will be the foundation for future program development.


  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.


The scorecard will use a single, online information collection system; no other data collection system will be employed to support the scorecard. The online scorecard system DOE has developed provides several advantages. First, it avoids the need to download any forms or materials, though respondents may print out the full list of questions and answers, or any portion of it, if they wish. Second, avoiding downloads also limits potential security threats. Third, the system allows respondents to compare historical records, providing the opportunity to revisit the scorecard as often as they like to track progress. Finally, an online system eliminates version control concerns: a single update ensures that all scorecard users are working with the current version.


One hundred percent of the information would be reported electronically to a relational database (Oracle), which is accessible via the Internet and modem. The Oracle database is password protected. Reporting via the Internet and modem reduces the respondents’ burden by allowing similar data to be entered with minimal changes on a computer form instead of duplicating the majority of the data on paper forms. The database is an internal system that does not have a specific name.
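
As a rough illustration of the electronic reporting flow described above, the following minimal sketch stores and retrieves scorecard answers in a relational table. SQLite is used here purely as a stand-in for the password-protected Oracle database, and the table and column names are hypothetical.

    import sqlite3  # stand-in only for the password-protected Oracle database described above

    # Hypothetical table layout: one row per answered scorecard question.
    conn = sqlite3.connect("scorecard_demo.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS scorecard_responses (
            community     TEXT NOT NULL,
            submitted_on  TEXT NOT NULL,
            question_id   TEXT NOT NULL,
            answer_choice TEXT NOT NULL
        )
    """)

    # One electronically submitted answer (illustrative values only).
    conn.execute(
        "INSERT INTO scorecard_responses VALUES (?, ?, ?, ?)",
        ("Example City", "2013-01-17", "permitting_01", "B"),
    )
    conn.commit()

    # Retrieving prior submittals supports the historical comparison described above.
    rows = conn.execute(
        "SELECT submitted_on, question_id, answer_choice "
        "FROM scorecard_responses WHERE community = ?",
        ("Example City",),
    ).fetchall()
    print(rows)
    conn.close()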



  4. Describe efforts to identify duplication.


There is no duplication; this collection of information is specific to DOE. In making this determination, DOE reviewed the extensive array of information available through the Clean Cities program. No existing information is available with any degree of specificity or completeness for any of the potential voluntary respondents. Further, having DOE research and assemble any available information itself would negate the opportunity the scorecard information collection affords respondent communities to receive feedback on their submittals. Above all, the scorecard provides the unique ability for users to track their own progress over time. Moreover, DOE does not have the resources available to research and aggregate such information for each of the approximately 100 potential respondent communities.


  5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


Though small businesses may serve as the respondent for a given community, no small businesses are expected to be required to provide information as a result of this collection. Any small business that decides to participate in this voluntary information collection can limit its burden simply by restricting the amount of effort it undertakes. Nonetheless, DOE’s deployment of the online, electronic scorecard has greatly reduced the burden relative to a collection developed and submitted non-electronically. Moreover, DOE is not requiring respondents to develop or provide supporting documentation for the answers given in the online scorecard.


  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


The frequency of collection will depend on the respondents’ desire to submit the information, as submission is voluntary. Respondents may choose to re-submit information annually if they have additional information to provide that may affect or improve their scorecard ranking. The initial submittal carries the greatest burden; any updates will impose significantly less burden, as it will be relatively simple to re-submit the same information plus any updates on an annual basis, or less frequently, as determined by the respondent. Reporting the information less frequently than annually, particularly if there is not an abundance of new information to report, will not have a consequence for the Clean Cities program’s efforts to facilitate the deployment of plug-in electric vehicles. Nonetheless, the ICR is critical to the program because, in the absence of the requested information collection, DOE would have no means to carry out its statutorily mandated responsibility to collect and provide information useful to persons seeking to manufacture, convert, sell, own, or operate alternative fueled vehicles or alternative fueling facilities.

Respondents participating in this program will submit only their responses to the questions.


  7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with OMB guidelines: (a) requiring respondents to report information to the agency more often than quarterly; (b) requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it; (c) requiring respondents to submit more than an original and two copies of any document; (d) requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years; (e) in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study; (f) requiring the use of a statistical data classification that has not been reviewed and approved by OMB; (g) that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; (h) requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


The information collection is consistent with OMB guidelines.


  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken in response to the comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside DOE to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


The Department published a 60-day Federal Register notice and request for comments concerning this collection on March 30, 2012 (Volume 77, Number 62, page 19275). The notice described the collection and invited interested parties to submit comments or recommendations regarding the collection. No comments were received.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


No remuneration is given for submission of any of the information, other than that the expense of responding is treated as an allowable cost.


  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


DOE will not share community-specific information with outside sources other than in response to a Freedom of Information Act request, and even then DOE would take precautions to respect the potential confidentiality of important data. DOE’s attention to potential confidentiality concerns usually arises not with regard to the information submitted, but rather with regard to any potential compliance issues that may be related to requirements associated with alternative fuel vehicle deployment.


  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why DOE considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


There are no collections in this package that involve questions of a sensitive, personal, or private nature.


  12. Provide estimates of the hour burden of the collection of information. The statement should indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, DOE should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable.


The estimate of hour burden of the information collection is as follows:


Total number of unduplicated respondents: 100

Reports filed per person: 1.00

Total annual responses: 100

Total annual burden hours: 2,050


The values for number of unduplicated respondents and total annual responses, set forth above, are based on the following:


        1. The total number of unduplicated respondents (100) represents the number of online scorecards expected to be submitted to DOE under this voluntary program. This number of completed online scorecards represents the expected number of participating communities, which is based on the number of existing Clean Cities coalitions plus a few more to account for other communities that might potentially participate.

        2. The reports filed per person (1.00) represents the one online scorecard to be submitted voluntarily on an annual basis. DOE expects each person to complete only one online scorecard, for only one community.

        3. The number of total annual responses (100) represents the sum of the online scorecards that might be submitted on an annual basis.

        4. The total annual burden hours (2,050) represents the product of the number of total responses and the number of hours for each reporting component: (100 online scorecards × 20.00 hours) + (100 total responses × 0.5 hours) = 2,050 hours (the arithmetic is reproduced in the sketch following this list). The research needed to compile the information required to answer the online scorecard is estimated to take no more than 20.00 hours, and completing the online scorecard itself is estimated to take no more than 0.5 hours. DOE’s estimate of the potential burden is based on its expertise in the subject matter, its knowledge of the availability of the subject information, and the understanding that the individuals completing the submittals will themselves be well versed in the subject matter. Moreover, DOE has designed the scorecard questions specifically to limit the potential burden.

        5. Also noteworthy: once the initial submittal has been completed, any subsequent voluntary updating of the information, which might be undertaken on an annual basis, is estimated to take no more than 10 hours.
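
As a quick check of the arithmetic in item 4 above, the following minimal sketch reproduces the burden-hour calculation using only the figures stated in this section.

    # Burden-hour arithmetic, using only the figures stated in item 4 above.
    RESPONSES = 100          # expected annual online scorecards
    RESEARCH_HOURS = 20.0    # research and information compilation per response
    COMPLETION_HOURS = 0.5   # completing the online scorecard per response

    total_burden_hours = RESPONSES * (RESEARCH_HOURS + COMPLETION_HOURS)
    per_respondent_hours = RESEARCH_HOURS + COMPLETION_HOURS

    print(total_burden_hours)    # 2050.0 total annual burden hours
    print(per_respondent_hours)  # 20.5 hours per collection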


Average Maximum Burden:

Per Collection: 20.5 hours

Per Applicant: Each respondent would spend no more than 20 hours researching and collecting the information needed to complete the online scorecard, which itself would take 0.5 hours. Once the initial submittal has been completed, any subsequent voluntary updating of the information, which might be undertaken on an annual basis, is estimated to take no more than 10 hours.




DOE expects approximately 100 entities to participate in the program. This equates to 100 respondents and 100 responses, one for each entity/respondent. DOE expects that, on a voluntary basis, each respondent will respond once per year. DOE does not believe the number of respondents will increase, because respondents will represent geographic regions or areas tied to Clean Cities coordinator regions, of which there are approximately 100. DOE does not expect to shut down this information collection at the end of three years; rather, it hopes to continue the program, whose relevance is expected to persist as communities continue to grow their PEV readiness.


  13. Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection of information.


Beyond the costs associated with undertaking the work, there are no additional costs to respondents other than the burden hours for reporting and recordkeeping. The cost of the work is approximated at $47.73 per hour of effort (http://www.bls.gov/oes/current/oes_nat.htm#11-0000), for a total of approximately $978 in labor per respondent to research, collect, and respond to the voluntary collection (20.5 hours × $47.73/hour). All information collection can be undertaken using a computer, telephone, and Internet connection, each of which every respondent is believed to already have (all Clean Cities coordinators operate using this basic technology). No special software or other capital investment is required to undertake this work.
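
The cost figure above follows directly from the burden estimates; the short sketch below reproduces it. The extension to all 100 expected annual responses is illustrative only and is not a figure stated elsewhere in this statement.

    # Respondent cost arithmetic based on the figures stated above.
    HOURLY_RATE = 47.73        # dollars per hour, from the cited BLS wage data
    HOURS_PER_RESPONSE = 20.5  # 20.0 hours of research plus 0.5 hours of online entry

    cost_per_respondent = HOURLY_RATE * HOURS_PER_RESPONSE   # approximately $978
    annual_cost_all_respondents = 100 * cost_per_respondent  # approximately $97,800 (illustrative)

    print(cost_per_respondent, annual_cost_all_respondents)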


  14. Provide estimates of annualized cost to the Federal government.


The cost of developing the scorecard and its associated database was approximately $60,000, involving roughly 500 hours. The annual cost of DOE’s ongoing effort to operate the collection is approximately $24,000, involving roughly 100 hours.


  15. Explain the reasons for any program changes or adjustments reported in Items 13 (or 14) of OMB Form 83-I.


This is a new collection; therefore, there are no program changes or adjustments associated with this ICR.


  16. For collections whose results will be published, outline the plans for tabulation and publication.


The Office of Vehicle Technologies intends to publish or otherwise make available online aggregate statistics gathered from the responses without relating specific answers to specific communities. Key highlights or ideal scores may be detailed, serving as case studies, with explicit permission from communities.

Aggregate results will be published at most annually with the aim of conveying trends, gaps, and the effectiveness of DOE programs regarding PEV readiness in communities. Reports may be presented on the AFDC website as well as at industry conferences.



  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.


DOE is not seeking approval to forgo display of the expiration date for OMB approval; display of the expiration date is not inappropriate.


  18. Explain each exception to the certification statement identified in Item 19 of OMB Form 83-I.


There are no exceptions to the certification statement on OMB Form 83-I.



