
Supporting Statement B for Paperwork Reduction Act Submissions


Title: Nationwide Cyber Security Review (NCSR)


OMB Control Number: 1670-NEW


B. Collections of Information Employing Statistical Methods.



1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


Eligibility for the Nationwide Cyber Security Review (NCSR) included all senior Information Technology (IT) personnel from State governments, State agencies, and local governments, including Chief Information Officers (CIOs), Chief Information Security Officers (CISOs), and Chief Technology Officers (CTOs). This places the universe of potential respondents at over 2,000. From that nationwide audience, 206 individuals (roughly 10 percent) registered for the NCSR Secure Portal (housed on the US-CERT Secure Portal), and 162 of those registrants completed the Review, for a completion rate of approximately 79 percent. The table below shows the number of respondents of each type.


Respondent Type        Number of Respondents
States                 44
State Agencies         58
Local Governments      60
Total                  162
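For reference, the registration and completion rates cited above follow directly from the figures reported; the universe size of "over 2,000" is approximate, so the 10 percent figure is rounded:

\[
\text{registration rate} \approx \frac{206}{2{,}000} \approx 10\%, \qquad
\text{completion rate} = \frac{162}{206} \approx 78.6\% \approx 79\%
\]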


The NCSR had not been conducted previously; thus, there is no prior response rate.


2. Describe the procedures for the collection of information, including:


  • Statistical methodology for stratification and sample selection,


Not applicable. The NCSR was open to 100% of the eligible audience; therefore, no statistical methodology was used for stratification or sample selection.


  • Estimation procedure,


Not applicable. The NCSR was open to 100% of the eligible audience; therefore, no estimation procedures were used.


  • Degree of accuracy needed for the purpose described in the justification,


Not applicable.


  • Unusual problems requiring specialized sampling procedures, and

Not applicable.


  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


No future iterations of the NCSR are planned at this time; therefore, periodic data collection cycles are not applicable.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


To maximize response rates, NCSD sent electronic reminders to registrants (those who had expressed interest in or signed up for the NCSR portal) who had not yet completed the Review. In addition, the final date of the NCSR was extended from November 15, 2011, to November 23, 2011, to give users more time to submit responses. Because participation in the NCSR was voluntary, respondents were not required to complete the Review, and non-respondents remaining after the extended deadline were not contacted further.


Since the NCSR was a self-assessment (i.e., respondents were reporting on their own cybersecurity programs), there are potential concerns about the reliability and accuracy of the data. However, NCSD determined that the accuracy and reliability of the data were sufficient for its purposes. Because the NCSR was designed to gain a broad understanding of the cybersecurity posture of States, State agencies, and local governments, intensive validation of the data was not deemed necessary; self-reported data were considered sufficient to achieve this goal. In addition, because participation in the NCSR was voluntary, NCSD deemed it unlikely that respondents would provide false or grossly inaccurate data.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



NCSD did not conduct formalized testing of procedures prior to the NCSR; however, it provided a demonstration of the NCSR tool to a group of potential respondents before the launch. As part of this demonstration, NCSD presented a sample version of the tool and displayed its functionality. The presentation instructed potential respondents on how to access and navigate the tool and how to complete and submit the NCSR questions. The demonstration was designed to minimize the burden on potential respondents and to ensure the tool functioned properly.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



Michael Leking

DHS NCSD

703-235-3030

[email protected]




