
Supporting Statement B for Paperwork Reduction Act Submissions


Title: Nationwide Cyber Security Review (NCSR) Assessment


OMB Control Number: 1670-0040


B. Collections of Information Employing Statistical Methods.



1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


Respondent Universe:


The Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division, in collaboration with the MS-ISAC, delivers a bi-annual summary report to Congress that provides a broad picture of the current cybersecurity gaps and capabilities of State, Local, Tribal, and Territorial (SLTT) governments across the nation, and the MS-ISAC conducts the Nationwide Cyber Security Review (NCSR) assessment to inform that report. The NCSR is therefore open to all SLTT governments, which puts the universe of potential respondents at over 89,000.







Sampling Method(s):


The MS-ISAC will not use a sampling method for the NCSR and will invite the entire population to participate.


Response Rates and Expected Number of Responses:


To estimate the number of respondents for the NCSR self-assessment and end-user survey, we used past participation to forecast participation over the next three years and took the average of that three-year projection as the estimated number of annual respondents. This yields an estimate of 590 annual respondents. The table below presents the estimated number of respondents, based on historical data.



Self-Assessment Participation

Year                 Percent of Participation    Number of Respondents
2011                 0.18%                       162
2013                 0.34%                       304
2014                 0.28%                       252
2015                 0.41%                       365
2016                 0.52%                       464
2017                 0.53%                       476
2018                 0.60%                       536
2019                 0.66%                       590
2020                 0.72%                       644
2018-2020 Average                                590


The non-response survey will be sent to 100 participants who registered in the past but did not register for or complete the most recent NCSR. The average response rate to the NCSR since 2011 has been 0.47%. Accordingly, for this ICR, we anticipate a 1% response rate to the non-response survey and expect 1 annual response.


2. Describe the procedures for the collection of information, including:


  • Statistical methodology for stratification and sample selection,


Not applicable. The NCSR is open to 100% of the eligible audience; therefore, no statistical methodology was used for stratification or sample selection.


  • Estimation procedure,


Not applicable. The NCSR is open to 100% of the eligible audience; therefore, no statistical estimation procedure was used.


  • Degree of accuracy needed for the purpose described in the justification,


Not applicable.


  • Unusual problems requiring specialized sampling procedures, and

Not applicable.


  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The NCSR is offered annually from October through December.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.



To maximize response rates, the MS-ISAC does the following:

  • Sends electronic reminders to all previous-year NCSR participants.

  • Sends electronic reminders to anyone who has registered for the NCSR in the past but has not completed a self-assessment.

  • If a bounce-back is received and the organization is an MS-ISAC member, works with the organization’s primary point of contact, where applicable, to identify the appropriate person to register for that organization on the portal.

  • Sends electronic communication to the MS-ISAC membership announcing that the NCSR is scheduled to open.

  • Holds an annual Hot Topics session on the previous year’s NCSR results.

  • Holds an annual webinar on how to navigate the platform; all MS-ISAC members and NCSR end users receive electronic communications advising them to attend.

  • Sends electronic reminders to the previous year’s registrants advising them that the NCSR is opening, and uses returned e-mails to contact new users to associate with the organization so the self-assessment can be completed.

  • Holds monthly new-member webcasts that highlight the NCSR.

  • Holds a panel discussion at the MS-ISAC annual meeting.

  • Works with partners, including the Department of Homeland Security (DHS), the National Association of Counties (NACo), and the National Association of State Chief Information Officers (NASCIO), to help promote the NCSR.


In addition, the final date of the NCSR may be extended past the December deadline to maximize participation by organizations that have started the self-assessment but have not yet submitted their responses. Participation in the NCSR is voluntary; organizations are not required to complete it.


Since the NCSR is a voluntary self-assessment, there are potential concerns with the reliability and accuracy of the data. However, the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division within the CISA Cybersecurity Division determined that the accuracy and reliability of the data are sufficient for its purposes. Because the NCSR was designed to gain a broad understanding of the cybersecurity posture of States, State agencies, and local governments, intensive validation of the data was not deemed necessary; self-reported data was considered sufficient to achieve this goal. In addition, because participation in the NCSR is voluntary, SECIR deemed that respondents were unlikely to provide false or grossly inaccurate data.


The end-user and non-response surveys are not intended to provide statistically significant results; rather, they will be used to gather feedback that will inform improvements to the NCSR self-assessment.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



SECIR did not conduct formal testing of procedures prior to the NCSR; however, it provided a demonstration of the NCSR tool to a group of potential respondents before the launch. As part of this demonstration, SECIR presented a sample version of the tool to display its functionality. The presentation instructed potential respondents on how to access and navigate the tool and how to complete and submit the NCSR questions, and it was designed to minimize the burden on potential respondents and to ensure the tool functioned properly.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



Molly Gifford

MS-ISAC

[email protected]

(518) 880-0736


