Supporting Statement Part B


How States Safeguard Supplemental Nutrition Assistance Program Participants' Personally Identifiable Information (PII) (New)

OMB: 0584-0666


Supporting Statement Part B: Collections of Information Employing Statistical Methods

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Populations Studied. The study targets Directors of the 53 SNAP State Agencies (SAs), which include the agencies for each of the 50 States, the District of Columbia, Guam, and the U.S. Virgin Islands. The major data collection is a census of all SNAP SAs, so no sampling is needed. The study also includes semi-structured telephone interviews with five industry experts to obtain a broad view of best practices for protecting PII in general, and another set of interviews with Directors and/or their designees in five SNAP SAs deemed to be leaders in safeguarding PII.

A purposive (i.e., non-probability) sample of industry experts will be identified for the qualitative Industry Experts Interviews based on advice from FNS and 2M consultants. The goal is to include, if possible, some experts who have SNAP-specific experience, as well as some experts who work on cybersecurity for other SAs and federal agencies. Another group the study team will consider is faculty in university cybersecurity programs certified by the Department of Homeland Security (DHS). Throughout the selection process, the study team will use snowball sampling (asking each potential expert if they could recommend others with relevant experience) as a secondary approach for identifying additional members of the purposive sample of industry experts. The names of industry experts identified via snowball sampling methods will be reviewed by the study team, FNS, and 2M consultants to ensure sufficient diversity among the purposive sample.
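For illustration only, the referral bookkeeping behind this snowball approach can be sketched as below. The seed list, referral lookup, and cap are hypothetical placeholders rather than part of the study design; in practice, referrals come from outreach conversations, and every name is reviewed by the study team, FNS, and 2M consultants before being added to the purposive sample.

```python
# Illustrative sketch only: bookkeeping for snowball referrals. The seed list,
# get_referrals lookup, and candidate cap are hypothetical placeholders.
from collections import deque

def build_candidate_pool(seed_experts, get_referrals, max_candidates=15):
    """Walk expert referrals breadth-first, skipping duplicate names."""
    pool = []
    queue = deque(seed_experts)
    seen = set(seed_experts)
    while queue and len(pool) < max_candidates:
        expert = queue.popleft()
        pool.append(expert)  # candidates go to a review list, not straight into the sample
        for referral in get_referrals(expert):  # "Can you recommend others with relevant experience?"
            if referral not in seen:
                seen.add(referral)
                queue.append(referral)
    return pool
```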

Selection of exemplary SAs will occur after the web survey is completed and initially analyzed, and after the interviews with industry experts are completed. FNS will use these data and professional opinions of FNS staff to select five SAs for the State leader interviews. For States in which several staff worked with the SA Director to complete the web survey, the plan is to encourage the SA Director to invite all relevant staff to participate in the follow-up interview, which will be conducted by group conference call.

Expected Response Rates. The response rate refers to the proportion of the 53 SNAP SAs that complete the SA survey; the survey is a census of all 53 SAs. Based on prior experience with SA surveys conducted for other studies, the study team aims to achieve an SA response rate close to 100 percent.

To achieve full participation, we will follow a multistep process, beginning with notifying key stakeholders about the study through well-established FNS communication channels, followed by these steps:

  • Offering a user-friendly web interface for the survey

  • Providing email and telephone support

  • Supplying email and telephone reminders for up to 4 months

  • Using survey management software for the web survey that allows respondents to save and exit the survey at any time and return to complete the survey at a later time.

If the team does not achieve at least an 80 percent response rate (43 of the 53 SAs) during the data collection period, FNS will contact the unresponsive SAs in their regions and encourage participation. This process may extend the proposed data collection period, but in previous studies it has been effective in achieving a response rate of 80 percent or better.

B.2 Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

The information will be collected via a web survey of SNAP State Directors. Respondents will have 14 weeks to complete the survey, which allows time to plan their approach. Most SA Directors are likely to ask other staff members to complete various sections of the survey, which they can do by forwarding the survey link to those staff members. Respondents will receive reminder emails and calls from trained survey support personnel, and they may call and/or email for help with technical issues or questions. If respondents prefer, they can complete some questions over the phone, as the trained support personnel have access to the respondent's survey during phone calls.

Statistical Methodology for Stratification and Sample Selection. Sampling will not be conducted because the survey will be administered to the universe of SNAP SAs. If the response rate falls below 80 percent, the study team will conduct a nonresponse analysis to determine whether nonresponse bias potentially exists (as required by OMB), and 2M will adjust for nonresponse as appropriate. To conduct the analysis, the study team will carry out the following steps (a minimal sketch follows the list):

  1. Code each State Agency in the universe as a respondent or a nonrespondent.

  2. Compile State Agency characteristics such as agency size, FNS region, whether the agency is part of a State- or county-administered system, and other pertinent information.

  3. Use a logistic regression model to identify subgroups that are significantly different between respondents and nonrespondents.

  4. Report the model results and whether they suggest potential nonresponse bias.
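The following is a minimal sketch of steps 1 through 4, assuming a hypothetical frame file with one row per SNAP SA. The file name and column names (responded, agency_size, fns_region, county_administered) are illustrative and are not the study's actual variables.

```python
# Minimal sketch of the nonresponse analysis (steps 1-4), assuming a
# hypothetical frame file with one row per SNAP State Agency. File and
# column names are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

# Step 1: the frame codes each agency as respondent (1) or nonrespondent (0).
frame = pd.read_csv("sa_frame.csv")  # hypothetical file name

# Steps 2-3: regress response status on agency characteristics known for all
# agencies; C() tells statsmodels to dummy-code categorical predictors.
model = smf.logit(
    "responded ~ agency_size + C(fns_region) + C(county_administered)",
    data=frame,
).fit()

# Step 4: report coefficients and p-values; characteristics that significantly
# predict response status flag subgroups where nonresponse bias may exist.
print(model.summary())
```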


Although no statistical methods will be used to select the semi-structured interview respondents (either the leading States or the industry experts), the study team will include a range of States for the State leader interviews, with differing populations, caseloads, and regions of the country. Similarly, the industry experts may include local or national experts, those who work with both public and private entities, and those who are involved in training cybersecurity professionals.

Estimation Procedure. Because the web survey is a census, the data will be presented as straightforward descriptive statistics, along with some bivariate analyses of associations between variables. Examples include comparing safeguarding practices in large versus small States and in States with county-administered programs versus centrally administered programs.
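As a rough illustration of these tabulations, the sketch below crosstabulates one hypothetical safeguarding practice against administration type and applies a chi-square test. The file and column names (sa_survey.csv, admin_type, uses_mfa) are placeholders, not actual survey variables.

```python
# Illustrative sketch of the descriptive and bivariate analyses; file and
# column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import chi2_contingency

survey = pd.read_csv("sa_survey.csv")  # one row per responding SA (hypothetical)

# Descriptive statistics: share of SAs reporting a given safeguarding practice.
print(survey["uses_mfa"].value_counts(normalize=True))

# Bivariate association: practice by State- vs. county-administered program.
# With only 53 SAs, an exact test may be preferred for sparse tables.
table = pd.crosstab(survey["admin_type"], survey["uses_mfa"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```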

Degree of Accuracy Needed for Purpose Described in Justification. Not applicable, as the data collection is a census of SAs.

Unusual Problems Requiring Specialized Sampling Procedures. We do not anticipate any unusual problems requiring any specialized sampling procedures.

Any Use of Periodic (Less Frequent Than Annual) Data Collection Cycles to Reduce Burden. This data collection will be the first time that data on this topic have been collected from all SAs, so periodic data collection cycles are not applicable.

B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Achieving the specified response rate involves contacting the SAs, securing their participation in the survey, and then offering support and completion reminders using the procedures described below. We expect 100 percent of State SNAP Directors to complete their surveys.

The following are procedures for maximizing the number of respondents to the surveys:

  • The letters inviting State SNAP Directors to participate were carefully developed to emphasize the importance of this study and how the information will help FNS better understand and address current policy issues related to safeguarding PII.

  • The current contact information will be used for initial correspondence and will be updated as needed throughout the data collection period to facilitate communication with the respondents.

  • A toll-free number and study email address will be provided to all participants so that State SNAP Directors (or designees) can receive assistance with the survey.

  • State SNAP Directors or their designees will have the option of completing the web-based survey via telephone with a trained survey support professional who will enter information into the web survey.

  • Periodic email reminders will be sent to State SNAP Directors who have not yet completed their surveys.

  • The study team will follow up by telephone with all State SNAP Directors who do not complete the survey within a specified period and urge them to complete the survey.

  • The study team will use call-scheduling procedures designed to call numbers at different times of day (between 8:00 a.m. and 6:00 p.m. ET) and days of the week (Monday through Friday) to improve the chances of finding a respondent at work.

  • Reaching the SAs that are most distant from the east coast of the United States (Guam and Hawaii) may require an extra day. We anticipate most information exchanges will be by email, but we will also work with SA staff to set up telephone meetings if needed.

  • The semi-structured interviews with industry experts and with SA staff who are leaders in protecting PII will be exploratory, with only five of each type of interview conducted and no claim of representativeness of the samples. As noted above, the study team will attempt to include variation by region and caseload size, among other factors. Exploratory samples are used to elicit information about issues and innovations not currently known to FNS.



B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Pretesting of the web survey and the interview protocols has been completed. We completed four pretests of the State Director survey, two pretests of the SNAP SA leaders interview protocol, and one pretest of the industry experts interview protocol. Pretest respondents consisted of State SNAP Directors and staff, and cybersecurity or State systems experts in private industry. For those SAs that participated in the pretest of the web survey, 2M will arrange for their pretest responses to be automatically loaded into the OMB-approved web survey, so that when the full survey is released, these SAs will only need to review and update their responses as appropriate.

The State Director survey and the SNAP SA leaders and industry experts protocols were revised to incorporate pretest results. Revisions included adding definitions for some terms, incorporating branching language and questions to reflect the different operating contexts for a subpopulation of 10 States, and modifying the language of the associated emails and phone scripts to improve clarity. Full pretest results are included in Appendix G.1 of this package. The revised versions of the survey and interview protocols incorporating the feedback from pretest respondents are included in Appendices B.1, B.2, C.1, and D.1.



B.5 Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Table B.1 presents a summary of individuals consulted on statistical aspects of the design. In addition to soliciting comments from 2M leadership, the study team, and the project’s subject matter experts, FNS had this information collection request reviewed by Beth Schlein from the National Agricultural Statistics Service (NASS) for an expert assessment of the study design and methodology. NASS comments and the FNS response to NASS comments are available in Appendices E.2 and E.3, respectively. 2M will conduct data collection and analysis for this study, in coordination with FNS.

Table B.1 Persons Consulted on Statistical Aspects of the Design

Name | Affiliation | Telephone Number | Email
Anne Gordon | 2M | 817-856-0891 | [email protected]
Nicholas Beyler | 2M | 703-214-0931 | [email protected]
Ann Collins | Consultant to 2M | 617-455-2104 | [email protected]
Steven Garasky | 2M | 817-856-0876 | [email protected]
Hiren Nisar | 2M | 703-214-1211 | [email protected]
Dallas Elgin | 2M | 703-214-1004 | [email protected]
Beth Schlein | NASS/USDA | -- | [email protected]


