Revisions to the OMB ICR Package based on Comments from the National Agricultural Statistics Service
This document provides an overview of revisions made to the draft OMB ICR Package, based on comments provided by the U.S. Department of Agriculture’s National Agricultural Statistics Service (NASS). Each NASS comment is listed below, followed by 2M’s response describing the revisions that were made.
Comment 1

NASS Comment: In Supporting Statement Part A, Table A.3 shows that 10 Industry Experts and 7 SA leaders will initially be contacted to participate in the study and that, after a number of follow-ups, 5 Industry Expert and 5 SA respondents are expected. Yet for the 53 SAs, it is anticipated that nearly 100 percent will participate. Since the pretest findings described in Appendix G.1 did not show close to a 100 percent response, a more detailed explanation of why expectations differ across the groups would be helpful, especially since the study is not mandatory for any of the respondents.
2M’s Response: To address the reviewer’s concerns, we have incorporated the following text under Section A.12, immediately before Table A.3:
“Different response rates are anticipated for the respondents participating in the study’s three modes of data collection. The SA Survey is a census of all SNAP SAs, and a 100 percent response rate is anticipated on the basis of FNS’s prior experience with conducting SA surveys for other studies, as well as the study’s strategy for recruitment and survey completion (as detailed primarily in Section B.3 in addition to Sections A.11, A.12, B.1, and Appendix B). In contrast, it is anticipated that SAs may be less inclined to participate in the subsequent State Agency Leaders Interviews due to the additional time requirements, while industry experts are anticipated to have less of an incentive to participate in the Industry Expert Interviews. Accordingly, 10 industry experts and 7 SAs will be recruited to participate in the respective interviews to ensure that a minimum of 5 industry experts and SAs participate.”
Comment 2

NASS Comment: Supporting Statement Part B briefly mentions the use of a nonresponse adjustment factor if the response rate falls below 80 percent, but no details regarding the methodology are provided. States with fewer resources may be less likely to respond, so how will data and information regarding these states/territories be covered and included?
2M’s Response: We have incorporated the following text into Section B.2:
“If the response rate falls below 80 percent, the study team will conduct a nonresponse analysis to determine whether nonresponse bias potentially exists (as required by OMB). To conduct the analysis, the study team will carry out the following steps:
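Independent of the specific steps enumerated in Section B.2, a nonresponse adjustment factor of the kind referenced above can be illustrated with a minimal weighting-class sketch. The data, the class definitions, and the assumption of a base weight of 1.0 (appropriate for a census of SAs) are hypothetical and do not describe the study team’s actual procedure.

```python
# Minimal weighting-class nonresponse adjustment (hypothetical data).
# Respondents' base weights are inflated by the inverse of the response
# rate within their class, so they "stand in" for nonrespondents in the
# same class.

sampled = [
    # (state_agency, weighting_class, responded) -- all values hypothetical
    ("SA-01", "large-caseload", True),
    ("SA-02", "large-caseload", True),
    ("SA-03", "large-caseload", False),
    ("SA-04", "small-caseload", True),
    ("SA-05", "small-caseload", False),
    ("SA-06", "small-caseload", False),
]

# Response rate within each weighting class.
classes = {c for _, c, _ in sampled}
rates = {
    c: sum(1 for _, cc, r in sampled if cc == c and r)
       / sum(1 for _, cc, _ in sampled if cc == c)
    for c in classes
}

# Adjustment factor = 1 / class response rate, applied to respondents only
# (a base weight of 1.0 is assumed for a census of SAs).
adjusted = {sa: 1.0 / rates[c] for sa, c, r in sampled if r}

for sa, weight in sorted(adjusted.items()):
    print(f"{sa}: adjusted weight = {weight:.2f}")
```

The key design choice in this family of adjustments is the grouping variable: weighting classes should relate both to the propensity to respond and to the survey outcomes (for example, caseload size), so that respondents plausibly represent the nonrespondents in their class.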
Comment 3

NASS Comment: A more detailed plan of how experts will be selected for the survey, including how to ensure good coverage, would also be helpful. The technique of “snowball sampling,” mentioned in Supporting Statement Part B, to select industry experts may lead to like-minded experts rather than good coverage. In turn, the selection of industry experts will influence the selection of SA leaders and the establishment of best practices. It may be hard to establish the same best practices for all SAs due to differences in available resources, caseloads, and methods of reporting (state/county/territory). Perhaps the SAs could be stratified and stratum-specific guidelines/best practices established rather than a single set of guidelines/best practices.
2M’s Response: We have updated Section B.1 to incorporate additional information on how SA leaders and industry experts will be identified:
“A purposive (i.e., non-probability) sample of industry experts will be identified for the qualitative Industry Experts Interviews based on advice from FNS and 2M consultants. The goal is to include, if possible, some experts who have SNAP-specific experience, as well as some experts who work on cybersecurity for other SAs and federal agencies. Another group the study team will consider is faculty in university cybersecurity programs certified by the Department of Homeland Security (DHS). Throughout the selection process, the study team will use snowball sampling (asking each potential expert if they could recommend others with relevant experience) as a secondary approach for identifying additional members of the purposive sample of industry experts. The names of industry experts identified via snowball sampling methods will be reviewed by the study team, FNS, and 2M consultants to ensure sufficient diversity among the purposive sample.”
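As a concrete illustration of the referral-expansion logic behind snowball sampling, consider the minimal sketch below. The expert names and the get_referrals helper are hypothetical placeholders; in practice the study team gathers referrals through conversations and reviews every candidate with FNS and 2M consultants, as described above.

```python
from collections import deque

def get_referrals(expert):
    # Stand-in for asking an interviewed expert to recommend others
    # with relevant experience (hypothetical referral lists).
    referrals = {
        "Expert A": ["Expert C", "Expert D"],
        "Expert B": ["Expert D", "Expert E"],
    }
    return referrals.get(expert, [])

seeds = ["Expert A", "Expert B"]   # purposive seeds from FNS/2M consultants
queue, contacted = deque(seeds), set()
candidate_pool = []                # pool forwarded for the diversity review

while queue:
    expert = queue.popleft()
    if expert in contacted:
        continue
    contacted.add(expert)
    candidate_pool.append(expert)
    for referral in get_referrals(expert):
        if referral not in contacted:
            queue.append(referral)

print(candidate_pool)  # ['Expert A', 'Expert B', 'Expert C', 'Expert D', 'Expert E']
```

Tracking every referral in a single deduplicated pool, rather than acting on each recommendation directly, is what allows the study team, FNS, and 2M consultants to screen the full candidate list for diversity before any additional recruitment occurs.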
Regarding the reviewer’s comment about establishment of best practices, we draw attention to the exploratory nature of this study. In accordance with its exploratory nature, this study will employ a purposive (i.e., non-probability) sampling approach to identify SA leaders who will be asked during the interviews to provide insight into their practices for safeguarding personally identifiable information (PII). Accordingly, best practices identified in the interviews are not intended to be fully representative of the array of practices implemented by SAs (thereby mitigating the need for implementing a stratified sampling approach). During the analysis of interview data, the study team will employ a rigorous qualitative coding procedure to identify a robust set of best practices for safeguarding PII. The qualitative analysis will contextualize the findings to provide FNS and SAs with an understanding of the extent to which each of the best practices can be implemented by other SAs.
Comment 4

NASS Comment: Confidentiality may be an issue when numerical data (for example, Appendix B.2, question 5.17) are summarized (aggregated). Will the summarized data be reviewed for potential disclosure, and will appropriate disclosure avoidance techniques be employed as necessary?
2M’s Response: We recognize the importance of maintaining confidentiality throughout all phases of the study, and the study team has taken considerable steps to ensure confidentiality, as detailed under Section A.10. To address the reviewer’s comments, we have incorporated the following text:
“A collection of steps will be taken to ensure confidentiality is maintained in the final report and the associated public-use data files. To the extent feasible, any data that could be identified inferentially will be masked, and the study team will combine categories if there are fewer than five responses. For numeric responses, outliers may be top- or bottom-coded to prevent identification of the respondent. For interview data, quotations will be used in the report only with permission of the respondent and/or the State Director. As noted above, the data will be available to researchers only as restricted-use files, for which users are required to sign an agreement to protect PII.”
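To illustrate the mechanics of two of the quoted steps (combining small categories and top-coding numeric outliers), here is a minimal sketch; the category labels, counts, and cutoff values are hypothetical assumptions, not rules from the study.

```python
from collections import Counter

MIN_CELL = 5    # combine categories with fewer than this many responses
TOP_CODE = 90   # cap numeric values above this cutoff (hypothetical)

# Hypothetical categorical responses to a survey item.
categories = ["vendor-hosted"] * 12 + ["state-hosted"] * 7 + ["hybrid"] * 3
counts = Counter(categories)

# Combine any category with fewer than MIN_CELL responses into "other"
# so that small cells cannot identify a respondent.
collapsed = Counter()
for category, n in counts.items():
    collapsed["other" if n < MIN_CELL else category] += n
print(dict(collapsed))  # {'vendor-hosted': 12, 'state-hosted': 7, 'other': 3}

# Top-code numeric responses so an extreme value cannot single out a
# respondent (bottom-coding is analogous, using max() with a floor).
staff_counts = [4, 6, 9, 12, 15, 240]  # hypothetical; 240 is an outlier
top_coded = [min(x, TOP_CODE) for x in staff_counts]
print(top_coded)  # [4, 6, 9, 12, 15, 90]
```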
Comment 5

NASS Comment: All acronyms should be defined when first referenced, so that the general public can easily follow what is written (for example, in Supporting Statement Part A, FNS and EBT).
2M’s Response: Thank you for bringing this to our attention. We have defined the acronyms noted in the reviewer’s comment and reviewed the revised package to ensure that all acronyms have been properly defined.
Comment 6

NASS Comment: For clarity and ease of understanding, tables should include brief headings rather than references to column keys (for example, Supporting Statement Part A Table A.3 is difficult to understand, while the table in Appendix E-1 is easier to follow).
2M’s Response: For Table A.3, we agree that brief headings would be preferable to column keys. However, Table A.3 includes considerably more detail than the table in Appendix E-1 (i.e., the original burden response table included in the Federal Register 60-Day Notice). Because Table A.3 details the response burden estimates for all three modes of data collection, presenting the requisite information with brief headings is challenging. The study team previously explored other options for presenting the table as parsimoniously as possible, including brief headings for each column; however, this greatly expanded the length of the table and ultimately decreased readability. In the end, the use of column keys was determined to be the most effective approach for conveying the requisite information in a detailed yet parsimonious manner.
Comment 7

NASS Comment: Tables should be reviewed for correctness and consistency of rows/headings (for example, in Supporting Statement Part A, Table A.3, the row “Industry Experts Interview Recruitment Phone Call Script 2” should be removed; it is inconsistent with the table in Appendix E-1).
2M’s Response: Please note that Table A.3 and the table in Appendix E-1 are not intended to be consistent with one another. The table in Appendix E-1 is the original burden response table included in the Federal Register 60-Day Notice. Since the publication of the Notice, the contents of Table A.3 have been revised to reflect additional information learned from discussions with the Contracting Officer’s Representative and FNS staff, exploratory stakeholder interviews, and the survey and interview pretests. Finally, the study team conducted a full review of the other tables in the package to ensure consistency of rows/headings.
Comment 8

NASS Comment: Spelling should be reviewed and corrected (for example, in Appendix G-12, “Exerts” should be “Experts”).
2M’s Response: While this document has undergone several reviews by professional editors, we appreciate the reviewer bringing the issue in Appendix G-12 to our attention. We have replaced “exerts” with “experts” and conducted an additional review to ensure that no additional misspellings remain.