How States Safeguard Supplemental Nutrition Assistance Program Participants’ Personally Identifiable Information
SNAP PII: Office of Management and Budget Information Collection Review Package | Contract #: 1231981BF0081
September 9, 2019
Submitted to:
Jenny Genser
Contracting Officer’s Representative
U.S. Department of Agriculture
Food and Nutrition Service
Prepared by:
Dallas Elgin, PhD
Anne Gordon, PhD
Andrés Romualdo
Ann Collins
Hiren Nisar, PhD
2M Research
1521 N. Cooper St., Suite 600
Arlington, TX 76011
The U.S. Department of Agriculture Food and Nutrition Service (FNS) contracted with 2M Research (2M) to conduct a study to better understand how States protect Supplemental Nutrition Assistance Program (SNAP) participants’ and applicants’ personally identifiable information (PII). The study combines a web survey administered to all 53 SNAP State Agencies (SAs) with interviews of industry experts and of SNAP SAs recognized as leaders in safeguarding PII. Together, these data sources are used to examine how States currently safeguard participant PII and how consistent safeguarding practices are across States, and to develop recommendations for States to improve the safeguarding of PII. In summer 2019, 2M conducted a series of cognitive pretests of the SNAP SA survey and of the interview protocols for industry experts and SNAP SA leaders. This memo describes the pretest activities, summarizes the findings from the survey pretest and the pretests of the two interview protocols, and describes the subsequent revisions made to the survey and interview protocols. Copies of the pretest survey and interview protocols and the associated debriefing questionnaires are included in the appendices for reference.
Recruitment of Pretest Respondents and Pretest Activities
The three pretests included a total of seven respondents. The survey pretests were conducted with staff from four SNAP SAs, with multiple respondents on each conference call, and the interview pretests consisted of an interview with one industry expert and interviews with staff from two SNAP SAs. Although not a full pretest, the study team also learned a great deal from an interview with a State Chief Information Security Officer who reviewed the survey instrument and provided comments.
2M and its consultants developed an initial list of potential pretest participants for consideration by FNS. The list was constructed to ensure sufficient variation across key factors, including geographic region, caseload size, how agencies structured their approaches for using systems security professionals to safeguard PII, and whether the State’s SNAP program was State- or county-administered. Participants were subsequently recruited between May and July 2019 through a combination of emails from the Director of the SNAP Research and Analysis Division within FNS and emails and follow-up phone calls from 2M staff and consultants.
Only three potential participants responded to the initial invitation email and agreed to participate in the pretest. With approval from FNS, the study team then sent follow-up emails and made follow-up phone calls to each remaining potential participant, with recruitment contacts capped at four emails and/or phone calls per participant. Because most potential participants still did not respond, the study team contacted two alternate SNAP SAs and two alternate industry experts, which resulted in three additional agreements to participate. Overall, six SNAP SAs and one industry expert agreed to participate in the pretests.
SNAP SA Survey Pretest
The study team scheduled 2-hour pretest interviews for the survey. To minimize the burden of participating in the pretest, respondent SAs were asked to complete a hard-copy survey and answer cognitive interview questions during a single call. The study team emailed the survey to participants shortly before the call with instructions to print the survey in advance of their interview (if possible) but not to read or complete it until the interview. Instructing participants not to review the survey until the interview helped ensure the authenticity of the responses and that the time required to answer each survey question was accurately captured and reflected in the subsequent burden estimates. During the pretest interviews, respondents completed the survey one section at a time and read their responses aloud to the interviewers. The interviewers timed each section and asked respondents debriefing questions to assess whether respondents understood the meaning of the questions; whether they had difficulty answering the questions; and whether the response options were applicable, clear, and comprehensive.
The time required by respondents to complete the survey ranged from 43 to 152 minutes.[1] The average time needed to complete the survey was 94 minutes. Suggestions for revising the survey to reduce response time are detailed in Exhibit 1.
SNAP SA Survey Pretest Findings and Revisions
This section summarizes the general findings from the pretest and the associated revisions made to the SNAP SA survey. Across the survey’s eight sections, respondents noted that the flow of the questions was appropriate, although there were several questions that they had a hard time answering and several instances in which unfamiliar terms were used. Upon completion of the pretest interview, respondents noted that the presentation of the survey and the question styles were appropriate and that there were no additional questions that should have been asked.
Pretest respondents provided three prominent areas of feedback. First, respondents noted the importance of having their information technology (IT) security experts involved in the survey, given the technical nature of many of the questions. Second, respondents noted that security processes and oversight frequently sat outside the SNAP SA, such as with a central SA or within data systems that administer multiple programs. Third, respondents from a county-administered State noted the need to distinguish between questions focused solely on State-level efforts and questions on county-level efforts. Exhibit 1 details the changes made to the SNAP SA survey instrument to address the feedback obtained during the pretest.
Exhibit 1. SNAP SA Survey Pretest Feedback and Subsequent Revisions
Survey Section: Suggested Respondents
Respondent Comments: Respondents consistently noted the importance of having IT security experts participate in the survey (and in the more technical sections of the survey, in particular).
Survey Revisions: The “Suggested Respondents” identified at the beginning of each section were revised to replace “senior systems staff” with “Chief Information Security Officer from your agency or a central SA [or an individual designated by that person].”

Survey Section: Section Introductions
Respondent Comments: Respondents from a county-administered State noted that it was difficult to distinguish whether the sections were referring to the SNAP SA or to county-level agencies.
Survey Revisions: Branching language displayed for county-administered States was included to acknowledge that the survey is primarily focused on the statewide safeguarding requirements established by the SA while also acknowledging the role of county-level agencies in certain procedures. The branching language was included within the following sections:

Survey Section: 1.1
Respondent Comments: Respondents provided consistent feedback on the need to focus not on full-time or part-time system security professionals but rather on how the State structured its approach.
Survey Revisions: The question and the responses were revised to focus on how the agency structured its approach for using systems security professionals dedicated to protecting SNAP PII.

Survey Section: 1.2
Respondent Comments: Given the comments on question 1.1, respondents also provided consistent feedback on the need to revise question 1.2 so that it no longer focused on full- and part-time system security professionals.
Survey Revisions: The question and the responses were revised to focus on the staff members in or outside of the SA who are responsible for protecting SNAP PII.

Survey Section: 1.5
Respondent Comments: In addition to selecting the three response options, respondents consistently used the “Other. Please specify” option to identify the eligibility systems of other programs that were integrated with their SNAP eligibility system.
Survey Revisions: The response options were expanded to include the following three options:

Survey Section: 1.8
Respondent Comments: One respondent noted that multiple responses could be checked for this question. Another respondent noted that the responses should also reflect the updating of data sharing agreements when there is a change in an agency’s data sharing processes.
Survey Revisions: The question was revised to a “Select all that apply” format, and the third response option was modified to reflect a “change in the data sharing processes used by one of the agencies.”

Survey Section: 1.9
Respondent Comments: Two respondents struggled with the term “fuzzy matching” and suggested that the term could have a negative connotation.
Survey Revisions: While “fuzzy matching” is a term established within the technical and peer-reviewed literature, the term was revised to “probabilistic/fuzzy matching” to be clearer and to minimize any negative connotations. (An illustrative sketch of this type of record matching follows this exhibit.)

Survey Section: 2.1
Respondent Comments: Two respondents struggled with the term “templates.”
Survey Revisions: The “templates” response option was removed. The revised response options consist of the following:

Survey Section: 2.3
Respondent Comments: One respondent noted the importance of the Privacy Act as a standard for States.
Survey Revisions: “Privacy Act of 1974 (5 U.S.C. § 552a)” was included as a response option.

Survey Section: 2.4
Respondent Comments: One respondent expressed confusion about which security plan was being discussed.
Survey Revisions: The question was revised to include the following language: “your SA’s system security plan for safeguarding PII of SNAP applicants and participants.”

Survey Section: 2.5
Respondent Comments: Respondents consistently noted that this question was challenging to answer. Some noted that the funding components were typically handled by another division, while others were unsure about who would know this information.
Survey Revisions: We recommend dropping this question, as it is not related to a specific Research Question.

Survey Section: 2.6 (2.5 in the revised version of the survey), 5.4, 5.16 (5.19 in the revised version of the survey), 8.1
Respondent Comments: Two respondents noted that they were not familiar with the term “masking.”
Survey Revisions: The following hover definition was included to provide clarity: “Masking is the process of hiding sensitive data with modified content (i.e., characters or other data). For instance, Social Security numbers may be masked by replacing the first five digits with an asterisk and only showing the last four digits.” (An illustrative sketch of this masking approach also follows this exhibit.)

Survey Section: 2.7, 2.8 (2.6, 2.7 in the revised version of the survey)
Respondent Comments: Upon completion of the pretests, the study team made minor edits to these questions to improve clarity and eliminate repetition.
Survey Revisions: For 2.7, additional language was incorporated to reflect the role of the SA’s system security professionals and to replace “EBT vendors” with “EBT contractors.” For 2.8, additional language was included to reflect that SAs may use other types of risk planning tools.

Survey Section: 2.9 (2.8 in the revised version of the survey)
Respondent Comments: One respondent expressed confusion about which security plan was being discussed.
Survey Revisions: The question was revised to include the following language: “system security plan for safeguarding PII of SNAP applicants and participants.”

Survey Section: 3.1
Respondent Comments: Upon completion of the pretests, the study team made minor edits to this question to improve clarity and eliminate repetition.
Survey Revisions: The “Data entry staff” response option was removed as redundant, as the question already references “eligibility workers,” who would be responsible for entering data.

Survey Section: 3.2
Respondent Comments: Two of the respondents noted that multiple role-based security levels were often implemented.
Survey Revisions: The response format was revised to “Select all that apply.”

Survey Section: 3.3, 3.4
Respondent Comments: Upon completion of the pretests, the study team made minor edits to these questions to improve clarity and eliminate repetition.
Survey Revisions: References to “EBT vendor” were replaced with “EBT contractor” to avoid confusion with SNAP vendors, as commonly understood.

Survey Section: 3.9
Respondent Comments: Two of the respondents struggled with the ordering of the response categories and suggested a need to reorder them.
Survey Revisions: “Meeting requirements, with room for improvement” was moved to the first column, while “meeting requirements” was moved to the second column. To maintain consistency across questions, this change was also made to questions 4.8 and 5.16.

Survey Section: 4.1, 4.4, 4.5
Respondent Comments: Upon completion of the pretests, the study team made minor edits to these questions to improve clarity.
Survey Revisions: Branched versions of these questions were constructed for the 10 county-administered States. These branched versions acknowledge that SA or county employees may be involved with safeguarding PII.

Survey Section: 4.4
Respondent Comments: One respondent expressed that “remote access” was unclear and sought to clarify whether a VPN connection would fit within this definition.
Survey Revisions: The question was revised to include the following parenthetical example after “remote access”: “(such as a VPN connection).”

Survey Section: 5.5
Respondent Comments: One respondent noted confusion over the phrase “online application” within this section, as it could be confused with an online IS/IT application.
Survey Revisions: The question was revised as follows: “What methods does your SA use to safeguard PII that is submitted by SNAP applicants or participants via online forms?” The response options were subsequently revised to replace “users” with “applicants/participants.”

Survey Section: 5.6, 5.8, 5.10, 5.11, 5.14 (5.6, 5.9, 5.10, 5.11, 5.12, 5.15 in the revised version of the survey)
Respondent Comments: Upon completion of the pretests, the study team made minor edits to these questions to improve clarity and eliminate repetition.
Survey Revisions: Minor changes included simplifying question language (5.6); clarifying that one response should be selected per row (5.9); correcting the skip logic (5.11 and 5.12); and including “Don’t know/unsure” as a response option (5.15).

Survey Section: 5.11a–5.15 (5.12–5.18 in the revised version of the survey)
Respondent Comments: Two of the respondents expressed concern about discussing security breaches.
Survey Revisions: To alleviate any concerns about privacy, additional language was included to remind respondents that their answers would be kept private. In addition, the term “breach” was replaced with “incident” to reflect the more common occurrence of security incidents in which cybersecurity systems are not breached and PII is not accessed. Questions 5.11a through 5.16 were renumbered to 5.12 through 5.19.

Survey Section: 5.12 (5.14 in the revised version of the survey)
Respondent Comments: Respondents noted that this question could be broadly interpreted to include multiple types of security incidents.
Survey Revisions: The question was revised to clarify that the focus is on security incidents in which PII was compromised.

Survey Section: Section 5
Respondent Comments: Respondents from a county-administered State noted that Section 5 was challenging to answer as currently structured. They noted that in their system, it was county agencies, not the SNAP SA, that were involved in the SNAP application and recertification processes.
Survey Revisions: A branching section for Section 5 of the survey was incorporated. In this branched section, the 10 county-administered States would see a different version of the section, with questions that more accurately reflect the role of county agencies.

Survey Section: 6.2
Respondent Comments: Upon completion of the pretests, the study team made minor edits to this question to improve clarity and eliminate repetition.
Survey Revisions: Question 6.2 was revised to clarify that respondents should select all responses that apply.

Survey Section: 7.2
Respondent Comments: Two respondents noted that they did not have a formal data-sharing plan. Rather, their agencies typically use data use agreements or Memorandums of Understanding (MOUs).
Survey Revisions: We recommend dropping this question, given that data-sharing plans are not typically used and that questions on data use agreements and MOUs are included within Section 1 (SA Systems Context).

Survey Section: 7.3 (7.2 in the revised version of the survey)
Respondent Comments: Upon completion of the pretests, the study team made minor edits to this question to improve clarity and eliminate repetition.
Survey Revisions: The question was revised to clarify the focus on data files or information “containing SNAP PII.” In addition, the first response option was revised to clarify that direct access includes “application-to-application access.”

Survey Section: 7.4 (7.3 in the revised version of the survey)
Respondent Comments: Two respondents expressed that it was unclear to whom the question referred (i.e., whether the question referred to the agency sending or the agency receiving the data files).
Survey Revisions: The question was revised to clarify that the focus is on what the SA does with the matched file that it created.

Survey Section: 7.6 (7.5 in the revised version of the survey)
Respondent Comments: One respondent noted that some agencies may take the approach of not sharing data with law enforcement in general.
Survey Revisions: The following response option was added: “We do not share data with law enforcement (unless directed to do so via a court order).”
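For reference, the sketch below illustrates the kind of probabilistic/fuzzy matching referenced in question 1.9, in which records that are similar but not identical (for example, name variants across eligibility systems) are scored and linked. The sketch is illustrative only and is not drawn from the survey instrument or any SA system; it uses Python’s standard-library difflib, and the 0.8 match threshold is an assumed value chosen for illustration.

# Illustrative sketch of probabilistic/fuzzy matching (hypothetical names
# and threshold; not part of the survey instrument or any SA system).
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

# Two name variants that an exact (deterministic) match would treat as
# different people.
score = similarity("Jonathan Q. Public", "Jon Q Public")
MATCH_THRESHOLD = 0.8  # assumed cutoff for flagging a candidate match
print(f"score={score:.2f}, candidate match: {score >= MATCH_THRESHOLD}")

In practice, matching systems typically compare several fields (e.g., name, date of birth) and tune thresholds to balance false matches against missed matches.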
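Similarly, the masking pattern described in the hover definition added for questions 2.6, 5.4, 5.16, and 8.1 can be sketched in a few lines. The function below is hypothetical and not drawn from any SA system; it simply applies the pattern from the definition, replacing the first five digits of a Social Security number and showing only the last four.

# Illustrative sketch of the masking pattern described in the hover
# definition (hypothetical function; not part of any SA system).
def mask_ssn(ssn: str) -> str:
    """Mask an SSN, e.g., '123-45-6789' -> '***-**-6789'."""
    digits = [c for c in ssn if c.isdigit()]
    if len(digits) != 9:
        raise ValueError("Expected a 9-digit Social Security number")
    return "***-**-" + "".join(digits[-4:])

print(mask_ssn("123-45-6789"))  # prints ***-**-6789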
Interview Pretests
The study team scheduled 2-hour pretest interviews for the industry expert and SNAP SA leaders’ protocols: one with the selected industry expert and one with each of the two selected SAs. Using the draft semi-structured interview protocols, the interviewers timed the administration of each section and then paused to ask the respondents a series of questions pertaining to the section. These questions focused on how the respondents interpreted and answered the questions and whether the questions were appropriate, clear, and worded correctly.
The industry expert interview required 45 minutes to complete, while the two SNAP SA leaders’ interviews lasted 72 and 74 minutes. Suggestions for revising the interview protocols to reduce response time are detailed in Exhibits 2 and 3.
Industry Expert Interview Pretest Findings and Revisions
This section summarizes the general findings from the industry expert interview pretest. The pretest respondent was highly knowledgeable about State cybersecurity issues, having worked on cybersecurity issues with 30 to 35 States over the course of their career and having conducted a biennial survey of Chief Information Officers from the 50 States. During the debriefing for the interview’s four sections, the respondent noted that the questions flowed appropriately, were not difficult to answer, and did not use unfamiliar terms. The study team made two changes to the interview protocol based on the respondent’s feedback, as shown in Exhibit 2.
Exhibit 2. Industry Expert Interview Pretest Feedback and Subsequent Revisions
Question #: 8
Respondent Comments: The respondent noted that SNAP SAs were more likely to have multiple security plans than a single comprehensive security plan. Accordingly, this may be a difficult question for industry experts to answer (as they may not be knowledgeable of the various security plans).
Interview Revisions: We recommend dropping this question given the feedback provided by the respondent.

Question #: 10 (9 in the revised version of the protocol)
Respondent Comments: The respondent noted that while FNS Handbook 901 plays a role, the information systems containing PII are also likely to be guided by an array of security requirements from other Federal agencies.
Interview Revisions: The question was revised to include the following language: “The information systems containing SNAP PII data may be guided by an array of security requirements established by Federal agencies (such as FNS Handbook 901, NIST guidelines, HIPAA, CMS MARS-E). In addition to these requirements, are there industry best practices that SNAP State Agencies should consider implementing for the following processes?”
SNAP SA Leaders’ Interview Pretest Findings and Revisions
This section summarizes the general findings from the SNAP SA leaders’ interview pretest and the associated revisions made to the interview protocol. Across each of the three sections, respondents noted that the flow of the questions was appropriate, that no questions were difficult to answer, and that no unfamiliar terms were used. Upon completion of the pretest interview, respondents noted that the presentation of the interview and the question styles were appropriate and that there were no additional questions that should have been asked.
One area of consistent feedback pertained to the interview’s second question. Respondents noted that it would be helpful to know ahead of time that the interview would ask for an overview of the State legislation and regulations that govern the agency’s handling of PII. Accordingly, it was agreed that the interview recruitment materials used for the full study should explicitly note the need for SAs to review applicable State legislation and regulations prior to the interview. Another area of consistent feedback was that SAs could not provide detailed answers to questions about the approaches of other SAs, as they were largely unaware of what their peers in other States were doing to safeguard PII. Respondents further noted a need for FNS to convene meetings or provide forums for SAs to discuss their approaches to safeguarding PII. Upon further discussion, it was agreed that this could be a pertinent finding of the SA leaders’ interviews within the full study and that no changes should be made to the questions asking about the approaches of other SAs. Exhibit 3 details the changes made to the SNAP SA leaders’ interview protocol to address the feedback obtained during the pretest.
Exhibit 3. SNAP SA Leaders’ Interview Pretest Feedback and Subsequent Revisions
Question #: Introduction
Respondent Comments: Upon completion of the pretests, the study team incorporated additional language in the introduction to improve clarity for county-administered States.
Interview Revisions: The following language was included in the introduction: “[If the SNAP State Agency oversees or has policy responsibility for a county-administered SNAP program:] Within county-administered systems, the SNAP State Agencies are responsible for establishing statewide safeguarding requirements in accordance with Federal policies, while county-level agencies are given discretion in how to best meet or exceed the requirements set by the SNAP State Agency. Accordingly, this interview is primarily focused on the statewide safeguarding requirements established by your SNAP State Agency as opposed to the individual requirements established by county-level agencies.”

Question #: 2
Respondent Comments: Respondents from both States noted that it would be helpful to know in advance that the interview would ask respondents to provide an overview of State legislation and regulations. 2M agreed to explicitly note this need within recruitment materials and other communications occurring prior to the interview.
Interview Revisions: 2M revised question 2 to include the following language at the beginning of the question: “We noted in our previous emails that it would be helpful for your agency to review applicable State legislation and regulations that govern the handling of PII.”
Appendix A.1: Questions from the SNAP SA Survey Pretest Debriefing Protocol
Questions asked at the end of each section:
What did you think of the flow of the questions? [probe: for specifics about the flow – order of question presentation, topic order, etc.] Do you have any suggestions to make the survey questions flow better?
Were there any questions you had a hard time answering (either because you did not know the answer or because you did not understand the question)? [probe: for reasons why and suggestions for improvement]
Were there any terms that were used that you are unfamiliar with? [probe: for specific terms] Do you have suggestions of other ways to phrase these terms, or would you suggest that we add a definition? Did you find the definitions helpful?
Questions asked upon respondents’ completion of the survey:
Overall, what did you think about the presentation of the survey? Question style?
Was there any information you would have liked to have in advance to help better prepare you to answer any of the questions? If so, what information specifically? For which question(s) would this information have been helpful?
Do you think there is anything else we should know or should have asked as we finalize the survey? Do you have any other suggestions for improving the survey? [probe: for anything else beyond what they first mention–anything else?]
Appendix A.2: Questions from the Industry Experts Interview Pretest Debriefing Protocol
Questions asked at the end of each section:
What did you think of the flow of the questions? [probe: for specifics about the flow–order of question presentation, topic order, etc.] Do you have any suggestions to make the interview questions flow better?
Were there any questions you had a hard time answering (either because you did not know the answer or because you did not understand the question)? [probe: for reasons why and suggestions for improvement]
Were there any terms that were used that you are unfamiliar with? [probe: for specific terms] Do you have suggestions of other ways to phrase these terms, or would you suggest that we add a definition?
Questions asked upon respondents’ completion of the interview:
Overall, what did you think about the presentation of the interview? Question style?
Was there any information you would have liked to have in advance to help better prepare you to answer any of the questions? If so, what information specifically? For which question(s) would this information have been helpful?
Do you think there is anything else we should know or should have asked as we finalize the interview? Do you have any other suggestions for improving the interview? [probe: for anything else beyond what they first mention–anything else?]
Appendix A.3: Questions from the SNAP SA Leaders’ Interview Pretest Debriefing Protocol
Questions asked at the end of each section:
What did you think of the flow of the questions? [probe: for specifics about the flow–order of question presentation, topic order, etc.] Do you have any suggestions to make the interview questions flow better?
Were there any questions you had a hard time answering (either because you did not know the answer or because you did not understand the question)? [probe: for reasons why and suggestions for improvement]
Were there any terms that were used that you are unfamiliar with? [probe: for specific terms] Do you have suggestions of other ways to phrase these terms, or would you suggest that we add a definition?
Questions asked upon completion of the interview:
Overall, what did you think about the presentation of the interview? Question style?
Was there any information you would have liked to have in advance to help better prepare you to answer any of the questions? If so, what information specifically? For which question(s) would this information have been helpful?
Do you think there is anything else we should know or should have asked as we finalize the interview? Do you have any other suggestions for improving the interview? [probe: for anything else beyond what they first mention–anything else?]
[1] The variation in time required to complete the survey can be partially attributed to the significantly larger amount of time required by a single State.