
Understanding Risk Assessment in Supplemental Nutrition Assistance Program (SNAP) Payment Accuracy


V. Interview Guide: State-Level Data Analysis Staff



OMB Number: 0584-#### Expiration Date: MM/DD/20YY

Introduction

Good morning/afternoon. Thank you for taking the time to talk with me today. My name is [interviewer’s name], and I work for Westat, a private research company based in Rockville, Maryland. Joining me is my colleague, [name].

Purpose: The U.S. Department of Agriculture’s Food and Nutrition Service, or FNS, is interested in understanding tools used to identify cases likely to have a payment error. These tools may be known by different names, such as case-profiling tools, risk assessment tools, or error-prone profiling. After cases are flagged, they undergo a rigorous process to ensure accurate benefit decisions. FNS hired Westat to conduct a study to learn more about the development and implementation of these tools. The findings from the study will be used to inform the development of case-profiling tools FNS and State agencies use, identify best practices, and develop resources and technical assistance.

How you were selected: We first conducted an online survey of all SNAP State directors. We then worked with FNS to select six State agencies from among survey respondents for more in-depth case studies on their use of case-profiling tools. After we discussed with the SNAP State Director what we hope to learn from your State agency, the State Director identified you as someone who would have valuable input and should be interviewed.

Risks and privacy: We use all data we collect only for the research purposes we describe. FNS knows which State agencies were asked to participate in each case study but does not know the names or job titles of the individuals interviewed. We will report the results of these interviews for each State agency, but your name will not be linked to your responses. In our reports, we may include direct quotes, but they will be presented without the speaker’s name or title. FNS will receive a redacted copy of the transcript of this interview that has been stripped of identifying information, except for the name of your State agency.

Study costs and compensation: There is no cost to you to participate apart from the time you spend with us for this interview, and there is no compensation. The interview takes 60 minutes.

Voluntary participation: Your participation is entirely voluntary. Refusal to participate will not have any impact on your position, your State agency, or nutrition programs. You may take a break, skip questions, say something off the record, or stop participating at any time.

Questions: If you have questions about your rights and welfare as a research participant, please call the Westat Human Subjects Protections office at 1.888.920.7631. Please leave a message with your first name; the name of the research study you are calling about, which is the SNAP Risk Assessment study; and a phone number, beginning with the area code. Someone will return your call as soon as possible.

We have planned for this discussion to last 60 minutes, until [time]. Is that still okay?

With your permission, I would like to record this discussion to help us fill any gaps in our written notes. The recordings, transcripts, and any notes we have will be stored on our secure server and will be destroyed after the project is complete. FNS will not receive any audio recordings.

Do you have any questions? [Answer all questions]

May I turn on the audio recorder now? [Turn on audio recorder if the participant gives consent]

Now that the audio recorder is on, do you agree to participate? [Pause for response]

And do you consent to be audio recorded? [Pause for response]

Warmup and Context

  1. To start, please tell me how long you have worked at the agency and what your responsibilities are.

  2. For the rest of this discussion, we will be talking mostly about the [tool name], which your State agency provided information about in the online survey. What was the nature of your involvement with [tool name]?

[Probe: Designed it? Tested it? Familiar with the tool but not involved in its development?]

  3. In the survey, the State agency said [tool name] is a [read survey response A18] that flags SNAP cases at risk of payment error using data from [read survey response A20]. Does that description still seem accurate?

    a. [If no] How would you revise the description?

Data Analysis [if relevant]

  4. To determine which case characteristics the tool should focus on, how did you identify where the majority of payment errors were coming from?

    a. Tell me about any data you analyzed to determine where the errors were coming from.

[Probe: SNAP Quality Control data, vendor/contractor data, local agency data, other?]



  5. [If they analyzed data] What sort of analysis did you do on those data?

[Probe: Descriptive statistics, modeling, machine learning?]

    a. How did you decide on your analytical approach?

    b. What outcome or outcomes did you examine?

[Probe: Presence of a payment error? Payment error as a continuous measure?]

    c. How, if at all, did you test the accuracy of the analysis?

    d. What challenges arose when using those data to inform how the tool would be designed?

      i. How did you overcome those challenges?



Factors and Variables

Thank you; that background is really helpful. Now I’d like to ask about the specific information the [tool name] looks for.

  6. My understanding from the survey is that, when the [tool name] tries to identify which cases are at high risk of payment error, it looks at [read survey responses A9–A13, A15a]. Did I capture that information correctly?

    a. [If no] What information does the [tool name] look at to flag SNAP cases at high risk of payment error?

    b. Is there any documentation you could share on each factor, in terms of whether it is categorical, continuous, or measured in some other way?

[Note: Follow up on this at the end, and request any documentation]

  7. Can you recall how the decision was made to focus on those factors?

    a. Who made the decision?

    b. What, if anything, would you change about the factors the [tool name] focuses on?

    c. Can you recall how the decision was made to measure each factor categorically, continuously, or in some other way?

      i. Who made the decisions?

      ii. What, if anything, would you change about how the factors are measured?

  8. Were any factors considered for the tool that you didn’t end up using?

[If yes]

    a. Which ones?

    b. Why was it decided not to use them?

  9. Have the factors the [tool name] focuses on changed over time? If yes, how?

[Probe to understand whether changes were to focus on different factors altogether or on how those factors were measured—continuous, categorical, etc.]

    a. Why were the changes made?

[Probe to understand whether staff learned they needed to make adjustments as a result of monitoring or testing of the tool]

Testing the Tool [if relevant]

  10. [If survey response A7 NA] Tell me about how the [tool name] was tested before going live.

    a. What was being tested?

[Probe: Were they looking at the overall accuracy of the tool? Equity of the tool across subgroups? User-friendliness?]

    b. Who did the testing?

    c. Did those early tests reveal anything that needed to be fixed?

[If yes]

      i. Tell me a little about what needed to be fixed.

      ii. How long did that take to resolve?

  11. Was the tool also tested after going live? If yes, how?

[Probe: What were they looking for with those tests—accuracy, equity, user-friendliness?]

  12. Did anything need to be fixed when testing the tool after it went live?

[If yes]

    a. Tell me a little about what needed to be fixed.

    b. How long did that take to resolve?

  13. If a State wanted to explore whether the [INSERT TOOL NAME FROM A1a or A1b] flags SNAP cases at risk of a payment error in a way that unintentionally affects a particular race, ethnicity, gender, or other protected class more than others, how do you think they could go about that?

[Note: For example, a tool may disproportionately flag certain ethnic groups (e.g., Hispanic households) if it looks for households with 8+ people.]

    a. Was that something your team considered during testing? Why or why not?

[If tested for unintentional effects]

      i. How did the team go about that?

      ii. What were the findings?

      iii. What changes, if any, were made after reviewing the findings?

  14. When you think about the whole development and testing process, to what extent did the team discuss the tool’s potential for unintentionally flagging a protected class?

[If discussed]

    a. What terms did the team use to describe this concern? Unintentional bias? Something else?

    b. How did these considerations factor into the tool’s construction?

[If not discussed]

    a. Did the team consider whether the tool might be more accurate for some subgroups than others?


Tracking Implementation

  15. What data, if any, are tracked on the outcomes of the cases the [tool name] flags?

[Note: We are asking if they track any follow-up steps taken for these cases, such as efforts to find additional documentation on the household or calls to the household to ask follow-up questions]

[If track data]

    a. Who tracks those data?

    b. [If tool is used after benefit determination] Do the data indicate whether the flagged cases were actually found to have payment errors?

[If do not track data]

    a. What data would you like to track, if any?

    b. What, if anything, makes it difficult to collect data on the [tool name]?

  16. What, if anything, do you report on the cases [tool name] flags?

    a. What information is shared in these reports?

    b. Who receives these reports?

[Probe: Local office staff, State-level staff, FNS staff?]

    c. What do the recipients do with this information?

    d. Would you be able to share the latest of these reports with us?

[Note: Ask for this report again at the end of the interview]

  17. Aside from reporting, what other steps are you required to take based on the results of those reports?

[Probe to understand how these steps are carried out, when, and by whom.]

  18. We know this can be hard to determine, but have you been able to ascertain whether the tool has had an impact on payment error rates, either good or bad?

[Probe to understand whether the impact is their perception or is based on data]



Wrap-Up

This discussion has been very helpful.

  19. If you were to talk to other State agencies considering implementing a similar tool, what advice would you give them?

  20. Before we wrap up, are there any key challenges to building or implementing case-profiling tools like [tool name] that we haven’t already discussed?

We’ve reached the end of the interview. Thank you so much for taking the time to talk with us and share your experiences. The information you provided gave us valuable insights into how tools like [tool name] work.

[If applicable] I recall you mentioning that you would be willing to share [documents] with me. Those would be really helpful to see, so thank you for offering to send them. You can send them to me at [email address]. I can also set up a secure FTP site to receive the materials if the documents contain identifying information.

This information is being collected to provide the Food and Nutrition Service (FNS) with key information on case-profiling tools used by SNAP State agencies. This is a voluntary collection, and FNS will use the information to examine risk assessment tools in SNAP. This collection requests personally identifiable information under the Privacy Act of 1974. According to the Paperwork Reduction Act of 1995, an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 0584-####. The time required to complete this information collection is estimated to average 0.75 hours (45 minutes) per response. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to U.S. Department of Agriculture, Food and Nutrition Service, Office of Policy Support, 1320 Braddock Place, 5th Floor, Alexandria, VA 22306 ATTN: PRA (0584-####). Do not return the completed form to this address. If you have any questions, please contact the FNS Project Officer for this project, Eric Williams, at [email protected].
