Attachment D Pretest Summary_9-26-13

OMB: 0584-0590






U.S. Department of Agriculture

Food and Nutrition Service







Enhancing Completion Rates for SNAP

(Supplemental Nutrition Assistance Program) Quality Control Reviews



Request for Clearance

Supporting Statement and

Data Collection Instruments


Attachment D:
Results of Semi-Structured Interview and Survey Pre-Test


Project Officer: Robert Dalrymple








September 26, 2013





May 3, 2013

OMB Control Number: 0584-XXXX
Expiration Date: XX/XX/XXXX



MEMORANDUM



To:

Bob Dalrymple

FNS Project Officer



From:

Stéphane Baldi, Executive Project Director

Brittany McGill, Deputy Project Director

Meg Tucker, Task Lead



Subject:

Results of the Pre-Test to Assess the Semi-Structured Interview and Survey for the study entitled “Enhancing Completion Rates for SNAP Quality Control Reviews.”




This memorandum reports the pre-test results for the surveys and semi-structured interview protocols developed for the FNS study entitled, “Enhancing Completion Rates for SNAP Quality Control Reviews.” The purpose of the data collection instruments is to:


  • Gather information that will assess the procedures followed by SNAP Quality Control (QC) reviewers that lead to designating a case as incomplete,

  • Determine whether cases are being reviewed and processed correctly,

  • Describe the overall process of conducting a QC review at the State and regional levels, and

  • Identify the potential problems QC reviewers face when attempting to complete cases, along with possible solutions.


The pre-tests sought to measure adherence to the suggested timeframe and to determine whether questions were written clearly and appropriately captured data that were most relevant to the research questions and objectives of the study. This memorandum 1) summarizes the findings from the pre-tests of the interviews and surveys, and 2) proposes revisions to improve the instruments for future interviews and surveys.


Four instruments were pre-tested at the State level. Table 1 displays the number of instruments pre-tested by location and type of instrument. Two versions of the semi-structured interview instruments were pre-tested: a State director/supervisor version and a State quality control reviewer (SQCR) version. In addition, two versions of the survey instruments were pre-tested: a State director/supervisor version and an SQCR version. These four instruments were pre-tested with State staff from two QC offices: North Carolina (NC) and the District of Columbia (DC). The NC interviews and surveys were conducted over the phone and included a semi-structured interview with the State QC director and three surveys with SQCRs. In DC, the instruments were pre-tested in person. Although the supervisor instruments were not pre-tested due to lack of personnel availability, these instruments are similar to the director instruments, and relevant findings from the other pre-tests will be applied to the supervisor instruments. Each survey and interview was conducted in English. Respondents included six women and two men.


Table 1
Pre-Test Instruments by Location

State    SQCR Survey    SQCR Semi-Structured Interview    Director Survey    Director Semi-Structured Interview    Total
NC            3                       0                          0                          1                        4
DC            2                       1                          1                          0                        4
Total         5                       1                          1                          1                        8


Section A of this memo describes the findings from the pre-tests of the State instruments regarding the duration of the instruments and proposes minor revisions to keep the instruments within the targeted timeframe. Section B summarizes five overall themes that emerged from the pre-test and suggested areas needing revision, and proposes revisions to address those themes. Appendix A provides detailed information on proposed question-by-question revisions for each State instrument.

A. Duration of the Instruments

One of the primary purposes of the pre-test was to assess the length of time required to complete the interviews and surveys. This section describes the findings from the pre-tests of the State surveys and interviews with regard to duration and suggests ways to keep the data collection within the targeted timeframe.


The median time to complete the SQCR survey was 39 minutes. Two of the initial three survey pre-tests in NC lasted longer than the expected timeframe because reviewers felt the need to explain several of their answers, leading to anecdotal stories that further lengthened the survey. Since it is in the interest of the project to minimize the burden of these surveys and interviews on SNAP QC staff, we added language to the subsequent pre-tests in DC to remind respondents of both the survey timeframe and the answer format. After emphasizing the limited timeframe and pointing out the opportunity to provide additional comments at the end of the survey, we were able to shorten survey administration to 30 minutes.


Pre-tests of the semi-structured instruments did not stay within the expected timeframe for interview completion. We expected each interview to last approximately 60 minutes. The director semi-structured interview lasted 95 minutes, while the SQCR interview lasted 52 minutes. Although the SQCR interview did not exceed the allotted timeframe, we were unable to cover all topics of interest before having to end the interview due to the time constraints of the site visit agenda. Deviations from the time and completion schedule were related to interviewees' tendency to stray off topic from the questions at hand. While this is a common occurrence in qualitative data collection, and an advantage of a semi-structured design, it must be balanced against enough adherence to the interview instrument to collect all necessary data.


Proposed Revisions

We propose adding a sentence to the introduction of the survey explaining that the survey format includes multiple-choice and short answer questions to keep the survey as brief as possible. We noted additionally that there will be an opportunity at the end of the survey to expand on responses that cannot be summarized through the multiple-choice options.


We propose adding language to the introduction of the semi-structured instruments to remind respondents of the 60-minute timeframe and to list the topics that will be covered in the discussion. These initial steps should limit off-topic responses.


Finally, we have identified some questions in the director interview and survey that can be dropped to shorten the length of those instruments. Among these questions are those that pertain to the details of the QC review process; pre-test responses from the directors were much less detailed and informative than the same questions from the reviewers. These questions are discussed under Theme #4 below and detailed changes are suggested in Appendix A.

B. Thematic Findings Suggesting Areas for Revision

Overall, five key themes emerged from the pre-test of the State instruments that suggested areas for revisions. These themes included:


  1. Respondents’ difficulty providing numbers and estimates

  2. Variation in case characteristics and circumstances

  3. Variation in SQCR roles within QC offices

  4. The need to focus the instruments on responsibilities unique to respondents’ duties

  5. New areas of inquiry


Below we describe each of these themes and the revisions we propose implementing to improve the instruments. Details of question-level revisions are provided in Appendix A.


1. Difficulty Providing Numbers and Estimates.

On several occasions, reviewers became confused by questions that asked for numeric estimates of certain aspects of their caseloads. The most notable issue was confusion surrounding the use of percentages to describe aspects of incomplete cases. For example, one question asks, “What percentage of your SNAP QC cases refused to cooperate with the review?” This question refers to a portion of incomplete cases. The national average for completion rates is 92 percent, and both NC and DC have completion rates higher than the national average (98 percent and 95 percent, respectively). Completion rates this high translate to only one or two incomplete cases every 2 to 3 months per reviewer, and expressing these incidents, and detailed breakdowns of them, as percentages was confusing for the reviewers. Additionally, reviewers often preferred to respond to numerical questions with a range.


Questions asking for specific numbers in the reviewers’ caseload for each of the previous 3 months took a long time to answer. Reviewers from NC preferred to access their records to provide specific numbers rather than estimates of typical monthly caseloads, which added several additional minutes to the time it took to administer the survey. This question was written to ask for estimates that reviewers could provide from memory, but reviewers were uncomfortable providing estimates from up to 3 months back.


Finally, respondents had difficulty estimating the amount of time it takes to complete a case or to complete some portion of the review process. Rather than tackling cases one at a time, reviewers often multitask the case file reviews while giving backlogged cases priority. Reviewers typically manage their workload by spending relatively small increments of time on various steps in the process spread across a longer period of time, making it difficult to estimate the total time spent on a case or on individual steps in the review process. Responses to these questions varied widely and did not appear to provide reliable estimates of duration. While some respondents, for example, reported a very short amount of time to conduct the overall review, others reported the entire length of time they have to conduct the review.


Proposed Revisions

We propose revising the wording of some questions asking for numeric estimates that were problematic in the pre-tests. In some cases, we propose revising questions about percentages of incomplete cases to ask for numbers of cases. In other cases, we suggest revising the questions to ask specifically about the most recent incomplete case a reviewer coded from his or her caseload, rather than asking about trends among an individual reviewer’s incomplete cases. For questions highlighting the single most recent case a reviewer coded as incomplete, aggregate responses from the approximately 225 surveys can be analyzed to provide a current snapshot of incomplete cases overall.


To address the issue of responses to numerical questions with ranges, we recommend including additional interviewer instructions to probe for a concrete numerical value when given a range response.


We propose removing the questions that asked for specific caseload assignment numbers for each of the previous 3 review months. We would like to revise the introductory question of this sequence so that we capture the total number of cases assigned to a reviewer in the previous month, rather than the average monthly number of cases. We also propose consolidating the monthly follow-up questions into a single question: “Of these cases, how many were active, negative, or other types of reviews (other than SNAP QC)?”


Finally, we recommend omitting from the SQCR survey questions about duration of the overall review process or increments of the review process that did not produce reliable estimates of time. On the other hand, we suggest retaining the questions about the total time provided to reviewers to complete the reviews and about whether reviewers are given interim deadlines throughout the process. In addition, we would like to continue to ask the more detailed questions about the duration of the review process in the semi-structured instruments, where interviewers can better elicit the nuances of timeliness of the case review process in the context of multitasking.


2. Variation in Case Characteristics and Circumstances.

Reviewers felt compelled to clarify many of the situations that could not be captured in a single question or measure. Case completion rates and reasons varied by characteristics of the SNAP household. Cases involving homeless clients, for example, were often cited as cases that took more time and involved more complicated procedures than cases where the household has stable housing and a regular means of contact. Reviewers also commented on varying reasons for incomplete cases based on geographic location, citing differences between urban and rural households. These variations made it difficult for reviewers to provide a single summary response to questions that may encompass a wide variety of case characteristics.


Proposed Revisions

We propose adding descriptive questions in the caseload section of the SQCR survey and the SQCR semi-structured interview instruments to measure the extent to which a reviewer works with homeless households or in urban, rural, and/or suburban locations. We will examine these factors during the analysis stage to look for trends among household demographics and reviewer responses regarding incomplete cases.


3. Variation in SQCR Role.

The staffing structure of the QC office varies by State. In NC, the reviewers we interviewed were second-level analysts. These reviewers conduct reviews, but also evaluate lower-level reviewers’ cases. In DC, different grades of analysts receive different workloads. Because reviewers had multiple roles in the QC office, they often had trouble discerning the appropriate role to cite when responding to a question. Several NC reviewers began responses with introductions such as, “when I’m wearing my reviewer hat,” to indicate the different responsibilities that came with each role.


Proposed Revisions

We recommend adding language to the introduction of the SQCR survey instrument to specify that this survey asks questions about the roles of, and procedures used by, State SNAP QC reviewers, and that questions should be answered based on experiences that the reviewer has conducting case file reviews.


4. Focus on Responsibilities Unique to Respondents’ Duties.

The SQCR and director interviews and surveys devote several questions to the case file review process. The SQCR semi-structured interview respondent provided rich detail on the case file review process, offering anecdotal stories to illustrate themes and specific detail about various populations and the challenges they present. The explanations we received from SQCR survey respondents were equally detailed. The semi-structured interview with the State director, by contrast, provided more general descriptions of the case file review process that lacked the details provided by the SQCRs. Given that the semi-structured interviews are scheduled to take an hour, we feel that collecting information about the case file review process from directors may be redundant and take up limited time that directors could instead use to speak to issues unique to the director or supervisor position.


Proposed Revisions

To allow for more time and focus on interview questions that are unique to the director and supervisor position, we propose dropping questions about the case file review process from the director/supervisor interview that are redundant to the SQCR semi-structured interview.


5. New Areas of Inquiry.

Finally, two areas emerged as important concepts in the pre-test that were not adequately or explicitly addressed in the draft data collection instruments: 1) safety concerns and 2) the use of “likely conclusion” when completing a review. Both appear to be potentially important concepts for understanding complete and incomplete cases.


Safety concerns were mentioned in both the surveys and semi-structured interviews, despite not being explicitly included in the draft instruments. While several respondents explained that reviewer safety was a priority over interview completion, the SQCR semi-structured interview respondent also brought up concerns about SNAP client safety. This interview revealed that many participants feel unsafe leaving their neighborhoods to attend interviews and that certain times of the day are best for making home visits, both in terms of the reviewers’ safety and in terms of when clients are most likely to open their doors. We feel that these are significant issues that may affect case completion and that have not been adequately addressed in the draft survey and interview instruments.


The pre-test also revealed potential variation in the prevalence of using “likely conclusion” when completing reviews, an area we had not directly explored in the draft data collection instruments. The NC director’s interview, for example, suggested that use of “likely conclusion” may be the reason why some States’ completion rates are higher than others, and that training in the use of “likely conclusion” may be one method for improving completion rates.


Proposed Revisions

We suggest adding two questions to the SQCR semi-structured interview regarding safety. One question captures concerns surrounding reviewer safety and one question captures safety concerns from the perspective of the SNAP client and how those safety concerns may affect participation in the review. Similar questions are proposed for the SQCR survey.


We also recommend adding a question about whether and how often reviewers use “likely conclusion” to the SQCR survey and semi-structured interview instruments. Asking this question of all States will enable us to assess whether there is an association between the usage of “likely conclusion” and completion rates.


Appendix A: Changes to State Instruments

This appendix provides detailed information on the proposed revisions to questions in the State data collection instruments based on the findings of the pre-test. Given the length and number of the instruments, only questions affected by proposed revisions are included here. Changes include those resulting from the themes described in the memo above, as well as minor improvements to make wording consistent, to correct an errant skip pattern, etc. Appendix A.1 presents the proposed revisions to the SQCR survey, and Appendix A.2 presents the revisions to the State director/supervisor survey. Appendix A.3 presents the revisions to the SQCR semi-structured interview, and Appendix A.4 presents the revisions to the State director/supervisor semi-structured interview.





















APPENDIX A.1.
SQCR SURVEY

Section B: SNAP QC Caseload

To address the issue of instrument duration (Section A of the memo), we proposed additional language in the introduction of both surveys to remind respondents of the survey's format and to inform them that there is an opportunity to expand on responses in an open-ended question at the end of the survey. The additional language is as follows:

The survey format includes multiple choice and short answer questions to keep the survey as brief as possible. There is an opportunity at the end of the survey to expand on responses that cannot be summarized through the multiple-choice options.


To address the issue of varying roles of State QC reviewers (Theme #3: Variation in SQCR roles), we propose adding the following interviewer instructions at the beginning of this section:


Interviewer note: If you interview a reviewer who has multiple responsibilities at the QC office (e.g., a second-level analyst or another position that is not strictly a reviewer), please note that for these questions we are interested only in responses that pertain to SNAP SQCR responsibilities.


The survey pre-test revealed that former Q4 (“What percentage of your time is dedicated to SNAP QC?”) did not properly align with the preceding skip patterns. If reviewers answered that they did not conduct reviews for other programs, asking what percentage of their time was spent on SNAP QC was redundant. To address this, we moved former Q4 to Q3c, so that it is asked only of reviewers who report other program affiliations; reviewers with no program affiliation outside of SNAP QC do not answer Q3c.


3c. What percentage of your time is dedicated to SNAP QC? ___% [Enter Percent]

(4.) What percentage of your time is dedicated to SNAP QC? ___% [Enter Percent]


Reviewers did not have trouble responding to Q4 or Q4a (formerly Q5 and Q5a) with their current workloads, given that Q4 asks for an average number and Q4a asks for numbers from the last month. However, given the difficulties that arose from the follow-up questions formerly numbered Q5b-5d, we recommend re-wording questions Q4 and Q4a to capture a current snapshot of QC caseloads across the country and omitting former questions Q5b-d. The proposed new wording for Q4 and Q4a is:


4. Approximately how many SNAP QC reviews were assigned to you last month?

___ [Enter number]


4a. Of these reviews, how many are:

___ [Enter number] Active

___ [Enter number] Negative

___ [Enter number] Other types of reviews (other than SNAP QC)


(5b-d.) How many were assigned (Q5b) the previous month/ (Q5c) 2 months ago/ (Q5d) 3 months ago?

___ [Enter number] Active

___ [Enter number] Negative

___ [Enter number] Other types of reviews (other than SNAP QC)


Questions 7 and 8 are proposed new questions to address the ongoing concern that many of the procedural and incomplete-case questions did not account for situations that arose from contacting and locating households that were homeless, rural, or urban (Theme #2: Variation in Case Characteristics and Circumstances). Adding these descriptive caseload questions will allow the analysis team to compare reviewer responses with certain household demographics. These questions also will act as a reference point for the interviewer if a respondent hesitates to provide a single response because of variation in his or her caseload; the interviewer can note that the survey already captures the amount of time the reviewer spends with each demographic and that this will be considered when tabulating responses.


7. In geographic terms, what is the primary type of population you most often work with?

a. Urban

b. Rural

c. Suburban

d. A mix of urban and suburban

e. A mix of rural and suburban

f. All of the above


8. In a typical month, how many of your households are homeless?

___Enter number


Section C1: Overview of QC Active Case Review Procedures

After reviewing the study’s research questions, we revisited the value of asking former questions 16-18, which refer to the time it takes to complete a case. Respondents appeared to have difficulty estimating the amount of time actually spent on a case, which is typically small increments of time spread over a much longer period of time (Theme #1: Difficulty Providing Numbers and Estimates). We found that these questions do not appear to produce reliable responses and therefore add little value to the overall findings of the study. Questions 12 and 13 (formerly Q11 and Q12), however, appear to provide more reliable estimates of case file review timeframes. As a result, we propose keeping questions 12 and 13 (below) but omitting former questions 16-18 in the revised instrument.


12. How many days are you given to complete SNAP QC reviews and submit them as final?

a. <60

b. 60-75

c. 75-95

d. 95+


13. Are you given interim deadlines throughout the process that you are required to meet?

a. No

b. Yes


(16.) On average, how long does it take you to locate and contact a household?

___Days


(17.) On average, how long does it take you to complete a household interview once the client agrees to participate?

___Minutes


(18.) On average, how long does it take you to conduct an entire QC review from the time the case is assigned to you to the time that you are no longer responsible for it?

___Days


We also propose adding two questions to this section as a result of the safety concerns voiced during the pre-tests (Theme #5: New Areas of Inquiry). Both reviewers and clients face safety concerns when conducting reviews and arranging interviews. The issue of safety was not adequately or directly addressed in the draft instruments; however, it appears to be an important element to consider to fully understand complete and incomplete cases. As a result, we recommend adding questions to both SQCR instruments to capture reviewers’ experiences regarding safety concerns from both the reviewer and household perspective. The questions proposed below could be added to the “Overview of QC Active Case Review Procedures” section:


16. How often do concerns for your safety affect your ability to contact or locate a household when conducting a review?

a. Often

b. Sometimes

c. Rarely

d. Never


17. How often do SNAP clients express safety concerns that hinder the ability to complete a review?

a. Often

b. Sometimes

c. Rarely

d. Never


Section C2: Locating and Contacting Households

Each of the responses to Q19 (question number unchanged) fell below the smallest response option. In response, we lowered the percentage ranges in the response options to better reflect the answers given during the pre-test. Notably, reviewers did not have trouble providing percent-based answers regarding their entire caseloads; confusion about percentages arose later in the survey, when they were asked about incomplete cases, which make up a much smaller share of their caseloads (Theme #1: Difficulty Providing Numbers and Estimates).


19. What percent of cases do you successfully contact on your first attempt?

a. <50% (formerly <85%)

b. 50-70% (formerly 85-90%)

c. 70-90% (formerly 90-95%)

d. 90+% (formerly 95+%)


Former Q21 was duplicative of former Q16. Both questions were removed because we found that they did not produce reliable responses (Theme #1: Difficulty Providing Numbers and Estimates).


(21.) How long, on average, does it take you to complete the household interview once the client agrees to participate?

____Days


We changed the wording of Q21 (formerly Q22) to capture a numerical value rather than a percentage as this provides more reliable data (Theme #1: Difficulty Providing Numbers and Estimates).


21. During a review period, how many households selected for QC review cannot be located?

___ [Enter number]


For wording consistency, we suggest changing the question and response in Q21a (formerly Q22a) from “remained constant” to “stayed the same,” to mirror similar questions and responses in the survey. We also added a response option to capture the incidence of never having been unable to locate a household.


21a. Has that number increased over time, decreased over time, or stayed the same?

a. Increased over time

b. Decreased over time

c. Stayed the same

d. Never been unable to locate a household


Reviewers had difficulty comprehending the wording in Q22 (formerly Q23). We recommend rewriting this question as follows to provide more clarity.


22. The FNS 310 Handbook mentions that reviewers must make a minimum of two follow-up attempts to contact a household. To what extent are QC reviewers in your State encouraged to go beyond that two-step minimum?

a. Additional steps are strongly encouraged (Skip to Q23)

b. Additional steps are mildly encouraged (Skip to Q23)

c. Additional steps are not encouraged


Several reviewers reported not having or wanting to use a cell phone as the reason that they do not use text messaging to contact households. We recommend including this response as an option for Q26a (formerly Q28a).


26a. Please indicate why you do not use text messaging. (Please select all that apply.)

a. My manager discourages this method

b. This method is too time-consuming

c. This method requires too many resources

d. Clients may not be able to receive text messages

e. This is not an approved contact method according to State or Federal procedures

f. I don’t have or want to use a cell phone

g. Other reason


To maintain consistency among the order of responses, we suggest reordering the “Yes” and “No” responses in Q27 (formerly Q29) and Q28 (formerly Q30) to mirror similar questions in the survey.


27. When locating a household, I reach out to neighbors.

a. Yes (Skip to Q28)

b. No


28. When locating a household, I reach out to additional collateral contacts like the U.S. Post Office, DMV, property manager, etc.

a. Yes (Skip to Q29)

b. No


Questions 32 and 33 (formerly 34 and 35) refer to specific situations that apply to incomplete cases and ask reviewers to quantify those situations as percentages. Reviewers responded that incomplete cases are rare, and that breaking down the number of such cases by situation made it very difficult to express the quantities in percent form (Theme #1: Difficulty Providing Numbers and Estimates). After observing the difficulty the NC SQCRs had with percentages, we changed the wording of questions 32, 32a, 33, and 33a (formerly Q34, 34a, 35, and 35a) for the DC pre-tests to frame the questions in terms of the previous month only and to request a response in numerical form rather than a percentage. This approach proved much more successful. We also added response options to questions 32a and 33a (formerly Q34a and 35a) to capture the incidence of never having had a “refusal,” a “failure to cooperate,” or an incomplete case, respectively.


32. Last month, how many of your SNAP QC cases refused to cooperate with the review? (Note that we are talking specifically about refusals, not failure to cooperate.)

___ [Enter number]


32a. Since you’ve been doing SNAP QC reviews, has that number increased, decreased, or stayed about the same?

a. Increased

b. Decreased

c. Stayed about the same

d. Never had a “refusal to cooperate”


33. Last month, how many of your SNAP QC cases failed to cooperate? (Note we are not talking about refusals here.)

___ [Enter number]


33a. Since you’ve been doing SNAP QC reviews, has that number increased, decreased, or stayed about the same?

a. Increased

b. Decreased

c. Stayed about the same

d. Never had a “failure to cooperate”


We propose adding a question to capture the frequency with which a reviewer used “likely conclusion” to complete a case (Theme #5: New Areas of Inquiry). Reviewers in both pre-test States recommended this addition to the survey given that “likely conclusion” is a technique used frequently in some States, and may enable some States to achieve higher completion rates than States not frequently using it. Asking this question of reviewers in all States will enable us to examine whether such a relationship exists between completion rates and the use of “likely conclusion.”


34. How often do you use “likely conclusion” to complete a case?

a. Never

b. 1-5% of the time

c. 6-10% of the time

d. 11-20% of the time

e. 21-30% of the time

f. >30% of the time




Section D: Incomplete Cases

Section D focuses on incomplete cases. As previously mentioned, incomplete cases made up a very small portion of the caseloads of the reviewers we pre-tested. These reviewers found it difficult to quantify many aspects of their incomplete cases in terms of percentages (Theme #1: Difficulty Providing Numbers and Estimates) and also had some difficulty recalling reasons for incomplete cases, especially because in several instances it had been months since a reviewer last coded a case as incomplete (and some reviewers never had). We recommend addressing this issue by asking for a numerical value instead of a percentage and focusing on either the previous month or the most recent incomplete case. This change is recommended for Q35 (formerly Q36) and Q37 (formerly Q38).


35. Last month, how many of your SNAP QC cases did you code as incomplete?

___ [Enter number]


We adjusted Q35a (formerly 36a) to mirror the questions in the previous section to accommodate a response for the reviewer who has never had an incomplete case.


35a. Has that number increased, decreased, or stayed the same since you have been doing SNAP QC reviews?

a. Increased

b. Decreased

c. Stayed the same

d. Never had an incomplete case


Questions 37 through 38a (formerly 38-39a) asked for the first and second most common reasons that reviewers coded cases as incomplete. Due to the small number of incomplete cases and reviewers’ difficulty with the follow-up questions regarding the percent of incomplete cases coded for this reason (Theme #1: Difficulty Providing Numbers and Estimates), we recommend simplifying the two questions by asking about the reason for the last incomplete case in a reviewer’s caseload. We will rely on aggregate answers from the approximately 225 surveys to obtain a snapshot of the reasons for incomplete cases overall.


37. What was the reason that your most recent incomplete case was coded as incomplete?

a. The case file record could not be found

b. The household could not be located

c. Failure to cooperate (e.g., the client made an initial effort but was unable to coordinate a meeting with you.)

d. Refusal to cooperate

e. The reviewer was unable to arrive at a likely conclusion

f. Has never had an incomplete case


(38a.) What percentage of incompletes fits this reason?

___%


(39.) What is the second most common reason that cases are coded as incomplete?

a. The case file record could not be found

b. The household could not be located

c. Failure to cooperate (e.g., the client made an initial effort but was unable to coordinate a meeting with you.)

d. Refusal to cooperate

e. The reviewer was unable to arrive at a likely conclusion


(39a.) What percentage of incompletes fits this reason?

___%


Section E1: Training and Tools

The range of response options in Q40 (formerly Q42) did not allow for responses when training is received less often than semi-monthly but more often than once a year. To address this, we propose adding “training is ongoing and is conducted twice a year” to the response options.


40. How frequently do you receive training? Would you say___?

a. Training happened once, when I started the job

b. Training is ongoing and is conducted weekly, monthly, or semi-monthly

c. Training is ongoing and is conducted twice a year

d. Training is ongoing and is conducted annually

e. Training is conducted on an as-needed basis with no set schedule


We recommend changing the answer format for Q41, Q42, and Q44 (formerly Q43, Q44, and Q46) from a “select all that apply” method to a yes/no response for each item, which is easier for respondents. This technique will prevent the interviewer from having to repeat the list of response options to the respondent multiple times. We also added a response option for “likely conclusion” as a means of case completion, as our pre-tests indicated that this method was used often for case completion in one State.


41. From the following list, please tell me whether the following topics are covered during training:

a. SNAP eligibility (Y/N)

b. Procedural aspects of QC review (Y/N)

c. Interview techniques (Y/N)

d. Household location techniques (Y/N)

e. State-specific policy, including options and waivers (Y/N)

f. Likely conclusion as means of case completion (Y/N)

g. Other (Y/N)


42. From the following list, please tell me the format of the training.

a. Formal in-person training (Y/N)

b. Online independent tutorial or module (Y/N)

c. Online group webinar (Y/N)

d. Conference call (Y/N)

e. Informal meetings, such as a staff meeting (Y/N)

f. Peer mentoring (Y/N)

g. Written materials for individual study (Y/N)

h. Other (Y/N)


In addition to changing the answer format for Q44 (formerly Q46), we propose adding “conference call” to the response options, as several reviewers cited this as a method of training.


44. From the following list, please tell me how reviewers are trained or alerted when there is a change in Federal or State policy that affects SNAP eligibility or allotment determination.

a. Email alert (Y/N)

b. State manual page change (Y/N)

c. Memo (Y/N)

d. Formal training/meeting with staff (Y/N)

e. Conference call (Y/N)

f. Other (Y/N)

g. We are not notified (Y/N)


Reviewers from the DC office could not remember who from the regional office led training. We recommend addressing this by reducing the specificity of the regional office positions listed. The response options, “A QC reviewer from the regional office,” and “A QC director or supervisor from the regional office,” may be replaced with, “A staff member from the regional office.”


43. Who leads the instructor-led trainings?

a. A senior staff member from the State QC office (Y/N)

b. A peer QC reviewer from the State office (Y/N)

c. A staff member from the regional office (Y/N)

d. A contractor (Y/N)

e. No instructor-led training


We included an additional question, Q47, to give SQCRs the opportunity to recommend what types of training may be helpful to them.


47. Please tell me whether additional training on each topic would be helpful.

a. SNAP eligibility (Y/N)

b. Procedural aspects of QC review (Y/N)

c. Interview techniques (Y/N)

d. Household location techniques (Y/N)

e. State-specific policy, including options and waivers (Y/N)

f. Likely conclusion as means of case completion (Y/N)

g. Other (Y/N)


We propose adding “or materials” to provide some additional context to Q52 (formerly Q53). This prompt was helpful to a DC reviewer who was unsure what Q52 (Q53) was asking.


52. Are there additional tools or materials (either paper or electronic) available to you when you are conducting SNAP QC reviews?

a. Yes

b. No


Section F1: Attitudes Toward Completion Rates


We received a broad range of answers for former question 54, “What do you think is an achievable target for your monthly completion rate?” Although the national completion rate is just under 93 percent, some reviewers gave us responses for an achievable target as low as 40 percent and as high as 100 percent. We felt this range of answers illustrated that reviewers may not be familiar with how completion rates are defined and that this question may be of greater value at the director and supervisor level. As a result, we struck former question 54 from both the SQCR survey and the semi-structured interview.


(54.) What do you think is an achievable target for your monthly completion rate?

Enter %____


Section F1: Accountability

To facilitate the flow of the interview we propose adding the following preface to the questions in Section F1:


Next, I’m going to ask you a few questions about your attitudes and opinions about doing SNAP QC reviews.


To facilitate a more efficient response time, we propose adding the following preface to questions 53-65:


For each of the following statements, please tell me whether you strongly agree, agree, disagree, or strongly disagree.

Appendix A.2.
STATE DIRECTOR/SUPERVISOR SURVEY

The DC director survey took 30 minutes to complete. The survey was administered after the Insight team discussed adding language to the introduction about maintaining the allotted timeframe and allowing for comments at the end of the survey, in order to address the concerns about the duration of the instrument. The following language was added to the instructions prior to this pre-test:


The survey format includes multiple choice and short answer questions to keep the survey as brief as possible. There is an opportunity at the end of the survey to expand on responses that cannot be summarized through the multiple-choice options.


Section B: SNAP QC Caseload

Question 6 mirrors the question from the SQCR survey regarding the number of cases assigned to reviewers last month. In order to remain consistent with the changes to the SQCR survey, we recommend adjusting the language in the director survey prior to the pre-test, focusing on the last month only. The language also reflects the average case assignment, given what we learned about different levels of reviewers affecting case assignments (Theme #1: Difficulty Providing Numbers and Estimates). We propose omitting questions 6b-d.


6. Approximately how many SNAP QC cases were assigned to any one reviewer at a time last month?

___ [Enter number]


6a. Of these cases, how many were:

___ [Enter number] Active

___ [Enter number] Negative

___ [Enter number] Other types of reviews (other than SNAP QC)


(6b-d.) How many were assigned (6b) the previous month/ (6c) 2 months ago/ (6d) 3 months ago?

___ [Enter number] Active

___ [Enter number] Negative

___ [Enter number] Other types of reviews (other than SNAP QC)


Similar to the reviewer responses, the director found it difficult to describe typical procedures without describing how cases vary due to homelessness. DC is unique in that it is the only State that is completely urban, but it is reasonable to assume that a director in a rural or suburban State might make similar comments regarding regional demographics within the State. To address this situation in the SQCR survey, we propose asking each reviewer to indicate the proportion of his or her caseload that is homeless and the proportion of his or her caseload that is rural, urban, suburban, or a mix of these categories (Theme #2: Variation in Case Characteristics and Circumstances). While we considered a similar approach in the director survey to measure the demographics of the State, we understand that the State director may not be privy to the specifics of the average reviewer’s caseload makeup in terms of demographics or location, particularly in States where reviewers are home-based or otherwise away from the main State office. As a result, details about State regional demographics are not proposed for the director instruments.


Section C1: Training

We recommend changing the answer format for Q10a, Q10b, and Q16 (same question numbers in current and former versions) from a “select all that apply” method to a yes/no response for each item, which is easier for respondents. This technique will prevent the interviewer from having to repeat the list of response options to the respondent multiple times. We also added a response option for “likely conclusion” as a means of case completion, as our pre-tests indicated that this method was used often for case completion in one State.


10a. From the following list, please tell me whether the following topics are covered during training:

a. SNAP eligibility (Y/N)

b. Procedural aspects of QC review (Y/N)

c. Interview techniques (Y/N)

d. Household location techniques (Y/N)

e. State-specific policy, including options and waivers (Y/N)

f. Likely conclusion as means of case completion (Y/N)

g. Other (Y/N)


10b. From the following list, please tell me the format of the training.

a. Formal in-person training (Y/N)

b. Online independent tutorial or module (Y/N)

c. Online group webinar (Y/N)

d. Conference call (Y/N)

e. Informal meetings, such as a staff meeting (Y/N)

f. Peer mentoring (Y/N)

g. Written materials for individual study (Y/N)

h. Other (Y/N)


In addition to changing the answer format for Q16 (same question number in current and former versions), we propose adding “conference call” to the response options, as several reviewers cited this as a method of training.


16. From the following list, please tell me how SNAP QC reviewers are trained or alerted when there is a change in Federal or State policy that affects SNAP eligibility or allotment determination.

a. Email alert (Y/N)

b. State manual page change (Y/N)

c. Memo (Y/N)

d. Formal training/meeting with staff (Y/N)

e. Conference call (Y/N)

f. Other (Y/N)

g. We are not notified (Y/N)


The range of response options in Q13 (same question number in current and former versions) did not allow for responses when training is received less often than semi-monthly but more often than once a year. To address this, we propose adding “training is ongoing and is conducted twice a year” to the response options.


13. How frequently do SNAP QC reviewers receive training? Would you say___?

a. Training happens once, when they start the job

b. Training is ongoing and is conducted weekly, monthly, or semi-monthly

c. Training is ongoing and is conducted twice a year

d. Training is ongoing and is conducted annually

e. Training is conducted on an as-needed basis with no set schedule


Section C2: Tools

We propose adding “or materials” to provide some additional context to Q21 (same question number in current and former version). This prompt was helpful to a DC reviewer who was unsure what Q21 was asking.


21. Are there additional tools or materials (either paper or electronic) available to SNAP QC reviewers when they are conducting the reviews?

a. No

b. Yes


Section D2. Procedures

The director/supervisor semi-structured interview instrument asked a question about engaging a third-party consultant. During the pre-test, we found that this question yielded valuable information. We added this question to the director/supervisor survey for consistency and to ensure that we collect this information from all States.

34. Has your State ever engaged an outside party (e.g., a consultant) to review QC policies and procedures and recommend changes?

a. No

b. Yes

34a. How effective was this?

a. Very effective

b. Somewhat effective

c. Only a little effective

d. Not at all effective



Section E: Incomplete Cases

Similar to the SQCR survey, the director found the wording for Q41 (formerly Q40) confusing. To address this, we recommend changing the wording to be consistent with the revision in the SQCR survey.


41. The FNS 310 Handbook mentions that reviewers must make a minimum of two follow-up attempts to contact a household. To what extent are QC reviewers in your State encouraged to go beyond that two-step minimum?

a. Additional steps are strongly encouraged (Skip to Q42)

b. Additional steps are mildly encouraged (Skip to Q42)

c. Additional steps are not encouraged


The director was unable to answer former questions Q41-42a, which asked about the reasons that reviewers could not locate households. While the DC director did not know this level of detail, the NC director was able to cite several reasons that align with the responses in this question, so we feel that the DC director’s response may not be representative of all State QC directors. However, in the interest of both accuracy and time, we felt it prudent to recommend removing these questions from the director instrument.


(41.) What is the most common reason SNAP QC reviewers cannot locate a household?

a. The household has moved and left no forwarding address

b. The household is homeless or did not otherwise have a reliable fixed physical location

c. The address is fake or incorrect

d. The household does not have a working phone

e. The household is avoiding contact (but not actively refusing to cooperate)

f. Collateral contacts are not cooperative

g. Other

h. Do not know why (Skip to Q43)


(41a.) What percent of the time do you think this reason accounts for not being able to locate a household?


(42.) What is the second most common reason QC reviewers cannot locate a household?

a. The household has moved and left no forwarding address

b. The household is homeless or did not otherwise have a reliable fixed physical location

c. The address is fake or incorrect

d. The household does not have a working phone

e. The household is avoiding contact (but not actively refusing to cooperate)

f. Collateral contacts are not cooperative

g. Other

(42a.) What percent of the time do you think this reason accounts for not being able to locate a household?

____%


Section F1: Attitudes toward Completion Rates

We pre-tested a question to both SQCRs and directors regarding what was perceived as an achievable monthly completion rate. In Appendix A.1 of this memo, we discuss removing this question from the SQCR instruments because we found that it was more appropriately answered by senior-level staff. During the pre-test, we noticed confusion surrounding the reference to a “monthly” completion rate, since 1) completion rates are often referred to as an annual calculation and 2) “monthly” was interpreted as both a 30-day time period and as a full review period (115 days). To eliminate confusion, we removed the time reference in this question. Q42 (formerly Q43) now reads:

42. What do you think is an achievable target for your office’s SNAP QC completion rate?

Enter %_____

Section F5: Perception of Challenge

The DC director did not know the answer to the former question 57. This question assumes that directors have access to details about incomplete cases from other States and this assumption may be faulty. To address this, we suggest omitting the question.


(57.) There are more clients in my State who fail to cooperate with the SNAP QC process than in most other States.

a. Strongly Agree

b. Agree

c. Disagree

d. Strongly Disagree


Appendix A.3.
SQCR SEMI-STRUCTURED INTERVIEW

Introduction

As described in the memo, our greatest challenge in conducting the semi-structured interview was keeping the interview focused on the questions in the instrument and within the targeted timeframe. While the instrument is semi-structured by design and we did not want to discourage relevant stories related to the QC process, we found that respondents frequently deviated from the topic and had to be directed back to the question at hand. To create a general expectation of a schedule and the topics that we plan to cover in the interview, we recommend emphasizing in the introduction to the interview that we expect the interview to take about 60 minutes and describing the list of topics to be covered in the discussion.


The proposed changes to the interview introduction are written in italics below:


Introduction: Thank you for taking the time to talk with me today. I’d like to tell you a little bit about why we are doing these interviews. We are talking to State agencies that conduct SNAP Quality Control (QC) reviews to help FNS understand the factors that lead to incomplete SNAP QC cases and recommend ways to improve case completion rates in the future. In particular, we are interested in learning more about the incomplete cases, not the cases deemed Not Subject to Review. I will be asking you some questions to assess the process. We expect this interview to take about an hour to complete. We have a lot to talk about, so I’m going to list for you the different sections of this interview. We’ll talk about your experiences as a QC reviewer, the SNAP QC caseload, the SNAP QC review procedures in your State, and then we’ll go into some details about incomplete cases. After that, we’ll talk about your State’s relationship with the Regional office, tools and materials you use during your reviews, as well as the training you received, and then we’ll talk briefly about challenges and solutions involved in the QC process. We’ll wrap up the interview with a short questionnaire about attitudes related to SNAP QC completion rates. If it’s okay with you, I would like to record the interview so that I don’t miss anything. Is that all right?


Section B: SNAP QC Caseload

To address the issue of varying roles of QC reviewers (Theme #3: Variation in SQCR Roles), we propose adding the following interviewer instructions at the beginning of this section:


Interviewer note: On the occasion that you interview a reviewer who has multiple responsibilities at the QC office (e.g., a second-level analyst or other position that is not strictly a reviewer), please indicate that for these questions, we are only interested in responses that pertain to SQCR responsibilities.


We noticed similar difficulties with the question wording regarding the number of reviews assigned to reviewers each month. We recommend changing the introductory question of this section to reflect the number of reviews assigned last month, rather than asking for an estimate of reviews from each of 3 prior months (Theme #1: Difficulty Providing Numbers and Estimates). This change in language will decrease guessing and estimation, as well as shorten the interview to keep within the intended timeframe.


Approximately how many SNAP QC cases were assigned to you at a time last month?

___ [Enter number]


Of these reviews, how many were:

___ [Enter number] Active

___ [Enter number] Negative

___ [Enter number] Other types of reviews (other than SNAP QC)


As with the survey, we feel that it is in the interest of time management to remove the questions that ask reviewers for the number of cases assigned to them each month for the last 3 consecutive months (Theme #1: Difficulty Providing Numbers and Estimates). When this question was pre-tested using the survey, respondents had to stop the survey to access their files to retrieve numbers. To avoid disruptions to the interview, we propose eliminating these questions altogether. [As a follow-up to, “Approximately how many reviews were assigned to you last month?”]


___Active SNAP QC reviews per month; ___# of cases for current month assignment; ___# of cases from previous month assignment: ___# of cases from 2 months ago assignment; ___# of cases from 3 months ago assignment


___Negative SNAP QC reviews per month; ___# of cases for current month assignment; ___# of cases from previous month assignment: ___# of cases from 2 months ago assignment; ___# of cases from 3 months ago assignment


___Other types of reviews (other than SNAP QC) per month; ___# of cases for current month assignment; ___# of cases from previous month assignment: ___# of cases from 2 months ago assignment; ___# of cases from 3 months ago assignment


We suggest adding the following questions to address the ongoing concern that many of the procedural and incomplete case-related questions did not account for the situations that arose from contacting and locating households that were homeless, rural, or urban (Theme #2: Variation in Case Characteristics and Circumstances). Adding these descriptive caseload questions will allow the analysis team the opportunity to compare reviewer responses to certain household demographics.


In geographic terms, what is the type of population you most often work with (i.e., urban, rural, suburban, mix)?


How much of your caseload is homeless?


Section C: State SNAP QC Review Procedures

The semi-structured interview pre-test provided detailed information about some of the safety concerns that both reviewers and households face when conducting reviews and arranging interviews. We recommend adding questions to both SQCR instruments to capture reviewers’ experiences regarding safety concerns faced by both reviewers and clients (Theme #5: New Areas of Inquiry). We propose adding the following question to the State SNAP QC Review Procedures section:


How do your safety concerns play into contact and interview procedures?


Do SNAP households express concerns about safety that may affect the review process?


We revisited questions in the semi-structured interview that asked reviewers to quantify reasons for incomplete cases using a percentage. Reviewers responded that incomplete cases are rare, and that breaking the cases down further by situation made it very difficult to express the quantity as a percentage (Theme #1: Difficulty Providing Numbers and Estimates). In order to reduce confusion regarding the use of percentages and to maintain consistency with the survey, we propose the following changes:


Has the number who fail to cooperate increased, decreased, or stayed pretty constant since you’ve been doing SNAP QC reviews? [If increased, why do you think that is the case?]


Section D: Incomplete Cases

Similarly, we propose the following change to the question about incomplete cases:


Approximately how many incomplete cases fit each of those reasons?


We propose adding the following question about “likely conclusion” to capture the frequency with which a reviewer uses “likely conclusion” to complete a case (Theme #5: New Areas of Inquiry). Reviewers in both States recommended this addition to the survey given that “likely conclusion” is a frequently used technique in some States. We recommend this addition to both the SQCR survey and the semi-structured interview.


How often do you use likely conclusion to complete a case?

Section G: Training

We added the following question to mirror the language in the SQCR survey and to provide SQCRs the opportunity to comment on which types of additional training may be useful.


From the following list, please tell me whether additional training on each topic would be helpful.

    • SNAP eligibility

    • Procedural aspects of QC review

    • Interview techniques

    • Household location techniques

    • State-specific policy, including options and waivers

    • Use of likely conclusion

    • Other, please explain.

Appendix A.4.
STATE DIRECTOR/SUPERVISOR SEMI-STRUCTURED INTERVIEW

As described in the memo, our greatest challenge in conducting the semi-structured interview was keeping the interview focused on the questions in the instrument and within the targeted timeframe. While the instrument is semi-structured by design and we did not want to discourage relevant stories related to the QC process, we found that respondents frequently deviated from the topic and had to be directed back to the question at hand. To create an expectation of a schedule and the topics that we plan to cover in the interview, we recommend emailing each State office, prior to the scheduled interview, a list of topics we will cover in the interview. We will also stress in this email and in the introduction to the interview that we expect the interview to take about 60 minutes to complete.


The proposed changes to the interview introduction are written in italics below:


Introduction: Thank you for taking the time to talk with me today. I’d like to tell you a little bit about why we are doing these interviews. We are talking to State agencies that conduct SNAP Quality Control (QC) reviews to help FNS understand the factors that lead to incomplete SNAP QC cases and recommend ways to improve case completion rates in the future. In particular, we are interested in learning more about the incomplete cases, not the cases deemed Not Subject to Review. I will be asking you some questions to assess the process. We expect this interview to take about an hour to complete. If it’s okay with you, I would like to record the interview so that I don’t miss anything. Is that all right?

Section B: SNAP QC Caseload

We replaced the wording of the questions regarding caseload to mirror the wording in the SQCR instruments, following the same reasoning reflected in Theme #1: Difficulty Providing Numbers and Estimates.


How many cases are assigned to any one reviewer at a time?


___Active SNAP QC reviews per month; ___# of cases for current month assignment; ___# of cases from previous month assignment: ___# of cases from 2 months ago assignment; ___# of cases from 3 months ago assignment


___Negative SNAP QC reviews per month; ___# of cases for current month assignment; ___# of cases from previous month assignment: ___# of cases from 2 months ago assignment; ___# of cases from 3 months ago assignment


___Other types of reviews (other than SNAP QC) per month; ___# of cases for current month assignment; ___# of cases from previous month assignment: ___# of cases from 2 months ago assignment; ___# of cases from 3 months ago assignment


Have workloads changed over the last 5 years? How? Why?


Approximately how many SNAP QC reviews were assigned to a typical reviewer last month?

______ [Enter number]


Of these reviews, how many are:

____ [Enter number] Active

____ [Enter number] Negative

____ [Enter number] Other types of reviews (other than SNAP QC)


Section C: Training

The first question in Section C asks what types of training are provided to teach reviewers to conduct SNAP QC reviews. Two sections of follow-up questions probe for details about the topics covered in training and who conducts the training. These questions are included in all other instruments, including the SQCR survey and semi-structured instruments. We propose removing these questions to shorten the director interview and to reduce redundancy in data collection.


What types of training are done to teach QC reviewers how to conduct SNAP QC reviews?

What topics are covered in this training? Certification? How to locate and contact a household? How to locate and work with collateral contacts? How to conduct the field interview? How and when to code a case as incomplete? State policy related to certification and allotment, such as waivers and options? Something else?

Who conducts the training(s)? Is the training online, in-person by instructor, or both? For instructor-led training, is the instructor a senior staff QC reviewer or contractor? A regional QC staff member? Someone else?


When do SNAP QC reviewers receive training? (When they start the job? On an ongoing basis? When needed? When something new comes up? Annually?)

Has the amount of training changed over time?

Has the content of training changed over time?


How are reviewers trained when there is a change in Federal or State policy that affects eligibility or allotment determination?


Section D: QC Review Procedures

With the exception of one question, we felt that the questions asked in the “QC Review Procedures” section and the questions asked in the SQCR semi-structured interview were redundant. Having spoken with six SQCRs during the pre-testing process, we found that the information shared with the research team regarding procedures, techniques for review completion, and barriers to contacting and locating households was far more detailed when provided by a reviewer than by a director. Both directors we spoke to during the pre-test process gave us generalizations about the review process. One director was unable to answer several survey questions pertaining to reviewer techniques and deferred to the reviewers to provide us with responses. As shown below, we propose dropping the “QC Review Procedures” section from the director/supervisor semi-structured instrument in favor of allowing more time for questions specific to the director and supervisor experience (Theme #4: Focus on Responsibilities Unique to Respondents’ Duties).


How many days do reviewers have to review a case?

What happens to the review if it is not completed in that time?

Are there any interim deadlines established for completing the reviews?


On average, how long does it take to conduct a SNAP QC review (from the time a case is assigned to a reviewer to the time the outcome of the review is transmitted)?

Has this time increased, decreased, or stayed the same over time? Why?


What are the general steps taken by SNAP QC reviewers to locate and contact households selected for review? (Probe if necessary: Mail letter? Use certified mail? Call landlines or cell phones? Use email? Use text messaging? Use internet? Reach out to neighbors or other collateral contacts? Talk to building managers? Talk to the post office?)


Have these steps changed over time? How? Why?


On average, how long does it take to locate and contact a household?

Has this time increased, decreased, or stayed the same over time? Why?


What problems do QC reviewers have locating and contacting households?


Are there any additional strategies (outside of the ones just mentioned) that reviewers use to locate and contact households?

What is the most effective strategy?


What strategies are used to convince households to cooperate with the review? What is the most effective strategy? (Probe if necessary: Offer flexible times to meet, including evenings and weekends; offer alternative locations to meet; notify the household that failure to cooperate could affect their benefits; enlist the help of the caseworker; offer to help with childcare)


What are the reasons that households fail to cooperate? Just to clarify, I’m specifically interested in households that fail to cooperate, not those that refuse to cooperate.


Have you seen an increase in the number (or percent) of households that fail to cooperate? If so, why do you think that is the case?


What about the cases that refuse to cooperate—have they increased, decreased, or stayed pretty constant since you’ve been doing SNAP QC reviews? [If increased, why do you think that is the case?]


What are the general steps taken to conduct the field interview once the household agrees to participate? (Probe if necessary: Where do you typically meet? How long does it take? What questions do you ask? What documentation do you request? Any other steps?)

Are the same steps taken for every case?

Have these steps changed over time? How? Why?


Section H: Challenges and Solutions

The issue of consultants arose during the SQCR semi-structured interview pre-test. We think this may be a topic worth pursuing with other States as well, and the director may be the most appropriate respondent given his or her responsibilities. We propose adding the following question to Section H on “Challenges and Solutions.”


Has your State ever engaged an outside party (e.g., a consultant) to review QC policies and procedures and recommend changes? How effective was this?


1 The questions and responses that we report on in this memo relate to tasks specific to SNAP QC only and not to other QC programs. Although the SQCRs participating in this pre-test conducted QC reviews only for the SNAP program, SNAP QC reviewers in other States may conduct reviews for other programs as well, such as child support or Medicare/Medicaid. The data collection instruments, however, ask SQCRs to indicate whether they have QC tasks other than SNAP QC.

2 Revisions resulted in changes to question numbers. This memo refers to questions using their current numbers, unless it is necessary to refer to a question that was revised. In these cases, we refer to the revised question with the current question number and the former number in parentheses.
