
U.S. Department of Agriculture

Food and Nutrition Service







Enhancing Completion Rates for SNAP

(Supplemental Nutrition Assistance Program) Quality Control Reviews



Request for Clearance

Supporting Statement and

Data Collection Instruments


Attachment C.2:

Response to the Review by the

USDA National Agricultural Statistics Service (NASS)


Project Officer: Robert Dalrymple










September 26, 2013





OMB Control Number: 0584-XXXX
Expiration Date: XX/XX/XXXX






Attachment C.2:
Response to the Review by the USDA National Agricultural Statistics Service (NASS)


June 26, 2013


MEMORANDUM



To:      FNS NASS
         Bob Dalrymple, Project Officer, FNS
         Lynette Williams, PRAO Branch Chief, FNS

From:    Stéphane Baldi, Executive Project Director, Insight
         Brittany McGill, Deputy Project Director, Insight

Subject: Enhancing Completion Rates for SNAP QC Reviews: Response to NASS Comments





This memorandum provides Insight’s response to the NASS comments on the OMB package prepared for the FNS study, “Enhancing Completion Rates for SNAP Quality Control (QC) Reviews.” We have reviewed these comments thoroughly and have organized our responses into three areas: 1) areas where we agree with the comments and will address them accordingly; 2) responses to specific questions posed in the review; and 3) areas where we believe the reviewer may have misunderstood certain aspects of the study and its methods.


  1. Areas of Agreement


The review raised several questions related to Parts A and B of the OMB package. We agree with the reviewer that the language describing the justification for the study can be clarified. The reviewer’s summary of that justification on page 2 is reasonable, and Insight proposes to revise this portion of the supporting statement accordingly to clarify these points.


In the comments on Part B, the reviewer asks whether staff who have recently left their positions should be included in the study frame. We share this concern and will revise the text of the supporting statement to clarify that we do not plan to include these individuals in the frame.


  2. Responses to Specific Questions


In addition, the review raises several questions, to which we provide the following responses:


The first paragraph of “Part A—Other Questions” asks whether the QC re-review component of the study is adequately covered under a previous OMB approval. The materials used during the re-review process are the same as those used by current SNAP QC reviewers, and no additional information will be asked of respondents. As a result, FNS has agreed that the previous OMB approval applies to the current study.


The reviewer also notes in this section that the burden on State SNAP offices of extracting the extant administrative data is not reflected in the burden estimates. We were not previously aware that this needed to be included in the burden calculation, but we will incorporate it.


The reviewer asks how the response rate estimates were produced. The number of respondents was estimated by assuming a 40 percent response rate to the initial web survey request, a 40 percent response rate to the first follow-up phone call, and a 20 percent response rate to each subsequent contact. Rounding to whole persons at each step and dividing the estimated number of respondents by the total in the frame resulted in estimated response rates of 81, 82, and 83 percent for directors, supervisors, and reviewers, respectively.
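As a purely illustrative sketch of this calculation, the Python snippet below applies each contact’s response rate to the staff who have not yet responded, rounding to whole persons at each step. The number of subsequent contacts (three) and the frame size of 100 are assumptions made here for illustration only; the actual estimates used the real frame counts for each staff group, which, together with the rounding, presumably accounts for the small differences among the 81, 82, and 83 percent figures.

    # Illustrative sketch only; the number of follow-up contacts and the frame
    # size are assumptions, not figures from the study.
    def estimated_respondents(frame_size, contact_rates):
        """Apply each contact's response rate to the remaining nonrespondents,
        rounding to whole persons at each step, as described in the memo."""
        remaining = frame_size
        respondents = 0
        for rate in contact_rates:
            newly_responding = round(remaining * rate)
            respondents += newly_responding
            remaining -= newly_responding
        return respondents

    # 40% to the initial web request, 40% to the first follow-up call,
    # then 20% to each of three further contacts (contact count assumed).
    rates = [0.40, 0.40, 0.20, 0.20, 0.20]
    frame = 100  # hypothetical frame size
    count = estimated_respondents(frame, rates)
    print(count, count / frame)  # 82 respondents, 0.82 under these assumptions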


  3. Areas of Misunderstanding


We believe that some comments in the review suggest the reviewer may have misunderstood various aspects of the study, its purposes, and its methods, or may have been unfamiliar with the SNAP QC review process.


Much of the discussion on the first page focuses on questions about the random sample drawn for QC review. However, the current study does not draw, or even use, this sample; the random samples drawn by each State for QC review are part of the SNAP QC system and are not part of this research. The current study uses a mixed-methods approach that includes interviewing and surveying State and federal QC staff, analyzing extant administrative data on QC cases, and re-reviewing a small number of the most recently reviewed incomplete QC cases in 3 States (up to 25 cases in each State).


Second, the reviewer poses several questions about changes in completion rates over time and about the time frame of the study. This study focuses on the current time period and does not analyze the trend in completion rates dating back to 1985. That trend, namely a general decline over time with smaller increases in more recent years, was described in the justification section to provide context for the study and to highlight the need to understand what factors may contribute to incomplete QC reviews and how completion rates may be improved.


The goal of the study is to gain a better understanding of the SNAP QC review process and to identify strategies that may improve the completion of cases and thus overall completion rates. This contrasts with the study goals the reviewer describes at the bottom of page 2, which focus on measuring bias and the validity of estimates. As mentioned above, this study takes a mixed-methods approach to addressing the study goals, with a relatively stronger emphasis on qualitative data collection.


Finally, we believe the reviewer may have misunderstood some aspects of the data collection instruments, particularly the semi-structured interview protocols. One issue identified in the review is a concern that asking multiple questions at once in the semi-structured interviews would overtax respondents’ cognitive processes. These questions are not intended to be asked all at once. Rather, there is one primary question, and suggested follow-up questions are listed as probes for further information if it is not provided in the initial response. The interviews are designed to be semi-structured and flexible, and the interview protocols are intended to provide a general guide for the conversation rather than a verbatim script. This approach is widely used in qualitative data collection, and Insight has established expertise in this mode of data collection.


Further, to address many of the points raised by the reviewer about the study instruments, we would like to clarify that the data collection instruments were pre-tested with respondents in the study frame, namely State and federal QC staff who have a thorough understanding of the QC process. Insight analyzed the results of the pre-tests and revised the instruments based on these participants’ feedback. As a result, we believe the instruments are valid and the questions are worded appropriately to elicit the type of information sought.


We look forward to scheduling a conference call to discuss these points in greater detail, should that be necessary.

