Appendix D1
Memo on Pretest Results
The purpose of this memo is to summarize findings from pretests conducted for the various instruments that will be used for the WIC Nutrition Services and Administration (NSA) Cost Study. The pretests were conducted between May 7 and May 31, 2013, and involved both State and local WIC staff. State and local agencies participating in the pretest were matched with the instrument most appropriate for their agency type. The instruments included in the pretest were the State and local agency Web surveys; the combined Web survey for State-run local programs; the State and local WIC case study guides; and the SNAP/TANF case study guide.
Each pretest respondent was provided with either a copy of the instrument (case study guide) or a Web link for accessing the Web-based survey. Respondents were asked to review the questions both for comprehension and for their ability to answer them (using FFY 2012 data). In addition, they were encouraged to discuss how difficult or easy it was to obtain the information needed to answer the questions and to estimate the amount of time required to collect it. Information from the pretest was combined with input provided by the Peer Advisory Panel (PAP) to create the final instruments that will be submitted to FNS. The State and local agencies participating in the pretest include the following:
State and Local WIC Agencies Used for Pretest

State Web Survey: Kansas and Minnesota
Local Web Survey: Arizona, Kansas, and California
Combined Web Survey: South Dakota
State and Local Case Study Guides: Maryland and Tennessee
SNAP/TANF Case Study Guide: Maine
This memorandum is organized into five sections. The first section discusses the high-level findings from both the PAP input and the pretest, particularly those related to the ability of a State or local WIC agency to answer the Web survey questions, the format and flow of the instruments, and the overall organization of the case study guides. The second section discusses specific pretest input for the State and local agency Web surveys, while the third section discusses input on the "combined" Web survey. The fourth section discusses findings related to the case study guides for State and local WIC agencies, and the final section discusses the input provided on the SNAP/TANF case study guide.
Section I: High-Level Findings
Ability of local WIC agencies to complete the local agency Web-based surveys and answer all questions.
Overall, the PAP and the pretest respondents felt that the information being requested could be provided, with some notable exceptions. The greatest challenge for local agencies was providing the required budget detail by the four cost categories required for the 798-A report. The local agencies, and State agencies administering local programs, pointed out that there are multiple FNS-approved methods for calculating the distribution of costs across the four categories on the 798-A form. Many of the respondents reported that they use a point-in-time time study to calculate the percentage of all program personnel costs falling into each of the four categories, and then apply those percentages to their total labor expenditures for the year. For example, local agencies using this method would have staff complete time studies for a one-month period, determine the percentage of staff time devoted to each of the four categories, and then report the dollar value for each category using those percentages. This provides the State with an aggregate distribution of expenditures across the four categories that can be used for closeout reporting. Other local agencies use continuous time reporting, so that expenditures are tracked by the four categories all year. In a third method, a subset of local agencies conducts time studies (in the case reported, one third of all local agencies conduct time studies each year), and the resulting percentages are applied to the bottom-line expenditures of all local agencies.
Separate from this calculation of aggregate costs by the four categories, local agencies also provide their State agencies with detailed contract or grant closeout expenditures by budget category. These contract or grant expenditure reports list such details as the type of staff; the amount billed during the year; a detailed breakout of other direct costs itemized by budget type; and total indirect costs. The Web survey for local agencies asks for this level of expenditure detail but also asks respondents to further delineate the detailed information into the four 798-A categories. Since these detailed expenditures are not tracked by the local agency across the 798-A categories (they are rolled up through the time study allocation), local agencies are unable to distribute detailed individual staff or other costs across the four categories other than by applying the same percentages derived from the time study.
Since Altarum Institute (Altarum) will be collecting the 798-A backup data for local agencies from each of the State agencies, we will have the data necessary to calculate the percentage of all program costs in each of the four 798-A categories. While we still need to capture detailed expenditure information, it is unnecessary to ask the local agencies to provide the 798-A breakout of these detailed expenditures via the Web survey, since we will already have obtained the same information they would use to do so. Eliminating the four categories from the detailed expenditure tables in the Web survey therefore significantly reduces the burden on local agency staff. Altarum will calculate the distribution across the four 798-A categories for all funds expended in FFY 2013 and reported in the Web survey breakdown of program costs.
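For illustration, the minimal sketch below (in Python) shows the arithmetic of the point-in-time time-study method described above: time-study percentages are applied to a year's total labor expenditures to produce the aggregate distribution across the four 798-A categories. All category labels, percentages, and dollar amounts are hypothetical and do not reflect any agency's actual data or reporting system.

# Minimal sketch of the point-in-time time-study allocation described above.
# All figures and category labels are illustrative, not actual 798-A data.

# Share of staff time observed in each 798-A cost category during the
# one-month time study (shares must sum to 1.0).
time_study_shares = {
    "category_1": 0.30,
    "category_2": 0.40,
    "category_3": 0.20,
    "category_4": 0.10,
}

# Hypothetical total labor expenditures for one local agency for the FFY.
total_labor_expenditures = 500_000.00

# Apply the time-study percentages to the year's total labor expenditures
# to produce the aggregate dollar distribution reported at closeout.
allocated = {
    category: round(share * total_labor_expenditures, 2)
    for category, share in time_study_shares.items()
}

# The allocated amounts should reconstitute the total (within rounding).
assert abs(sum(allocated.values()) - total_labor_expenditures) < 0.01

for category, dollars in allocated.items():
    print(f"{category}: ${dollars:,.2f}")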
The organization of the case study guides needed improvement to create better flow and reduce burden.
The case study guides required more time to administer than we had originally anticipated. The major reasons for the additional time were: 1) the organization of the questions did not lend itself to an even flow of discussion, creating some confusion on the part of the respondents; 2) some of the information was available from other sources and could be obtained in advance of the interview, so some questions need not be asked during the interview but rather collected prior to the site visit; and 3) a lack of clarity in some questions resulted in our spending more time explaining the intent of the questions than discussing the answers provided.
These time and clarity problems have been resolved in three ways. First, based on feedback from the pretests, we will ask for some information in advance, such as organizational charts and the geographic locations of local agencies. Second, we were able to eliminate several questions that were redundant with questions asked in the State and local Web surveys. Third, to resolve the clarity and organizational problems, we will reorganize the case study guides by grouping questions into logical topics that make more sense to the respondents. In addition, the purpose of each question will be clearly stated prior to the discussion, so that less time is spent explaining what each question is looking for by way of response and the discussion stays focused on a particular topic. These changes will improve the flow, reduce burden, and cut the time needed to conduct the interview to about one hour.
Publicizing the study prior to FFY 2013 closeout, and helping agencies identify the needed data sources in advance, will prepare the State and local agencies to participate in the study.
All of the respondents made it very clear that, while most of the questions could be answered, it took a great deal of time on the part of both the State and the local agencies to figure out which data were needed and where to find them. This is because some of the questions would naturally be answered by accounting staff, while others would be answered by the program director or their staff. The respondents requested that the following be considered:
Initial publicity about this study should go to the State and local agencies around FFY 2013 closeout (October 2013), so that they are aware that information being collected for closeout will be needed for this study and can keep their closeout files and documentation well organized and available, making it easy to find the information needed for the study.
So that State and local agencies are well prepared to respond, Altarum should provide, prior to the study data collection period, examples of the type of information that will be needed and where State and local agencies might obtain it.

Altarum believes that these are reasonable requests that will improve the response rate and reduce burden. We therefore propose to distribute the first detailed publicity about this study to the States on or around October 1, 2013. This publicity will include the introduction letter and the study brochure, to help prepare the State and local agencies.
The SNAP/TANF case studies will require only a single case study guide but will likely involve multiple respondents.
The questions asked in the SNAP/TANF case study guide were reported to be straightforward and not difficult to answer. The pretest respondents indicated that much of the information was likely available and that the questions were easily understood. In addition, it was noted that the same questions that would be asked of the State office regarding the split between policy, MIS, benefit compliance, and service delivery functions would also be appropriate at the county level (in county-run programs), so two instruments were unnecessary.
The most significant problem with the SNAP/TANF case studies is the number of respondents needed to answer all the questions. SNAP and TANF programs are organized very differently from WIC programs. First, there is the overall method of organizing program administration: either a completely State-run program with districts or regions, or a county-run program in which the State acts as the policy and compliance oversight agency and all local services are administered by counties. Second, there is much greater separation of duties. For example, SNAP and TANF policy offices are separate from the entities responsible for local service delivery, with the staff responsible for each often located in different offices. Additionally, while some SNAP and TANF functions related to policy and special programs are run as separate programs, local services are more integrated (since most states have joint application and certification processes). The distinction between the programs at the local service delivery level is blurred, in that a person applying for services is often eligible for multiple programs and is served by a single caseworker who is knowledgeable about all programs and services provided.
Finally, budgeting and accounting activities for these programs may be handled by a completely separate organization that processes payments, conducts letter-of-credit drawdowns, completes financial reporting for federal agencies, and prepares cost allocation plans to divide budgets between programs.
To address this issue, it was recommended that, once the case study states are identified and prior to scheduling visits, Altarum contact the State-level person responsible for policy for the programs, provide them with copies of the case study guide, and discuss the purpose of the case studies and who will best respond to the different questions. An alternative to this approach is to ask the FNS Regional Offices or the Administration for Children and Families (ACF) Regional Office to help identify the best person to contact in each state. Once a State-level person is identified for initial contact, that person can then identify a point-of-contact (themselves or another person) who will be responsible for identifying and coordinating respondents. Then, a complete list of respondents can be developed and, with the help of the point-of-contact, individual interviews can be scheduled. The point-of-contact can help identify which sections of the case study guide will be asked of each person being interviewed. If a local county office is to be visited, the state point-of-contact can help identify a local point-of-contact to fill the same role. This approach will reduce burden and make the best use of everyone's time.
While some states may require only a few respondents, others may be diversified enough to require many. A preliminary list of the individuals who might be needed to complete the case study questions in the most diversified State agencies includes:
The persons responsible for SNAP and TANF policy development and communication;
The person responsible for all local certification and participant management;
The person responsible for the MIS, EBT, and management reporting systems;
The person responsible for fraud detection and control; and
The person responsible for adjunct programs, such as employment and training, child care, and transitional services.
To reduce the amount of time needed to conduct the case studies, some of the information initially proposed for collection will be eliminated as unnecessary for answering the research questions. In addition, much of the interview time was spent capturing features of the SNAP/TANF programs that can instead be obtained either through a checklist administered when the visit is scheduled or through an e-mail exchange with program administrators. We will therefore eliminate the questions related to program features and replace them with a pre-visit checklist. This will allow the interviewer to spend more time asking about the programs that are in place and narrow the scope of the interview prior to making the visit.
Section II: State Agency and Local Agency Web Surveys
Pretests of the State and local Web surveys were conducted May 7 through May 17, 2013, via telephone interviews. The following tables provide information for each section of the Web surveys, including the purpose of the pretest questions, the feedback obtained, and the nature of the changes made to the instruments.
1. State Agency Web Survey
Section 1: Time and Effort for Completion of State Agency Survey
Purpose: In this section we assessed the average amount of time and effort each respondent would spend gathering necessary information and entering it into the survey. We will use this information to inform revisions that reduce burden, and it will serve as a basis for the burden estimates in the OMB submission.
Feedback:
Changes:

Section 2: Survey Content
Purpose: In this section we assessed whether all survey sections are clear and understandable and whether agencies are able to provide all requested information. We specifically asked respondents to identify questions that were unclear; cost categories that were imprecise, inappropriate, or missing; and areas where they would be unable to provide the requested information. We will use this information to improve the questions and ensure that respondents will be able to provide the requested information.
Feedback:
Changes:

Section 3: Usability
Purpose: In this section we assessed how easily respondents could read the screens and navigate the survey. We will use this information to improve the usability of the survey.
Feedback:
Changes:

Section 4: Survey Help
Purpose: In this section we assessed the quality of the help available for the survey. We will use this information to improve the help we provide when the survey is fielded.
Feedback: No respondent needed to use the help desk.
Changes: None.
2. Local Agency Web Survey
Section 1: Time and Effort for Completion of Local Agency Survey
Purpose: In this section we assessed the average amount of time and effort each respondent would spend gathering necessary information and entering it into the survey. We will use this information to inform revisions that reduce burden, and it will serve as a basis for the burden estimates in the OMB submission.
Feedback:
Changes:

Section 2: Survey Content
Purpose: In this section we assessed whether all survey sections are clear and understandable and whether agencies are able to provide all requested information. We specifically asked respondents to identify questions that were unclear; cost categories that were imprecise, inappropriate, or missing; and areas where they would be unable to provide the requested information. We will use this information to improve the questions and ensure that respondents will be able to provide the requested information.
Feedback:
Changes:

Section 3: Usability
Purpose: In this section we assessed how easily respondents could read the screens and navigate the survey. We will use this information to improve the usability of the survey.
Feedback:
Changes:

Section 4: Survey Help
Purpose: In this section we assessed the quality of the help available for the survey. We will use this information to improve the help we provide when the survey is fielded.
Feedback:
Changes: None.
Section III: Combined Web Survey for State-run WIC Programs
A pretest of the combined Web survey was conducted by phone on May 31 with a representative of the state of South Dakota. Completing the combined Web survey took 45 minutes, with an estimated 2 hours needed to compile the information prior to starting the survey. Using feedback from the pretest, we made revisions to the combined Web survey, including adding and revising some questions. Since the combined survey includes components of both the State and local agency Web surveys, changes affecting those surveys were also applied to the combined survey. The tables below describe the purpose, feedback, and subsequent changes by section of the combined Web survey.
Section 1: Time and Effort for Completion of Combined Web Survey
Purpose: In this section we assessed the average amount of time and effort the respondent spent gathering necessary information and entering it into the survey. We will use this information to inform revisions that reduce burden, and it will serve as a basis for the burden estimates in the OMB submission.
Feedback:
Changes:

Section 2: Survey Content
Purpose: In this section we assessed whether all survey sections are clear and understandable and whether agencies are able to provide all requested information. We specifically asked respondents to identify questions that were unclear; cost categories that were imprecise, inappropriate, or missing; and areas where they would be unable to provide the requested information. We will use this information to improve the questions and ensure that respondents will be able to provide the requested information.
Feedback:
Changes:

Section 3: Usability
Purpose: In this section we assessed how easily respondents could read the screens and navigate the survey. We will use this information to improve the usability of the survey.
Feedback:
Changes:

Section 4: Survey Help
Purpose: In this section we assessed the quality of the help available for the survey. We will use this information to improve the help we provide when the survey is fielded.
Feedback:
Changes: None.
Section IV: WIC Case Study Guides
Pretests of the State and local case study guides were conducted by Altarum May 13 through May 16, 2013, in two states: Tennessee and Maryland. All of the pretests were conducted via phone interviews. The average duration of the State case study interview was 105 minutes, while the average duration of the local case study interview was 65 minutes.
Using feedback from the pretests, we made revisions to both the State and local guides, deleting, adding, and revising some questions. The tables below describe the purpose, feedback, and subsequent changes by section of the State and local case study guides.
WIC State Agency Case Study Guide
Introduction / Interview Procedures
Purpose: Following the introduction and interview procedures, we assessed how easy it was for respondents to understand the descriptions and procedures provided.
Feedback: The introduction and procedures were clear and straightforward, and the open-ended questions worked well.
Changes: None.

Section 1: Interview Content
Purpose: We assessed whether all interview sections and questions were clear and understandable and whether agencies were able to respond to the questions. We specifically asked respondents to identify questions that were unclear, inappropriate, or missing, as well as areas where they would be unable to provide the requested information. We will use this information to improve the questions and ensure that respondents will be able to provide the requested information.

State Agency Structure
Feedback:
Changes:

Budgeting and Cost Allocation Methods
Feedback:
Changes:

Factors Influencing Cost of WIC Program
Feedback:
Changes:

Sources of Funding and Factors Impacting Funding Levels
Feedback:
Changes:

Special Cost Factors Influencing Specific Program Areas
Feedback:
Changes:

Relationships with Other Programs and Their Impact on WIC Costs
Feedback:
Changes:
WIC Local Agency Case Study Guide
Introduction / Interview Procedures
Purpose: Following the introduction and interview procedures, we assessed how easy it was for respondents to understand the descriptions and procedures provided.
Feedback: None.
Changes: None.

Section 1: Interview Content
Purpose: We assessed whether all interview sections and questions were clear and understandable and whether agencies were able to respond to the questions. We specifically asked respondents to identify questions that were unclear, inappropriate, or missing, as well as areas where they would be unable to provide the requested information. We will use this information to improve the questions and ensure that respondents will be able to provide the requested information.

Organization of Local Agency
Feedback:
Changes:

Services Provided and Staffing
Feedback:
Changes:

Cost and Funding Factors
Feedback:
Changes:

Budgeting Policies
Feedback:
Changes:

Factors that have Influenced the Overall Costs of the WIC Program
Feedback:
Changes:

Relationships with Other Programs and Their Impact on WIC Costs
Feedback: None.
Changes:
Section V: SNAP/TANF Case Study Guides
The SNAP/TANF case study guides were pretested on May 31, 2013, with the SNAP Policy Director of the Office of Family Independence in the Maine Department of Human Services. The case study guides were sent in advance, and an in-person interview was conducted in the Altarum office in Portland, Maine. The case study interview took an estimated one hour and thirty minutes to complete.
Introduction / Interview Procedures
Purpose: Following the introduction and interview procedures, we assessed how easy it was for respondents to understand the descriptions and procedures provided.
Feedback: The introduction and procedures were clear and straightforward, and the open-ended questions worked well.
Changes: None.

Section 1: Interview Content
Purpose: We assessed whether all interview sections and questions were clear and understandable and whether agencies were able to respond to the questions. We specifically asked respondents to identify questions that were unclear, inappropriate, or missing, as well as areas where they would be unable to provide the requested information. We will use this information to improve the questions and ensure that respondents will be able to provide the requested information.

SNAP/TANF State Agency Organizational Structure and Staffing
Feedback:
Changes:

Sources of Funding
Feedback:
Changes:

Overall Budget and Administrative Costs
Feedback:
Changes:

Factors Influencing Program Costs
Feedback:
Changes:

Cost Allocation Methods
Feedback: None.
Changes: None.