WIC Nutrition Services and Administration Costs Study

OMB: 0584-0589

Appendix D1


Memo on Pretest Results





The purpose of this memo is to summarize findings from pretests of the instruments that will be used for the WIC Nutrition Services and Administration (NSA) Cost Study. The pretests were conducted between May 7 and May 31, 2013, and involved both State and local WIC staff. State and local agencies participating in the pretest were matched with the instrument most appropriate for their type of agency. The instruments included in the pretest were the State and local agency Web surveys; the combined Web survey for State-run local programs; the State and local WIC case study guides; and the SNAP/TANF case study guide.


Each of the pretest respondents was provided with either a copy of the instrument (case study guide) or a Web link through which they could access the Web-based survey. Respondents were asked to review the questions for both comprehension and their ability to answer them (using FFY 2012 data). In addition, they were encouraged to discuss how difficult or easy it would be to obtain the information needed to answer the questions and to estimate the amount of time required to collect it. Information from the pretest was combined with input provided by the Peer Advisory Panel (PAP) to create the final instruments that will be submitted to FNS. The list of State and local agencies participating in the pretest includes the following:


State and Local WIC Agencies Used for Pretest

  Instrument                             Pretest Agencies
  State Web Survey                       Kansas and Minnesota
  Local Web Survey                       Arizona, Kansas, and California
  Combined Web Survey                    South Dakota
  State and Local Case Study Guides      Maryland and Tennessee
  SNAP/TANF Case Study Guide             Maine


This memorandum is organized into five sections. The first section discusses the high-level findings from both the PAP input and the pretest, particularly those related to the ability of a State or local WIC agency to answer the Web survey questions, the format and flow of the instruments, and the overall organization of the case study guides. The second section discusses specific pretest input on the State and local agency Web surveys, while the third section discusses input on the “combined” Web survey. The fourth section discusses findings related to the case study guides for State and local WIC agencies, and the final section discusses the input provided on the SNAP/TANF case study guide.


Section I: High-Level Findings


  1. Ability of local WIC agencies to complete the local agency Web-based surveys and answer all questions.

Overall, the PAP and the pretest respondents felt that the requested information could be provided, with some notable exceptions. Local agencies faced the greatest challenge when trying to provide the required budget detail by the four cost categories of the 798-A report. The local agencies, and the State agencies administering local programs, pointed out that FNS approves multiple methods for calculating the distribution of costs across the four categories on the 798-A form. Many of the respondents reported that they use a point-in-time time study to calculate the percentage of all program personnel costs in each of the four categories and then apply those percentages to their total labor expenditures for the year. For example, local agencies using this method would require staff time studies for a one-month period, determine the percentage of staff time devoted to each of the four categories, and then report the dollar value for each category using those percentages. This provides the State with an aggregate distribution of expenditures across the four categories that can be used for closeout reporting. Other local agencies use continuous time reporting, so that expenditures are tracked by the four categories all year. Under a third method, a subset of local agencies (in this case, one third of all local agencies each year) conducts time studies, and the resulting percentages are applied to the bottom-line expenditures of all local agencies.
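
To make the arithmetic of the point-in-time method concrete, the sketch below converts a hypothetical one-month time study into category percentages and applies them to a year's labor expenditures. The category names are the four 798-A categories referenced later in this memo; all hours and dollar figures are invented for illustration.

    # A minimal sketch of the point-in-time time-study allocation method.
    # All hours and dollar amounts below are hypothetical.

    time_study_hours = {  # hours logged per 798-A category in a one-month study
        "Program Management": 420,
        "Client Services": 980,
        "Nutrition Education": 540,
        "Breastfeeding": 160,
    }
    total_labor_expenditures = 1_250_000  # total FFY labor costs, in dollars

    total_hours = sum(time_study_hours.values())

    # Convert the one-month study into category percentages, then apply
    # those percentages to the year's total labor expenditures.
    for category, hours in time_study_hours.items():
        share = hours / total_hours
        print(f"{category}: {share:.1%} -> ${total_labor_expenditures * share:,.2f}")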


Separate from this calculation of aggregate costs by the four categories, local agencies also provide their State agencies with detailed contract or grant closeout expenditures by budget category. These contract or grant expenditure reports list such details as type of staff; amount billed during the year; a detailed breakout of other direct costs itemized by budget type; and total indirect costs. The Web survey for local agencies asks for this level of expenditure detail but also asks respondents to further delineate the detailed information into the four 798-A categories. Since these detailed expenditures are not tracked by the local agency across the 798-A categories (they are rolled up through the time study allocation), local agencies are unable to distribute detailed individual staff or other costs across the four categories other than by applying the same percentages derived from the time study.


Since Altarum Institute (Altarum) will be collecting the 798-A backup data for local agencies from each of the State agencies, we will have the data necessary to calculate the percentage of all program costs by the four 798-A categories. While we still need to capture detailed expenditure information, it is unnecessary to ask the local agencies to provide the 798-A breakout of these detailed expenditures via the Web survey, since we will already have obtained the same information they would use to do so. As a result, eliminating the four categories from the detailed expenditure tables in the Web survey significantly reduces the burden on local agency staff. Altarum will calculate the distribution across the four 798-A categories for all funds expended in FFY 2013 and reported in the Web survey breakdown of program costs.
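
A minimal sketch of the calculation Altarum would perform in place of the survey breakout, assuming the 798-A backup data supply category totals for each local agency (the function names and figures below are hypothetical):

    # Hypothetical sketch: derive a local agency's 798-A distribution from
    # State-provided backup data, then apply it to a detailed expenditure
    # line reported in the Web survey.

    def distribution_from_798a(backup_totals):
        """Return each 798-A category's share of total reported costs."""
        total = sum(backup_totals.values())
        return {category: amount / total for category, amount in backup_totals.items()}

    def allocate(line_item_cost, distribution):
        """Spread one detailed expenditure line across the four categories."""
        return {category: line_item_cost * share for category, share in distribution.items()}

    # Invented backup totals for one local agency, in dollars:
    backup = {
        "Program Management": 250_000,
        "Client Services": 600_000,
        "Nutrition Education": 300_000,
        "Breastfeeding": 100_000,
    }

    distribution = distribution_from_798a(backup)
    print(allocate(48_000, distribution))  # e.g., one staff member's annual salary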


  2. Organization of the case study guides needed improvements to create a better flow and reduce burden.

The case study guides required more time to administer than we had originally anticipated. The major reasons for this were: 1) the organization of the questions did not lend itself to an even flow of discussion, creating some confusion on the part of respondents; 2) some of the information was available from other sources and could be obtained in advance of the interview, so it was determined that some questions need not be asked during the interview but rather collected prior to the site visit; and 3) a lack of clarity in some questions meant we spent more time explaining the intent of the questions than discussing the answers provided.


This time factor and lack of clarity have been addressed in three ways. First, drawing on the pretest feedback, we will ask for some information in advance, such as organizational charts and the geographic locations of local agencies. Second, we are able to eliminate several questions that were redundant with questions asked in the State and local Web surveys. Third, to resolve clarity and organizational problems, we will reorganize the case study guides by grouping questions into logical topics that make more sense to respondents. In addition, the purpose of each question will be clearly stated prior to the discussion, so that less time is spent explaining what each question is looking for in the way of a response and the discussion stays focused on a particular topic. These changes will improve the flow, reduce burden, and cut the time needed to conduct the interview to about one hour.


  3. Publicizing the study prior to FFY 2013 closeout, and identifying in advance the data sources needed for the study, will help prepare State and local agencies to participate in the study.

All of the respondents made very clear that, while most of the questions could be answered, it took a great deal of time on the part of both the State and the local agencies to figure out which data were needed and where to look for them. This is because some of the questions would naturally be answered by accounting staff while others would be answered by the program director or their staff. The respondents requested that the following be considered:


  • Initial publicity about this study should go to the State and local agencies around FFY 2013 closeout (October 2013) so that they are all aware that information being collected for closeout will be needed for this study, and so that they plan to keep their closeout files and documentation well organized and readily available, making the information needed for the study easy to find.


  • So that State and local agencies are well prepared to respond, Altarum should provide, prior to the study data collection period, examples of the type of information that will be needed and where State and local agencies might obtain it. Altarum believes these are reasonable requests that will improve the response rate and reduce burden. We therefore propose to distribute the first detailed publicity about this study to the States on or around October 1, 2013. This publicity will include the introduction letter and the study brochure, to help prepare the State and local agencies.




  4. The SNAP/TANF case studies will require only a single case study guide but will likely involve multiple respondents.

The questions being asked in the SNAP/TANF case study guides were reported as being straightforward and not difficult for respondents to answer. The pretest respondents indicated that much of the information was likely available and the questions were easily understood. In addition, it was noted that the same questions that would be asked of the State Office regarding the split between policy, MIS, benefit compliance and service delivery functions would also be appropriate to ask at the county level (in county-run programs), so two instruments were unnecessary.


The most significant problem with the SNAP/TANF case study guides is the number of respondents needed to answer all the questions. SNAP and TANF programs are organized very differently from WIC programs. First, there is the overall method by which program administration is organized: either a completely State-run program with districts or regions, or a county-run program in which the State acts as the policy and compliance oversight agency and all local services are administered by counties. Second, there is much greater separation of duties. For example, SNAP and TANF policy offices are separate from the entities responsible for local service delivery, with the staff responsible for each often located in different offices. Additionally, while some SNAP and TANF functions related to policy and special programs are run as separate programs, local services are more integrated (since most States have joint applications and certification processes). The distinction between the programs at the local service delivery level is blurred, in that a person applying for services is often eligible for multiple programs and is served by a single caseworker who is knowledgeable about all programs and services provided.


Finally, budgeting and accounting activities for these programs may be handled by a completely separate organization that processes payments, conducts letter of credit draw downs, completes financial reporting for federal agencies, and prepares cost allocation plans to divide up budgets between programs.


To address this issue, it was recommended that, once the case study States are identified and prior to scheduling visits, Altarum contact the State-level person responsible for policy for the programs, provide them with copies of the case study guide, and discuss the purpose of the case studies and who can best respond to the different questions. An alternative approach is to ask the FNS Regional Offices or the Administration for Children and Families (ACF) Regional Office to help identify the best person to contact in each State. Once a State-level person is identified for initial contact, that person can designate a point-of-contact (themselves or another person) who will be responsible for identifying and coordinating respondents. A complete list of respondents can then be developed and, with the help of the point-of-contact, individual interviews can be scheduled. The point-of-contact can help identify which sections of the case study guide will be asked of each person being interviewed. If a local county office is to be visited, the State point-of-contact can help identify a local point-of-contact to fill the same role. This approach will reduce burden and make the best use of everyone's time.


While some States may require only a few respondents, others may be diversified enough to require multiple respondents. A preliminary list of the individuals who might be needed to complete the case study questions in the most diversified State agencies includes:


  • The persons responsible for SNAP and TANF policy development and communication;


  • The person responsible for all local certification and participant management responsibilities;


  • The person responsible for the MIS, EBT, and management reporting systems;


  • The person responsible for fraud detection and control; and


  • The person responsible for adjunct programs, such as employment and training, child care, and transitional services.


To reduce the amount of time needed to conduct the case studies, some of the information initially proposed will be eliminated as unnecessary for answering the research questions. In addition, much of the interview time was spent capturing features of the SNAP/TANF programs that can instead be obtained through a checklist administered when the visit is scheduled or through an e-mail exchange with program administrators. Therefore, we will eliminate the questions related to program features and replace them with a pre-visit checklist. This will allow the interviewer to spend more time asking about the programs in place and to narrow the scope of the interview prior to making the visit.
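
To illustrate how such a pre-visit checklist could narrow the interview scope, the sketch below filters guide questions by the program features a State reports in advance. The feature names and question tags are hypothetical, not taken from the actual guide.

    # Hypothetical program features checklist for one case study State,
    # completed when the visit is scheduled or via e-mail exchange.
    features = {
        "county_run_program": True,
        "separate_policy_office": True,
        "employment_and_training_program": False,
    }

    # Each candidate interview question is tagged with the features it
    # depends on; wording and tags here are invented for the example.
    questions = [
        ("How are county service budgets set?", ["county_run_program"]),
        ("How does the policy office coordinate with local delivery?",
         ["separate_policy_office"]),
        ("How are employment and training costs allocated?",
         ["employment_and_training_program"]),
    ]

    # Keep only the questions whose required features are present.
    interview = [
        text for text, required in questions
        if all(features.get(f, False) for f in required)
    ]
    print(interview)  # the employment and training question is dropped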


Section II: State Agency and Local Agency Web Survey

Pretests of the State and local Web surveys were conducted May 7 through May 17, 2013, via telephone interviews. The following tables provide information for each section of the Web survey, including the purpose of the pretest questions, the feedback obtained, and the changes made to the instruments.


1. State Agency Web Survey


Section 1: Time and Effort for Completion of State Agency Survey

Purpose: In this section we assessed the average amount of time and effort each respondent would spend gathering necessary information and entering it into the survey. We will use this information to inform revisions that reduce burden, and it will serve as a basis for burden estimates in the OMB submission.


Feedback:

  • Entering data will be quick, but gathering the data will be time consuming, especially since respondents may not initially know where to look for the information.

Changes:

  • In the introductory material, we will provide guidance on the importance of maintaining FFY 2013 closeout information for study purposes, including a description of documentation and reports respondents will likely need in order to enter information.

Section 2: Survey Content

Purpose: In this section we assessed whether all survey sections are clear and understandable and whether agencies are able to provide all requested information. We specifically asked respondents to identify questions that were unclear, cost categories that were imprecise, inappropriate, or missing, and areas where they would be unable to provide the requested information. We will use this information to improve questions and ensure that respondents will be able to provide the requested information.


Feedback:

  • There are situations where staff members cross two or more functional categories listed, making it difficult to allocate salaries and FTEs.

  • It can be difficult to determine exact allocations across the 798-A cost categories, because they are not necessarily reported that way for contract or grant closeout.

  • In-kind contributions to WIC can be difficult for directors to determine.

  • Agencies cannot itemize indirect costs.

  • There are some questions where the wording can be improved.

Changes:

  • We will improve the help pop-up text for each staff category to help respondents determine the staff to be assigned to each category.

  • We will reword instructional text to make clear that respondents primarily need to provide their best estimate of allocation across cost categories.

  • We will improve instructional and help text to clarify the definition of in-kind contributions and make clear that respondents need to provide their best estimates of the in-kind contributions.

  • We will improve the wording of some questions, based on suggestions.


Section 3: Usability

Purpose: In this section we assessed how easy it was for respondents to read screens and navigate the survey. We will use this information to improve the usability of the survey.


Feedback:

  • The functions of the save and confirm buttons were not immediately clear.

  • It was inconvenient to have to scroll back to the top of the screen in order to navigate to the next screen after confirming data entries.

  • Question numbering was inconsistent.


Changes:

  • We will add help text explaining the use of the save and confirm buttons.

  • We will give additional consideration to having the confirm button take respondents to the next screen to improve ease of navigation between screens.

  • We will make question numbering consistent within each screen and ensure it works with the show/hide functionality.

Section 4: Survey Help

Purpose: In this section we assessed the quality of the help available for the survey. We will use this information to improve the help we provide when the survey is fielded.


Feedback: No respondent needed to use the help desk.


Changes: None.


2. Local Agency Web Survey


Section 1: Time and Effort for Completion of Local Agency Survey

Purpose: In this section we assessed the average amount of time and effort each respondent would spend gathering necessary information and entering it into the survey. We will use this information to inform revisions that reduce burden, and it will serve as a basis for burden estimates in the OMB submission.


Feedback:

  • Entering data will be quick, but gathering the data will be time consuming, especially since respondents may not initially know where to look for the information.

  • Respondents took longest to complete the indirect cost and in-kind screens.

Changes:

  • In the introductory material, we will provide guidance on the importance of maintaining FFY 2013 closeout information for study purposes, including a description of documentation and reports respondents will likely need in order to enter information.

  • We will highlight the link to the survey user’s manual so that respondents are more likely to examine that in advance. Respondents will be able to print out the manual and the survey, if necessary.

  • We will provide clearer instructions for screens, such as indirect costs and in-kind contributions, in which we are requesting the respondent's best estimate of some of the harder-to-report costs.

Section 2: Survey Content

Purpose: In this section we assessed whether all survey sections are clear and understandable and whether agencies are able to provide all requested information. We specifically asked respondents to identify questions that were unclear, cost categories that were imprecise, inappropriate, or missing, and areas where they would be unable to provide the requested information. We will use this information to improve questions and ensure that respondents will be able to provide the requested information.


Feedback:

  • Respondents did not clearly understand that we wanted information only on their NSA grant funds.

  • Local agencies frequently do not track costs by the 798-A categories.

  • Indirect rates are capped by the State, but county governments can make up the difference to create a higher effective indirect rate (illustrated in the sketch following the Changes list below).


Changes:

  • We will improve instructions to make clear that we are examining NSA grant funds and clearly define what that means.

  • We will eliminate fields that ask respondents to allocate costs by 798-A categories.

  • We will add a question on the indirect screen asking whether the county makes up additional indirect costs and, if so, what that rate is.
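
A hypothetical illustration of the effective-rate arithmetic behind this feedback, using invented rates and costs:

    # Invented figures: the State caps the indirect rate charged to the
    # WIC grant, and the county government covers the difference.
    state_capped_rate = 0.10    # rate allowed against the NSA grant
    county_makeup_rate = 0.05   # additional rate absorbed by the county
    direct_costs = 400_000      # direct program costs, in dollars

    effective_rate = state_capped_rate + county_makeup_rate
    print(f"Effective indirect rate: {effective_rate:.0%}")                   # 15%
    print(f"Charged to grant: ${direct_costs * state_capped_rate:,.0f}")      # $40,000
    print(f"County contribution: ${direct_costs * county_makeup_rate:,.0f}")  # $20,000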

Section 3: Usability

Purpose: In this section we assessed how easy it was for respondents to read screens and navigate the survey. We will use this information to improve the usability of the survey.


Feedback:

  • The functions of the save and confirm buttons were not immediately clear.


Changes:

  • We will add help text explaining the use of the save and confirm buttons.

Section 4: Survey Help

Purpose: In this section we assessed the quality of the help available for the survey. We will use this information to improve the help we provide when the survey is fielded.


Feedback:

  • Respondents felt that the help pop-ups within the survey were very helpful.


Changes: None.


Section III: Combined Web Survey for State-run WIC Programs

A pretest of the combined Web survey was conducted on May 31, 2013, by phone with a representative of the State of South Dakota. Completing the combined Web survey took 45 minutes, with an estimated 2 hours spent compiling the information needed prior to starting the survey. Using feedback from the pretest, we made revisions to the combined Web survey, including adding and revising some questions. Since the combined survey includes components of both the State and local agency Web surveys, changes affecting those surveys were also applied to the combined Web survey. The tables below describe the purpose, feedback, and subsequent changes by section to the combined Web survey.


Section 1: Time and Effort for Completion of Combined Web Survey

Purpose: In this section we assessed the average amount of time and effort the respondent spent gathering necessary information and entering it into the survey. We will use this information to inform revisions that reduce burden, and it will serve as a basis for burden estimates in the OMB submission.


Feedback:

  • Entering data will be quick, but gathering the data will be time consuming, especially since respondents may not initially know where to look for the information.

  • The respondent took longest to complete the indirect cost and in-kind screens.

Changes:

  • In the introductory material, we will provide guidance on the importance of maintaining FFY 2013 closeout information for study purposes, including a description of documentation and reports respondents will likely need in order to enter information.

  • We will highlight the link to the survey user's manual so that respondents are more likely to examine it in advance. We will provide clearer instructions for screens, such as indirect costs and in-kind contributions, in which we are requesting the respondent's best estimate of some of the harder-to-report costs.


Section 2: Survey Content

Purpose: In this section we assessed whether all survey sections are clear and understandable and whether agencies are able to provide all requested information. We specifically asked respondents to identify questions that were unclear, cost categories that were imprecise, inappropriate, or missing, and areas where they would be unable to provide the requested information. We will use this information to improve questions and ensure that respondents will be able to provide the requested information.


Feedback:

  • The State survey questions were, for the most part, easy to understand. The questions related to breastfeeding rates and their impact on costs were unclear. Introductions to some State survey questions needed clarification.

  • For the State Agency Costing Tool, State-level costs across the 798-A categories can be reported, but detailed expenditures for local-level costs cannot, other than by using the same information that was used to create the 798-A distributions. Travel should be split out from materials and services. Indirect costs that help support WIC program activities can be provided in a checklist, but their values cannot be determined.

  • The local survey questions were easy to understand and complete.

  • The optional question was not in the survey.

Changes:

  • On the Program Demographics screen, we modified the infant formula rebate question to clarify the response choices related to infant breastfeeding rates and their impact on the number of cans of infant formula purchased.

  • On the Changes in Costs screen, we modified the instructions to make clearer what information about cost drivers was being collected.

  • On the Changes in Costs screen, we modified the wording of the question related to implementation of EBT and the impact on costs and issuance of food instruments.

  • Across all the screens in the costing tool, we modified the tables asking for data across categories to reflect the four 798-A categories: Program Management, Client Services, Nutrition Education, and Breastfeeding.

  • On the Labor/Personnel screen, we removed the 798-A cost categories from all tables with the exception of the State Functions table.

  • On the Materials, Services, and Travel screen, we modified the instructions for the question related to methods used to distribute shared costs, requesting that the respondent “check all that apply.”

  • On the Indirect Costs screen, we modified the question related to the types of costs included in indirect costs to include a checklist of features for which WIC receives benefits, and we removed the total adjusted indirect costs fields.

  • On the In-Kind Contributions screen, we removed the distinction between labor and non-labor costs and added a checklist. Also, we added a question on whether the agency is able to estimate the total dollar amount of in-kind contributions.

  • We added an optional screen with an open-ended question for describing agency cost reduction strategies.


Section 3: Usability

Purpose: In this section we assessed how easy it was for respondents to read screens and navigate the survey. We will use this information to improve the usability of the survey.


Feedback:

  • The functions of the save and confirm buttons were not immediately clear.


Changes:

  • We added help text explaining the use of the save and confirm buttons.

Section 4: Survey Help

Purpose: In this section we assessed the quality of the help available for the survey. We will use this information to improve the help we provide when the survey is fielded.


Feedback:

  • The respondent felt that the help pop-ups within the survey were very helpful.


Changes: None.



Section IV: WIC Case Study Guides

Pretests of the State and local case study guides were conducted by Altarum May 13 through May 16, 2013, in two states: Tennessee and Maryland. All of the pretests were conducted via phone interviews. The average duration of the State case study interview was 105 minutes, while the average duration of the local case study interview was 65 minutes.


Using feedback from the pretests, we made revisions to both the State and local guides, including deleting some questions, adding some questions, and revising some questions. The tables below describe the purpose, feedback, and subsequent changes by section to the state and local case study guides.



  1. WIC State Agency Case Study Guide


Introduction / Interview Procedures

Purpose: Following the introduction and interview procedures, we assessed how easy it was for respondents to understand the descriptions and procedures provided.


Feedback: The introduction and procedures were clear and straightforward and the open-ended questions worked well.


Changes: None.

Section 1: Interview Content

Purpose: We assessed whether all interview sections and questions were clear and understandable and whether agencies are able to respond to the questions. We specifically asked respondents to identify questions that were unclear, inappropriate, or missing, as well as areas where they would be unable to provide the requested information. We will use this information to improve questions and ensure that respondents will be able to provide the requested information.

State Agency Structure

Feedback:

  • Unclear what the term “core services” meant and how to answer Question 1 regarding organizational structure by functional area or service delivery units.

  • Interviewees needed probing for Question 4 regarding expenditures associated with contracted services.

Changes:

  • Combined and clarified Questions 1-2 on organizational structure and shared staffing from other programs.

  • Dropped Question 4 related to contracted services.

Budgeting and Cost Allocation Methods

Feedback:

  • The interviewee took a long time to consider Question 6 on local agencies providing direct services before answering.

  • Question 10 related to the current federal funding formula required clarification.

  • Unsure how to respond regarding how county programs are receiving in-kind contributions in Question 15, which asks how in-kind contributions are negotiated and accounted for.

  • Unsure how to respond regarding overhead percentage in Question 16 which asks whether the State takes any overhead percentage off the top of the WIC grant.

  • Need to research indirect cost rate further before being able to answer Question 17 which asks if there is a policy limiting indirect cost charges by the sponsoring organization of local agencies.

Changes:

  • Expanded Question 5 on cost drivers to include functions and examples.

  • Restructured Question 6 on local agencies providing direct services so that it would be appropriate to the type of organization, while focusing on decisions of how service sites are selected.

  • Combined Question 7 related to the line-item budget from local agencies with Question 6 on local agencies providing direct services.

  • Regrouped Question 9 on budgeting separately for local services with other questions on structure/type of agency; deleted Question 10 related to the current federal funding formula.

  • Reworded Question 12 on access to special infant formula.

  • Added examples of in-kind contributions to Question 15.

  • Consolidated Question 16 since overhead percentage was asked in the web survey.

  • Reworded and deleted parts of Question 17 on policies limiting indirect cost charges.

Factors Influencing Cost of WIC Program

Feedback:

  • Question numbering from Section 2 to Section 3 is off.

  • Unsure if breastfeeding peer counseling was included in Question 22 on changes to funding of core functions of State program operations; suggest breaking out Question 22 into two separate questions about increases/decreases to funding.

  • Subparts of Question 24 on spending NSA funds were not indented.

  • There was a long pause before answering Question 26 on the 798-A report and it was suggested that breastfeeding be separated out into another question.

Changes:

  • Reworded Question 22 on the core functions to flow better.

  • Reworded Question 23 on moving funds between local agencies to discuss state budgets before local budgets.

  • Indented subparts of Question 24.

  • Deleted the subpart of Question 26 that is asked on the Web survey; deleted the subpart of Question 27 that asked about prioritizing funds when grant levels increase.

Sources of Funding and Factors Impacting Funding Levels

Feedback:

  • Unsure how to answer Question 28 on developing budgets for core functions.

  • Respondents had a difficult time answering Question 29 on who has input on funding amount for program areas.

  • Question 30 on changes to total funding available seems out of place.

  • Respondents were unsure how to answer Question 31 on economic benefits related to agency size.

Changes:

  • Combined and condensed Questions 28-29 on developing budgets.

  • Reworded Question 30 to discuss state/local operations before non-NSA funds; deleted Question 31.

Special Cost Factors Influencing Specific Program Areas

Feedback:

  • Unclear how to answer Question 34 on estimating the benefit from rebate monies; suggest providing a format for calculating the answer to Question 34.

Changes:

  • Reworded Question 34; reworded Question 35 to address whether the rebate rate per can of infant formula changed and whether increases in breastfeeding rates impacted the amount of the rebate.

Relationships with Other Programs and Their Impact on WIC Costs

Feedback:

  • Respondents were unsure how to answer Question 37 on funding for public health services that impacts WIC services; there was a long pause before answering Question 38 related to funding for core public health services and its impact on local agencies' ability to provide services.

Changes:

  • Reworded Question 37; incorporated Question 38 into Question 37.

  • Deleted Question 39 on funding reduction incidents in health departments.

  • Reworded Question 40 on increases in outside public health funding to include examples.


  2. WIC Local Agency Case Study Guide



Introduction / Interview Procedures

Purpose: Following the introduction and interview procedures, we assessed how easy it was for respondents to understand the descriptions and procedures provided.


Feedback: None.


Changes: None.

Section 1: Interview Content

Purpose: We assessed whether all interview sections and questions were clear and understandable and whether agencies are able to respond to the questions. We specifically asked respondents to identify questions that were unclear, inappropriate, or missing, as well as areas where they would be unable to provide the requested information. We will use this information to improve questions and ensure that respondents will be able to provide the requested information.

Organization of Local Agency

Feedback:

  • Suggest adding the word “local” to Question 1 on how the WIC program fits into the overall organizational structure. Respondents had trouble answering Question 3 as written, on the administrative structure within the agency that serves a governing function, because the question is wordy and the director holds multiple roles that are managerial rather than governing.

  • Some local agencies cannot prepare their own participation reports for Question 5, which asks about the number of WIC participants served in each of the clinics each month.

Changes:

  • Deleted Question 2 on the organizational chart, since a copy will be requested in advance; reworded Question 3 to ask if the local agency has an administrative office.

  • Deleted Question 5 on the number of participants served.

Services Provided and Staffing

Feedback:

  • Question 6 on core functions conducted by the agency is confusing; it seems as though an exact list is wanted.

  • Question 7 on services other than WIC that are provided by the agency is confusing.

Changes:

  • Deleted Question 6; consolidated and revised Question 7.

  • Replaced Question 8 on the staff vacancy rate with a question asking how many positions are vacant and in what classifications.

  • Moved Question 9 on support services to the section on indirect costs.

Cost and Funding Factors

Feedback:

  • Question 11 on other funding sources that support WIC was misunderstood initially and would benefit from having examples listed.

  • Need a follow-up to Question 13 about changes in the no-show rate over the last year.

Changes:

  • Changed name of subsection to Impact of Participation on Program Costs and created a new subsection entitled Shared Resources; moved Questions 10-11 to the new Shared Resources subsection; reworded and provided examples in Question 11.

  • Reworded Question 12 on participation changes; reworded Question 13 on changes to the no-show rate.

  • Added examples to Questions 14-15 on activities conducted and the amount of time necessary to contact no-shows.

  • Deleted Question 16 on the cost impact of no-shows.


Budgeting Policies

Feedback:

  • Not sure what is being asked in Question 17 on methods used to develop the budget for the WIC program.

  • Not sure what is being asked in Question 18 on the process for submitting the budget.

  • Not all local agency directors may know the indirect rates needed for Question 20, which asks what percentage of the total WIC budget is used for indirect costs.

Changes:

  • Combined Questions 17-18 related to the budget.

  • Reworded Question 19 about changes to the budget; deleted Question 20.

  • Reorganized Question 21 on state-run local agencies by differentiating between state-run and local budgeting.

  • Deleted Question 22 on factors impacting budget amounts for breastfeeding and nutrition services.

Factors that have Influenced the Overall Costs of the WIC Program

Feedback:

  • Need to better define core services in Questions 30-31 related to the core functions that influence the overall cost of the program.

  • Suggest asking about impact of MIS instead of a change in the MIS system in Question 33.

  • Unsure if asking about whole state or just WIC in Question 37 on changes in State-level policy impacting program costs.

  • In Question 40, need to ask more details, define vendor monitoring, and include calculation factor for determining FTEs; consider adding question about language lines in Question 41 on extra funding to support hiring bilingual staff.

Changes:

  • Combined Questions 30-31 and reworded the question to ask about services delivered instead of core functions, and provided examples of key factors that would influence costs.

  • Reworded Question 33.

  • Reworded Question 34 on WIC functions that are underfunded.

  • Added examples to Question 40; added more probes to Question 41.

Relationships with Other Programs and Their Impact on WIC Costs

Feedback: None.


Changes:

  • Reworded Question 42 on coordinating services with agencies outside of sponsoring agencies and moved it to the new Shared Resources subsection in Section 1.

  • Eliminated Section 4.


Section V: SNAP/TANF Case Study Guides

The SNAP/TANF case study guides were pretested on May 31, 2013, with the SNAP Policy Director of the Office of Family Independence in the Maine Department of Human Services. The case study guides were sent in advance, and an in-person interview was conducted in the Altarum office in Portland, Maine. The estimated time to complete the case study interview was one hour and thirty minutes.


Introduction / Interview Procedures

Purpose: Following the introduction and interview procedures, we assessed how easy it was for respondents to understand the descriptions and procedures provided.


Feedback: The introduction and procedures were clear and straightforward and the open-ended questions worked well.


Changes: None.

Section 1: Interview Content

Purpose: We assessed whether all interview sections and questions were clear and understandable and whether agencies are able to respond to the questions. We specifically asked respondents to identify questions that were unclear, inappropriate or missing as well as areas where they would be unable to provide the requested information. We will use this information to improve questions and guarantee that respondents will be able to provide the requested information.

SNAP/TANF State Agency Organizational Structure and Staffing

Feedback:

  • Obtain the organizational chart in advance. Identify the best person to answer for both programs. Eliminate Question 3 related to the joint application process, since all programs use this approach.


Changes:

  • Dropped the question related to the organizational chart and will obtain this in advance.

  • Dropped Question 3 about the joint application process.

  • Added a question related to how budgets are developed and who prepares them.

Sources of Funding

Feedback:

  • Need to obtain information on all sources of SNAP/TANF funding from the budget office.


Changes:

  • We will conduct a pre-interview screening to determine the best respondent. Total burden will not change; the persons responding may simply differ from the person responsible for SNAP/TANF policy or operations.

Overall Budget and Administrative Costs

Feedback:

  • Need to determine whether expenditure reporting is done in the various categories provided related to program features/services; the categories may need to be modified or customized to State program features, such as for States with county-run programs as compared to State-run programs.


Changes:

  • Changed the budget categories slightly to reflect current program features. We will customize them prior to each visit through a program features checklist.

Factors Influencing Program Costs

Feedback:

  • Eliminate Question 13 related to actions taken by SNAP/TANF agencies and their resultant impact on administrative costs, and create a checklist of features compiled prior to the visit. Restructure questions to reflect the checklist topics.


Changes:

  • Eliminated Question 13.

  • Will create a program features checklist prior to conducting each case study and will customize questions relative to the checklist.

Cost Allocation Methods

Feedback: None.


Changes: None.





