

U.S. Department of Agriculture

Food and Nutrition Service




Supplemental Nutrition Assistance Program (SNAP)



Request for Applications (RFA)

Fiscal Year: 2011



Assessment of Alternatives to Face-to-Face Interviews in SNAP




CFDA #: 10.588

OMB CONTROL #: 0584-0512

EXPIRATION DATE: 09/30/2012

PUBLIC BURDEN STATEMENT:


An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a valid OMB control number. The OMB control number for this project is 0584-0512. Public reporting burden for this collection of information is estimated to be 10 hours per response. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to: U.S. Department of Agriculture, Food and Nutrition Service, Office of Research and Analysis, 3101 Park Center Drive, Room 1014, Alexandria, VA 22302, ATTN: Rosemarie Downer.



Notice of Intent to Submit Application Due: August 9, 2011

Application Due: August 31, 2011

Awards Announced and Start Date: September 30, 2011





INTRODUCTION


The Food and Nutrition Service (FNS) is sponsoring a study to examine the impact of eliminating the required client interview for SNAP benefit certification and recertification. These grants will be awarded under authority provided by the Food and Nutrition Act of 2008 (as amended through P.L. 111-296), Section 17(a)(1), 7 U.S.C. 2026(a)(1). Many states have waivers that allow them to replace the face-to-face client interview with alternative means of screening SNAP applicants for benefits. These alternatives include telephone interviews, postponed face-to-face interviews, and no interview at all. Little research has been conducted to quantify the impacts of replacing the face-to-face interview with these alternatives.


FNS will provide grants through cooperative agreements to three states to participate in an important study that is designed to compare the effects of not conducting interviews at certification or recertification with the states’ current interviewing practices on program operations and client outcomes. Each state will identify a pilot site and a comparison site. The pilot site will eliminate the interview at certification and recertification for all SNAP clients. The comparison site will administer the program as is typically done in the state—this may be face-to-face interviews, telephone interviews, same-day interviews, or some combination.


The states will work with an FNS evaluation contractor, Mathematica Policy Research, which will evaluate whether, and to what extent, the absence of client interviews affects participation, efficiency, client access, payment accuracy, administrative costs, and staff and client satisfaction. The results of this study will inform FNS as it develops guidance to help states further improve program operating efficiency and access.


FNS invites state agencies1 to submit applications to participate in the study by developing a pilot site to test the effectiveness of a no-interview model. The performance period for this grant is 28 months, October 2011 through January 2014, with pilots operating between September 2012 and November 2013. FNS will provide grants of up to $150,000 to each state for participation in the study. The grants are meant to offset the time and cost of developing and implementing the pilots and collecting the data needed for the evaluation.


Each state should discuss in detail how it will design, implement, and administer the no-interview model. The application must demonstrate the capacity to design and conduct the pilot, show that the state is able to provide all of the data necessary for the evaluation, and show that the state will fully cooperate with FNS and the evaluator.


States interested in applying to participate in this study should submit a Notice of Intent to Submit an Application by August 9, 2011. States that do not submit a Notice of Intent may still submit an application. Complete applications are due no later than August 31, 2011. FNS expects to award the grants to selected state agencies by September 30, 2011.



BACKGROUND INFORMATION


States have recently implemented bold reforms that change the way clients enroll in SNAP. A central feature of the reforms is the waiver of the mandatory face-to-face interview, which allows states to process SNAP applications and recertifications more efficiently. For states that are restructuring their case intake and management procedures through modernization, the waiver is a key tool for reorganizing local offices, centralizing case-processing functions, and potentially reducing administrative costs.


Currently, FNS allows 40 states to waive the face-to-face interview for clients at initial certification and recertification; another 7 states waive the face-to-face interview at recertification only. All 47 of these states have replaced the face-to-face interview with a telephone interview. FNS also has granted waivers to four states to postpone the expedited service interview. Expedited service applications can therefore be processed quickly and efficiently without compromising the states’ ability to meet standards on the timely processing of applications. Also, FNS has granted waivers to three states to eliminate the interview requirement altogether for households composed entirely of elderly or disabled individuals with no earned income. Finally, 20 states run Combined Application Projects (CAP) demonstrations, which allow them to automatically enroll certain SSI recipients into SNAP using relaxed requirements—including waiving interviews.


As more states replace the face-to-face interview with other interview methods or eliminate the interview for some clients, important questions arise for FNS. States request a waiver in large part because it allows them to process applications more efficiently, which reduces administrative costs and allows them to distribute benefits to clients more quickly. At the very least, some clients prefer the alternative screening processes because they can apply for benefits without visiting a local office.2 Yet the face-to-face interview has traditionally served several purposes. It is designed to ensure that the most accurate household information is collected from clients in the application and recertification processes. Moreover, clients can receive enhanced assistance during the face-to-face interview, and caseworkers can help clients navigate the application. This helps ensure that clients receive the maximum deductions to which they are entitled and are referred to other programs that can serve their needs.


Little research has been conducted to determine whether eliminating the interview is associated with increased administrative efficiency, decreased accuracy in the benefit calculation, or reduced customer service for clients. As FNS makes policy decisions, it has an interest in better understanding how eliminating the interview might affect these outcomes. Therefore, this study will examine how eliminating the interview affects SNAP administration and client outcomes.

ELIGIBLE APPLICANTS


All state agencies are welcome to apply. In county-administered states, if a particular county is interested in participating, it must have the support of the state agency, and the application must be submitted by the state agency.



PILOT DESIGN


Selected states are expected to identify one no-interview pilot site (note that the pilot site can include multiple offices, metro areas, counties, or substate regions) and one comparison site that administers the "business as usual" model, that is, the interview policies that are typical for the majority of the state. States may implement the no-interview waiver in up to half of the state; however, because of the number of SNAP applicants and participants that would be affected, large states may not be permitted to implement it in half of the state. The two sites should have similar characteristics and be similar in size. FNS will select three states to participate in the study. The selection criteria will focus on the state's capacity to successfully implement the pilot and meet evaluation requirements. The pilots must be designed to test the feasibility and effectiveness of the no-interview model.


No interview pilot site: States will waive the interview for all clients at the pilot site at both certification and recertification. However, any client who specifically asks for a face-to-face interview must receive one. Currently there are no waivers approved that waive interviews for all types of clients. However, three states have waivers to exempt certain types of households—elderly or disabled with no earnings—from recertification interviews. Under these waivers, no interview is required for benefit determination, but if the application is to be denied, the state must attempt to schedule an interview before denial.


Note that FNS is not prescribing how states should design and implement the no-interview model; however, there are several considerations (discussed below in the Application Instructions and Evaluation Criteria sections) that states must address in their design plans. FNS will assess the designs on a case-by-case basis to determine whether the model complies with regulations and ensures that states can verify client information and deliver benefits without an interview. States must balance the need to collect all of the information necessary for benefit determination against the goal of minimizing client contact.


Business as usual comparison site: States are required to designate a comparison site for the evaluation of the no-interview pilot. The business as usual model should represent the interview method that is typical for most clients across the state. This model may include face-to-face interviews, telephone interviews, or some combination. Absent a waiver, the face-to-face interview is the default requirement for conducting interviews in each state. The majority of states do, however, have a waiver of the face-to-face interview at either initial certification or recertification; these waivers allow states to conduct interviews by telephone. Many, but not all, of these waivers are statewide. States may also implement some combination of face-to-face and telephone interviews. States should explain their business as usual model in detail in the application.


States are encouraged to first identify sites that are similar (see “Criteria for Selecting Pilot Sites” below) and then assign the no interview model randomly to the sites, thereby helping to ensure that site selection is not based on a site’s expected outcome.


Alternative Pilot Design. FNS will also consider a random assignment approach for states with systems that support a more rigorous evaluation. Under this design, the state would not identify pilot and comparison sites, but would instead randomly assign clients to the interview and no-interview groups at intake and recertification. The total number of clients randomly assigned to the no-interview condition shall not exceed 50 percent of the state's current caseload; in practice, the number is likely to be well below that threshold, and the actual number to be assigned shall be determined in consultation with FNS. FNS prefers that at least one participating state be able to conduct such a design. To support this design, the state must have, at a minimum (an illustrative sketch of one possible assignment routine follows the list below):

  • a unified client intake system for the entire state (or for a region with a large portion of the caseload)

  • an intake system that is capable (with some additional programming) of assigning clients to the interview and no-interview groups randomly at intake and recertification

  • the ability to track in both their eligibility system and participant case records which clients were not required to have a face-to-face interview

  • an existing business model or organizational structure in place such that a large decline in the number of client interviews across the state will not cause major staffing changes or disruption within local offices3
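
As noted above, the following is a minimal, illustrative sketch of the kind of assignment routine the second bullet describes. It is written in Python only to make the logic concrete; the function, field, and group names are hypothetical, and an actual implementation would be built into the state's own intake or eligibility system and finalized in consultation with FNS and the evaluator.

    import random

    # Hypothetical sketch: randomly assign each new or recertifying case to the
    # "interview" or "no_interview" group, while keeping the share of cases in
    # the no-interview group at or below a fixed cap (for example, 50 percent).

    def assign_case(case_id, counts, assignment_rate=0.5, cap_share=0.5, rng=random):
        """Assign one case to a group and update the running group counts."""
        total_after = counts["interview"] + counts["no_interview"] + 1
        under_cap = (counts["no_interview"] + 1) / total_after <= cap_share
        group = "no_interview" if under_cap and rng.random() < assignment_rate else "interview"
        counts[group] += 1
        # In practice, the group flag would also be written to the eligibility
        # system and the participant case record so the evaluator can identify
        # which cases were not required to have an interview.
        return {"case_id": case_id, "group": group}

    if __name__ == "__main__":
        rng = random.Random(2011)  # fixed seed so the example is reproducible
        counts = {"interview": 0, "no_interview": 0}
        for i in range(10):
            result = assign_case("CASE-%04d" % i, counts, rng=rng)
            print(result["case_id"], result["group"])
        print("Share assigned to no-interview:", counts["no_interview"] / sum(counts.values()))

This sketch only illustrates the assignment and tracking logic; the actual assignment rate, caseload cap, and record-keeping mechanism would depend on the state's systems and on the evaluation design agreed upon with FNS.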


States interested in applying for the alternate pilot design must address these criteria in their application. They may apply for one of the designs or both, but states that wish to be considered for both designs must clearly identify each design and describe their approach to each in detail.


FNS recognizes that most states’ intake procedures cannot support the alternative design. For example, a state that does not use a centralized intake system for all clients would not be able to randomly assign clients statewide as they apply. These states are strongly encouraged to submit an application for the pilot/comparison site design only. States that submit applications for the alternative pilot design but do not meet the criteria described above will not be automatically considered for the pilot/comparison site design, unless they apply for both.


Criteria for Selecting the Pilot and Comparison Sites

As states select their pilot and comparison sites, they must choose sites that are as similar as possible across various social, economic, and policy characteristics. If states do not identify the specific pilot and comparison sites in their applications, they must agree to and demonstrate the ability to comply with the criteria listed below when choosing the sites. The criteria that should be consistent across sites include:


  • Characteristics of the population. Several population characteristics should be similar across sites, including the percentage of working families, elderly, and children, as well as the percentage living below the poverty level.

  • Population density. The sites must have similar population densities; for instance, an urban area may not be compared to a rural area.

  • Population size. Regardless of how the state defines a site, the number of offices, counties, or regions included should be comparable between the pilot and comparison sites. They also should serve a similar number of clients. For instance, states will not be allowed to include all but one county in the pilot and use the remaining county as the comparison. The pilot and comparison sites must be balanced.

  • SNAP participation trends. Sites should demonstrate similar patterns of caseload growth in the period leading up to the pilot.

  • SNAP advocacy and outreach. The level of SNAP and nutrition-related advocacy and outreach in the community should be comparable across sites.

  • Level of modernization. If some modernization efforts (including policies, administrative organization, technology, and partnering) are not statewide, the state must ensure that the pilot and comparison sites have the same modernization efforts in place and that no new modernization efforts are likely during the study period.

  • Economic indicators. The sites should have similar unemployment rates and similar business and industry presence. The states should avoid selecting areas with extensive plant closings or departures of large industries (unless consistent across all sites).

  • Waivers and demonstrations. If the state has implemented any interview-related waivers (for example, waiving the face-to-face interview at application and/or recertification, delaying the interview for expedited cases, waiving the requirement to schedule interviews), they must apply equally to both the pilot and comparison sites to allow for random selection of the sites. In addition, if the state is currently involved in SNAP demonstration projects (e.g., CAPs, the Extra Help study, the Elderly Working Poor study, the Summer Food Benefit for Children study), they should consider how inclusion of these demonstration project sites in either the pilot or comparison areas would impact the study. FNS’ preference is that the states do not include sites in multiple demonstrations, but if the state makes a compelling argument for including a site in both, it will be considered.



STATES’ ROLES AND RESPONSIBILITIES IN THE STUDY


The state agency will be responsible for planning, implementing, and operating the pilot as agreed upon by FNS and the state agency. The state also will be responsible for all aspects of the pilot, including training staff, developing new forms and documentation, and purchasing equipment, if needed.


The design and planning period for the pilot is 10 months, from October 2011 through July 2012. The pilot implementation period will begin in September 2012 and operate through November 2013. States may propose slightly different dates, but FNS requires states to operate their pilots for 15 months, and the operating timeframe must generally overlap with the specified dates. Any deviation from these dates should be justified in the application.


The state is responsible for all coordination among entities involved in the pilot. They should also identify a contact person at the state agency and pilot and comparison sites. These individuals will work closely with FNS and the evaluators throughout the project.


The state will need to cooperate with the evaluator and comply with the requirements of the evaluation. These include participating in the orientation meeting and site visits, providing extant data, and collecting program administration and QC data from the sites. States will also be required to apply for appropriate waivers and submit progress reports to FNS.

Orientation Meeting

Selected states will send representatives to a one-day orientation meeting in Alexandria, VA, in October 2011. This meeting will provide additional information to participating states regarding the study and the roles and responsibilities of FNS, Mathematica, and state staff, and will provide an opportunity for the evaluation team to begin learning more about the pilots in each state.


Site Visits

Each state will host two site visits, one in winter 2012 and the other in summer 2013. The initial visit will collect data during the operational phase of the pilots and the second will collect data at the end of the pilots. The primary goal of the visits is to collect information on the planning and implementation process for the pilot, the operation in practice, and the challenges and successes associated with the pilot. State staff will be asked to assist the evaluation team in planning and scheduling the visits. The site visits will include discussions with state staff and visits to the pilot and comparison sites.


During the second round of site visits, the FNS evaluation contractor will also conduct focus groups with clients who received procedural denials to learn about their experience with the no-interview model. The focus groups will help determine whether eliminating the interview has contributed to clients' failure to complete the application process. One focus group will be conducted at a pilot site and one at a comparison site. To identify potential participants for the focus groups, the state agencies will provide the evaluator with administrative data on SNAP applicants who submitted an application during the previous three months but were denied benefits because they failed to complete the application or recertification process.

Extant Data Collections

In addition to the primary data collected during site visits, states will be asked to provide extant data, including administrative and other state-specific data. Data extracts obtained from each state will provide valuable information for the study, including trends in SNAP participation overall and by subgroup, participation in other benefit programs, and program access as measured by the number of applications submitted and the result of the application.


Administrative Data

States will provide monthly case record extracts for the period beginning two years before the pilot implementation date to October 2013. The content of these records varies by state but typically includes the ZIP code and county of the client's residence; the number of members in the SNAP unit; birth dates, gender, race/ethnicity, and disability status of each unit member; the date the case was opened and last recertified; the length of the current certification period; the unit's total gross income, net income, and earned income, and the total income from Social Security and Supplemental Security Income; the amount of each deduction (such as medical and shelter); the unit's total countable assets; whether the case received expedited service; and possibly how and where the client applied for benefits.


States should provide the data to the evaluator in two batches: one in September 2012 and the other in November 2013. Accommodations will be made for states that find it easier to deliver data on an ongoing, monthly basis.


The evaluator will also conduct short client satisfaction surveys with a few hundred SNAP recipients in the pilot and comparison sites. The surveys will take place approximately seven months after implementation of the pilot, and clients will receive $10 for completing the telephone survey. About six months after the pilot begins, the state will provide administrative data (including names and contact information) on households certified or recertified within the prior two months. The evaluator will draw a sample from these data and conduct the surveys over a one-month period.


The evaluation contractor, Mathematica, has a long history of working with confidential data and using the highest standards to secure the data. These standards will be applied to this study as well. Mathematica will protect the confidentiality of all information collected for the evaluation and will use it for research purposes only. No information that identifies any study participant will be released. Computer data files are protected with passwords and access is limited to specific users. Sensitive data are maintained on removable storage devices that are kept physically secure when not in use. Further, personally identifiable data will not be entered into the analysis file, and data records will contain a numeric identifier only. All data will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. In addition, the evaluator will sign a data use agreement at the state's request.

Other State Data

States will be asked to provide additional data they may collect on a range of performance-related measures. This includes monthly reports on the timeliness of application approvals by local offices; monthly reports on approval/denial rates by local office; and quarterly State Administrative Expense (SAE) reports submitted to FNS. If available, states will also provide call center usage data, data on electronic application usage, and data on community partner usage.


In addition to these performance-related measures, the evaluator may collect other relevant materials on state modernization efforts or client satisfaction surveys, if available.


Program Administration Data Collection

States will provide the evaluator with three types of data related to program administration. These data will be used to determine the time and costs associated with the no-interview model and business as usual: (1) office-wide performance data from local office managers will provide a complete and accurate view of the key activities of all staff within each site; (2) time-use interviews with staff will provide insight into the daily activities of individual SNAP staff members and the context in which they perform those activities; and (3) administrative cost data will provide details about the costs associated with the no-interview model.


Office-Wide Performance Data

Using a tool developed by the evaluator, local office managers in the pilot and comparison sites will track and submit monthly statistics on key performance indicators. The monthly statistics will include data on the number of applications received, the number of interviews scheduled in each site, the number of interviews completed in each site, the number of interviews completed using a procedure other than the site’s pilot model (for example, the number of face-to-face interviews requested in the no-interview site), the number of staff dedicated to specific tasks (for example, interviewing, verifying information, determining eligibility, and so on), the average number of days needed to process applications, and the number of procedural denials.


Time Use Interviews with SNAP Staff

Local offices will participate in time use interviews. The evaluator will conduct 60-minute telephone interviews with three to four SNAP staff members at each site. The interviews will (1) help confirm the accuracy of the performance data provided by local office managers, (2) provide details about the activities performed over the course of one day and the amount of time spent on each activity, and (3) provide much-needed context for the time and effort SNAP staff spend on face-to-face interviews, telephone interviews, and all other SNAP-related activities.


Administrative Cost Data

States will provide monthly administrative cost data tied to the operations of the pilot and comparison sites. The data should include salaries and benefits of SNAP staff; the cost of any additional computers, software, or telephone lines associated with performing face-to-face or telephone interviews; and any other costs associated with the pilots.


QC Reviews

For the evaluation, each state shall be responsible for conducting QC-like reviews of households in the pilot site to determine payment accuracy. States must sample 225 to 300 households in each of the pre- and post-implementation rounds, for a total of 450 to 600 reviews.4 The sample will be taken solely from the no-interview pilot site and shall be in addition to any cases sampled for the official SNAP QC reviews required by FNS. The QC-like reviews shall be conducted using the same procedures as the official QC reviews; however, the reviews will focus on the active case sample and not the negative cases. Each round of QC-like reviews should be conducted over a two- to three-month period. The pre-implementation round of reviews will begin by May 2012 and the post-implementation round will begin by August 2013.


Waivers Required

States participating in this study must apply for an administrative waiver to implement the no-interview model. If any additional waivers are required, the state should specify them in its application.


The administrative waiver request to implement the no-interview model must address the following:


  • The waiver applies at certification and recertification;

  • The State agency must grant an interview (telephone or face-to-face) if one is requested by the household or its authorized representative, or if the State agency deems it necessary due to outstanding issues or questions about the application;

  • The State agency must ensure that the necessary administrative staff and technological functionality are in place to implement this waiver correctly and to meet all reporting requirements;

  • No application for a household in the targeted group will be denied without an attempt to schedule an interview; and

  • The State agency will comply with all requirements of the evaluation contractor, Mathematica Policy Research, which will evaluate whether, and to what extent, the absence of client interviews affects participation, efficiency, client access, payment accuracy, administrative costs, and staff and client satisfaction.


Required Recordkeeping and Reporting Requirements

Quarterly progress and financial reports, and a final report, must be submitted to FNS. As outlined in 2 CFR Part 225, Cost Principles for State, Local, and Indian Tribal Governments (OMB Circular A-87), quarterly progress reports must provide a description of the activities conducted during the reporting period, major accomplishments with completion dates and budget, deviations from the proposed plan, difficulties encountered, solutions developed to overcome difficulties, and major planned activities for the next quarter. These reports are due 30 days after the end of each calendar quarter. The final progress report will summarize progress over the course of the entire grant period and is due 90 days after the end of the project.

COMPENSATION FOR STUDY PARTICIPATION


State agencies will receive up to $150,000 in grant funds from FNS to offset the costs associated with participation. If other entities, such as local SNAP offices, are involved with this effort, each state will be responsible for distributing the funds to the appropriate entities. The grant is to be used for the costs associated with developing, implementing, and operating the pilots. The grants will also cover any costs associated with participating in the evaluation, including any data collection required. This includes transferring data to the evaluator and participating in multiple in-person and telephone interviews.



THE EVALUATOR’S ROLE AND RESPONSIBILITIES


Mathematica Policy Research, an FNS contractor, will conduct the evaluation for this study. Mathematica will be responsible for coordinating data collection with the states and providing guidelines and formats for specific types of data collection. The evaluator will review design plans and talk with state staff to help ensure that states implement their pilots in a way that minimizes the effects of external factors on the outcomes examined in this study.


The evaluator will produce two reports on the findings from the pilots, an interim and a final report. The interim report will provide a thorough description of the interview methods and implementation experiences of the study sites. The final report will build on the interim report, providing a description of the implementation and outcomes of the pilots in the three study states. It will also include implementation experiences and responses of stakeholders. The report will consist of both within- and cross-state analyses.




STUDY TIMELINE


Table 1 provides a timeline of key study activities. Alternative dates will be considered if the state agency explains the rationale for the suggested changes and the changes do not impact the overall evaluation of the pilots.


Table 1. Study Timeline

State Orientation Meeting: October 2011

First QC-like reviews: May 2012

States launch pilots: September 2012

State submits first batch of administrative data: September 2012

First round of site visits: November 2012-January 2013

Client survey: February 2013

Caseworker time-use interviews: March 2013

Second round of site visits plus focus groups: June-August 2013

Second QC-like reviews: August 2013

Interim report (final draft): October 2013

Pilots end: November 2013

State submits second batch of administrative data: November 2013

Final report (final draft) by evaluator: August 2014




CONFIDENTIALITY


All data provided to the evaluator by states in support of the study or by individuals participating in interviews and focus groups will be kept private and will be used only for research purposes, except as may be required by law or regulation. Although the states will be identified in study reports, data related to SNAP recipients or staff will only be publicly reported in aggregate, and no individuals will be identified in study reports. Data files containing any information on individual SNAP participants will be encrypted during transmission between states and Mathematica, and any identifying information on individuals will be replaced with randomly generated anonymous identifiers prior to analysis. Access to individual-level data will be restricted to the study team and researchers directly authorized by FNS. Data files containing identifying information will be destroyed after completion of the project.



APPLICATION INSTRUCTIONS


State agencies that intend to submit an application to conduct a pilot should submit a notice of intent by August 9, 2011. This notice does not obligate the state agency to submit an application but provides FNS with useful information in preparing for the review and selection process. The notice should include the name and address of the state agency, as well as the name, telephone number, and e-mail address of the primary contact for the application. State agencies may send this information by mail or e-mail to the grants officer, Carla Garcia, at:


Carla Garcia

Grant Officer, Grants and Fiscal Policy Branch

U.S. Department of Agriculture, FNS

3101 Park Center Drive Room #732

Alexandria, VA 22302

E-mail: [email protected]


State agencies that do not submit a notice of intent to apply may still submit an application by the due date.


State agencies will have an opportunity to ask clarifying questions about this RFA. Questions should be submitted in writing to the grants officer at [email protected]. All questions should be received by July 26, 2011. FNS will compile these questions and post the responses on the FNS website (http://www.fns.usda.gov/ora/) by August 2, 2011. FNS will not identify the individual or agency that submitted each question.



APPLICATION DUE DATE


The complete application package must be uploaded to www.grants.gov on or before 11:59 p.m. Eastern Daylight Time (EDT) on August 31, 2011. Applications received after the deadline date and/or time will be deemed ineligible and will not be reviewed or considered. FNS WILL NOT consider any additions or revisions to applications once they are received. FNS will not accept mailed or hand-delivered applications.



SUBMISSION OF APPLICATION


Applications should be submitted electronically through www.grants.gov.

FNS will not accept mailed or hand-delivered applications.


The government-wide website www.grants.gov is designed for electronic submission of applications. When submitting the application electronically, we advise that you allow ample time to familiarize yourself with the system's requirements. You will need both a Data Universal Numbering System (DUNS) number and a Central Contractor Registration (CCR) to access the system. Additionally, you will need to register on Grants.gov prior to submitting your application through the system. Registering and obtaining these credentials will take several days to complete.


Applicants must send an email to Carla Garcia at [email protected] stating that the application was submitted through the grants.gov portal. This e-mail must be received no later than 11:59 p.m. Eastern Daylight Time on the application due date, which is August 31, 2011. Please be aware that the grants.gov system provides several confirmation notices; you need to be sure that you have confirmation that the application was accepted.



APPLICATION FORMAT AND REQUIREMENTS


Application Format— All applicants must adhere to the following application format. Use of this format will make it easier for grant reviewers to locate the requested information and to evaluate your application.


Required Standard Forms:


All applicants must complete the following forms:


The following Grants.gov forms, located at http://www.grants.gov/agencies/aforms_repository_information.jsp, are required of grant applicants:


  1. SF-424 (R&R)

  2. Assurance for Non-Construction Programs (SF-424B)

  3. R&R FedNonFed Budget [Provide a budget for each funding year requested, along with a budget summary]

  4. R&R Sub-award Budget Attachment(s) Form [If applicable, attach the proposed sub-award budget as a PDF document for each funding year, with a budget summary]

  5. Project/Performance Site Location(s)

  6. Research & Related Senior/Key Personnel [Attach the résumé or curriculum vitae of key personnel]

  7. Research and Related Senior/Key Person Profile (Expanded) [Not required; use when needed]

  8. HHS Checklist (08-2007) [E.O. 12372, only applicable to participating states]



The following OMB form is required, which is located at: http://www.whitehouse.gov/sites/default/files/omb/grants/sflllin.pdf:

SF LLL (Disclosure of Lobbying Activities): Indicate on the form whether your organization intends to conduct lobbying activities. If your organization does not intend to lobby, write “Not Applicable.”



USDA Grant Certification Forms: The following USDA forms are located at http://www.ocio.usda.gov/forms/ocio_forms.html:


  1. AD-1047 Certification Regarding Debarment, Suspension, and Other Responsibility Matters;

  2. AD-1048 Certification Regarding Debarment, Suspension, Ineligibility and Voluntary Exclusion-Lower Tier Covered Transaction (Must submit with application only if a Sole Source Contractor is identified); and

  3. AD-1049 Certification Regarding Drug-Free Workplace Requirements.



USE OF GRANT FUNDS


All costs must be allowable, allocable, necessary, and reasonable in accordance with 2 CFR Part 225, where appropriate. Allowable uses of funds include, but are not limited to, personnel costs; office and research supplies; travel for data collection; and technology (both hardware and software) necessary for operating the Cooperative Agreement.



USDA ADMINISTRATIVE REQUIREMENT


7 CFR Part 3016, Uniform Administrative Requirements for Grants and Cooperative Agreements to State and Local Governments, establishes the uniform administrative rules for Federal grants and cooperative agreements and subawards to State, local, and Indian tribal governments.


The grant program will be awarded and administered in accordance with applicable Federal and program regulations. These include but are not limited to:


  • 7 CFR Part 3015: Uniform Federal Assistance Requirements implementing OMB directives (OMB Circular A-87, Cost Principles for State, Local and Indian Tribal Governments);

  • 7 CFR Part 3016: Uniform Federal Assistance Requirements for Grants and Cooperative Agreements to State and Local Governments;

  • 7 CFR Part 3017: Government-wide Debarment and Suspension (Non-procurement);

  • 7 CFR Part 3021: Government-wide Requirements for Drug-Free Workplace (Financial Assistance);

  • 7 CFR Part 3018: Restrictions on Lobbying;

  • 7 CFR Part 3052: (OMB Circular A-133) Audits of States, Local Governments, and Non-Profit Organizations;

  • 7 CFR Part 15: Nondiscrimination (Civil Rights); and

  • 2 CFR Part 25: Universal Identifier and Central Contractor Registration.


2 CFR Part 25 – Universal Identifier and Central Contractor Registration

Effective October 1, 2010, all grant applicants must obtain a Dun and Bradstreet (D&B) Data Universal Numbering System (DUNS) number as a universal identifier for Federal financial assistance applicants, as well as active grant recipients and their direct subrecipients of a subgrant award. To request a DUNS number, visit http://fedgov.dnb.com/webform.


The grant recipient must register its DUNS number in the Central Contractor Registration (CCR), the repository for standard information about applicants and recipients, and the registration must be maintained in the CCR throughout the performance period of the grant award. To register a DUNS number and/or maintain a CCR registration, visit www.ccr.gov. OMB requires that a grant recipient's DUNS number registration in the CCR be current in order to access the federal prime grant recipient reporting website (usaspending.gov).


FNS may not make an award to an entity until the entity has complied with the requirements described in 2 CFR 25.200 to provide a valid DUNS number and maintain an active CCR registration with current information.



2 CFR Part 170—Reporting Subaward and Executive Compensation


The Federal Funding Accountability and Transparency Act (FFATA) of 2006 (Pub. L. 109–282), as amended by section 6202 of Public Law 110–252 and hereafter referred to as "the Transparency Act," establishes requirements for recipients' reporting of information on subawards and executive total compensation.


Prime grant recipients awarded a new Federal grant greater than or equal to $25,000 as of October 1, 2010, are subject to FFATA subaward reporting. The prime recipient is required to file a FFATA subaward report by the end of the month following the month in which the prime recipient awards any sub-grant greater than $25,000. The subaward reporting data must be entered into the Federal Subaward Reporting System (FSRS), available at www.fsrs.gov. Specific OMB award terms and conditions will be included in all grant awards.



APPLICATION TEMPLATE


FNS strongly encourages interested state agencies to adhere to the following application format. Applications should be typed on 8½ by 11 inch white paper with at least 1 inch margins on the top and bottom. All pages should be single-spaced, in 12 point font. The application should be no more than 25 pages, not including the cover sheet, table of contents, resumes, appendices, and required forms. All pages must be numbered.



Cover Sheet

The cover page should include, at a minimum:


  • The name of the state agency and mailing address

  • The primary contact’s name, job title, mailing address, phone number, and email address

  • The name of the grant

  • The period of performance


Table of Contents

  1. Application Summary (1 page limit)

The summary should highlight pilot goals and design, and provide a general timeline for the planning, implementation, operations, and completion of the pilot.



  2. Pilot Design and Implementation (20 page limit)

The technical section of the Application should:

  • Outline the goals and objectives of the pilot

  • Describe the design of the no-interview pilot and how it will be implemented and operated, specifically describing the process for staff and clients. Detail any potential issues related to implementing or operating the pilot.

  • Specify the size of the pilot and comparison site areas (i.e., offices, counties, regions, etc.). Describe the locations for the pilot and comparison sites, including community characteristics, SNAP participant characteristics, and local agency characteristics. Indicate for which factors the sites will be comparable and for which they will not, using the pilot site selection criteria outlined in the RFA. Detail any SNAP corrective action plans or outstanding issues at any of the sites. If the sites are not yet selected, describe in detail how the state plans to choose the sites and any potential issues with the options or methodology.

  • Indicate which interview-related waivers are in place throughout the state, and specify if there is inconsistency in the pilot and comparison areas. Indicate if any of the potential pilot or comparison areas are currently involved in other FNS demonstration projects.

  • Provide detailed information on the pilot’s timeline (design, implementation, operation, and conclusion of the pilot) and deliverables.

  • Describe plans for notifying clients in the pilot site about the pilot.

  • Describe any new organizational structures, staffing, policies, procedures, software (or changes), hardware, or forms needed to implement the pilot.

  • Indicate any FNS waiver requests needed and when those will be submitted for approval.

  • Explain the training plans for local agency staff, partners, and any other group involved in the pilot.

  • Describe how the state and local agencies will work with the evaluator to meet the needs of the evaluation.

  • Describe any changes to the intake or eligibility process or computer systems that are anticipated during the pilot period and their potential impact on pilot operations.



Alternative Design: If applying for the alternative design, in addition to the items above, discuss the capacity to conduct random assignment. Describe the client intake system, the size of the pilot, the capacity and process for assigning clients to the interview and no-interview groups, any system or structural modification needed to conduct random assignment, and the impact on the current business model. All other criteria should be discussed in the context of this design. [Note that states interested in being considered for both the pilot/comparison and the alternative pilot designs should fully address all criteria in the application for each design and note their preference. Only one of the pilot designs will be awarded to each state.]


The application should also specifically address the following questions.

  • How would implementing the no-interview model change the overall processing of cases?

  • How will the state handle subgroups in the pilot area that are receiving a different interview method under a current waiver?

  • How will the state handle walk-in clients at the no-interview site?

  • If a client asks for a face-to-face interview at a no-interview site, how will that be tracked for data purposes?

  • Will the no-interview model change the way expedited cases are processed?

  • How will the state implement the pilot to reduce the potential of increased error rates and erroneous denials?

  • What is the level of modernization in place in the state and how does that affect the pilot site? Indicate if there will be additional changes in the pilot or comparison areas or the state during the study.

  • What is the state's capacity for conducting QC-like reviews, and is the state able to provide the data required by the evaluator? How many reviews can the state conduct, and over what time period? Explain any deviation from the suggested number of reviews or the timeline.


  3. Staffing and Management of the Grant (4 page limit)

The application should identify all persons who are responsible for managing, developing, and administering the pilot. Please include their current position, time commitment, roles and duties in the project, and relevant experience. Also, describe any vacant positions, hiring plans, and anticipated hiring dates related to this grant. Discuss contingency plans for key personnel leaving the project or for any disruption in the implementation plan. Resumes of key personnel should be included in an appendix, along with position descriptions for vacant positions.


In particular, states should identify the primary contact at the state and at each of the local pilot and comparison sites. Their contact information should be provided. If the pilot sites have not been selected, the contact information should be provided after the selection is complete.


The state should discuss the chain of command and identify who will prepare progress reports. Also discuss the communication process between the state agency and pilot site, as well as how data will be transferred.



EVALUATION OF GRANT APPLICATION CRITERIA


FNS will prescreen all applications to ensure that they contain the required documents and information. If an application does not include all appropriate information, FNS will consider the application to be non-responsive and will eliminate it from further evaluation.


Following the initial screening process, FNS will assemble a panel to review and determine the technical merits of each application based on how it addresses the required application components. The panel will recommend the grant applicants whose applications best demonstrate the capacity to implement and operate the pilots, including those applications that address all of the criteria completely, explain in detail how the pilot will be designed and implemented, and show the capacity and willingness to cooperate fully with the data collection and evaluation. In addition, applications that best detail a process for collecting the necessary information from clients to verify cases with minimal contact under the no-interview model will merit higher scores. Additional points will also be given to states that are not applying for the alternate design but have the capacity to conduct more than the minimum suggested number of QC-like reviews at the pilot site.


The selection official will consider the panel's recommendations. The selection official may also consider other FNS priorities, such as geographic, demographic, or socioeconomic diversity, or the size of the QC sample, in addition to the scores assigned by the technical review panel. FNS reserves the option to select one or more lower-rated applications in order to achieve a diversity of projects and regional representation.




Grant Scoring

Up to 100 points will be awarded to each application. Points will be awarded as follows:


Pilot Design and Implementation (70 points)

  • Application demonstrates a thorough understanding of the goals and objectives of this RFA.

  • Application explains the need for the pilot project by describing how the project will address the needs of the target population.

  • Application clearly describes the design of the pilot and how it will be implemented and operated, with particular attention to the potential issues associated with implementing the no-interview model.

  • Application contains clear and realistic timeframes to plan, implement, and operate the pilot. Defines start and end dates as well as clearly identifies milestones. Provides assurances that the pilot will launch on schedule and operate for the duration of the pilot period.

  • Application demonstrates that state agency has the technical capacity to complete the project.

  • Application shows that the proposed pilot and comparison sites are comparable as defined by the RFA. If the sites are not selected at the time of the application, the applicant must demonstrate its ability to identify comparable sites in the state.

  • Application clearly expresses the state’s intention to cooperate with FNS and the evaluator. Demonstrates an understanding of the evaluation and reporting requirements. Demonstrates that the state can collect all of the required data.

  • Application identifies all likely waiver requests and when the requests will be submitted to FNS.


QC-Like Reviews (20 points)

  • Application demonstrates capacity to conduct the minimum number of QC-like reviews during the specified time periods.

  • Application demonstrates the capacity to conduct additional QC-like reviews and specifies the exact number of reviews and the time period required for collection.


Staffing and Management of the Grant (10 points)

  • Application clearly and thoroughly defines the roles and duties of all key positions that bear substantial responsibility for managing, developing, or administering the pilot. Provides the current position of key personnel, their prospective title during the pilot, and the percentage of their time allotted to the project.

  • Application demonstrates that key personnel have the necessary education, experience, and skills for their designated project role. Includes supporting documentation, which may include the résumé or curriculum vitae of key personnel and position descriptions for all key positions.

  • Application identifies personnel responsible for working with the evaluator.

  • Application explains how project administrators and key personnel will address challenges throughout the course of the project.

  • Application articulates how the state agency will provide the necessary oversight to ensure high quality products, services, or outcomes to keep the project on schedule. Contains a contingency plan for unforeseen obstacles.

  • Application provides a plan for managing all personnel associated with the pilot and for addressing contingencies, such as the loss of key personnel. Contains a distinct chain of command.



CHECKLIST FOR THE APPLICATION PACKAGE

All proposals submitted under this RFA must contain the applicable elements described in this announcement and must be submitted electronically through www.grants.gov by 11:59 p.m. EDT on August 31, 2011. The following checklist has been prepared to assist in ensuring that the proposal is complete and in the proper order prior to submission.



Tips for Proposal Writers


  1. Read the RFA carefully, more than once.

  2. Follow the Application Template.

  3. Use the RFA Evaluation of Grant Application Criteria to structure your proposal correctly.

  4. Make sure budget figures are consistent between the budget form and narratives.

  5. Don’t leave out mandatory grant application forms or supporting documents, such as resumes and budgets.

  6. Don’t assume that reviewers know anything about your organization or its work.

  7. Have one or more persons who were not involved in writing your proposal read it and give suggestions for possibly improving it.

  8. Ensure that the required information is accurate and complete.

  9. Use the correct Catalog of Federal Domestic Assistance (CFDA) number (CFDA # 10.588).




Application Package and Budget Narrative Checklists

This checklist will assist you in completing the application and making certain you include all of the necessary information to be considered for a participation grant. Please review the checklist to ensure the items below are addressed clearly. Your project description should relate directly to the priorities of the request. This checklist will also assist you in completing the budget narrative portion of the application. NOTE: The statement of work must capture the bona fide need. The budget and budget narrative must be in line with the project description. FNS reserves the right to request information not clearly addressed.


APPLICATION PACKAGE CHECKLIST (indicate Yes or No for each item)

Cover Letter



Does the cover letter specifically address the 2011 priorities in the request?



Does the cover letter include clear statements as to the priorities being addressed?



Proposal



Does the proposal respond to the presentation criteria?



Does the proposal include all of the required components?



Mandatory Forms



  • Research & Related Family (R&R Family) forms, as identified under the Application Format and Requirements section of the RFA, can be found at the following website:

http://www07.grants.gov/agencies/aforms_repository_information.jsp



Certifications



  • The Anti-Lobbying Certification Form can be found at the following website: http://www.whitehouse.gov/omb/grants/sflllin.pdf

  • If the entity or applicant does not conduct lobbying activities, please indicate “not applicable” on the form.




  • Applicants chosen for award will be required to attest that they are not suspended or debarred and subsequently will also be required to verify that all subawardees and contractors are not suspended or debarred as well.



  • (Optional) Survey on Ensuring Equal Opportunity for Applicants

  • This survey is a tool to allow the Federal government to better understand the population of applicants for Federal funds. The survey is voluntary and seeks input from nonprofit private organizations (not including private universities).



Correct Format



Is the original application package on 8½ x 11 inch white paper, single-sided and unstapled?



Is the type size at least 12 point and margins set to one inch on all sides?



Is the application 25 pages or fewer, not including required forms, certifications, and attachments?


NOTE: The required application forms, certifications, and attachments will not be counted toward the 25-page limit.



Personnel



Did you include all key employees paid for by this grant under this heading?



Are employees of the applicant’s organization identified by name and position title?



Did you reflect the current yearly salary and the percentage of time to be devoted to the project?



Fringe Benefits



Did you include your organization’s fringe benefit amount along with the basis for the computation?



Did you list the type of fringe benefits to be covered with Federal funds?



Travel



Are travel expenses itemized? For example, origination/destination points, the number and purpose of trips, the number of staff traveling, the mode of transportation, and the cost of each trip.



Are the Attendee Objectives and travel justifications included in the narrative?



Is the basis for the lodging estimates identified in the budget?



Equipment



Is the need for the equipment justified in the narrative?



Are the types of equipment, unit costs, and the number of items to be purchased listed in the budget?



Is the basis for the cost per item or other basis of computation stated in the budget?



Supplies



Are the types of supplies, unit costs, and the number of items to be purchased reflected in the budget?



Is the basis for the costs per item or other basis of computation stated?



Contractual (FNS reserves the right to request information on all contractual awards and costs after the award of contract.)



Has the bona fide need been clearly identified in the project description to justify the cost for a contract or sub-grant expense(s) shown in the budget?



Has a justification for all sole-source contracts been provided in the budget narrative, prior to approving this identified cost?



Cost Allocation



If programs other than SNAP benefit from this project, are costs allocated to demonstrate that the grant funds only SNAP’s share?



Other



Consultant Services: Has the bona fide need been clearly identified in the project description to justify the cost shown on the budget? The following information must be provided in the justification: description of service, the consultant’s name, and itemized list of all direct costs and fees. The cost of salaries and wages must have the number of personnel including the position title (specialty and specialized qualifications as appropriate to costs), number of estimated hours times hourly wages, and all expenses and fees directly related to the proposed services to be rendered to the project.



For all other line items listed under the “Other” heading, list all items to be covered under this heading along with the methodology on how the applicant derived the costs to be charged to the program.



Indirect Costs



Is the amount requested based upon a rate approved by a Federal Agency? If yes, is a copy of the negotiated rate agreement provided along with the application?



If no, does a negotiated indirect cost agreement exist to determine the base rate of this cost, and does the application show this cost as a direct cost in the budget?





1 States are defined as the 50 States and the District of Columbia throughout this RFA. Territories are not included.

2 Rowe, Gretchen, Sam Hall, Carolyn O’Brien, Nancy Pindus, Lauren Eyster, Robin Koralek, and Alexandra Stanczyk. “Enhanced Supplemental Nutrition Assistance Program (SNAP) Certification: SNAP Modernization Efforts, Final Report: Volume I.” Washington, DC: The Urban Institute, July 2010.



3 Although the pilot may reduce staff workload, the state will need to be able to utilize its staff efficiently during the pilot period. Also keep in mind that after the pilot ends, the state will resume interviewing clients as before and will need staff to transition back to that work.

4 All else equal, FNS will give preference to states that complete more QC-like reviews. See the Evaluation of Grant Application Criteria section for more detail on scoring. QC-like reviews are reviews that are conducted via telephone, instead of in person.


