Direct Verification Evaluation Study



Supporting Statement for Paperwork Reduction Act Submission



Order #AG-3198-D-06-0060







July 17, 2007







Prepared for

Sheku Kamara

USDA/FNS/OANE

3101 Park Center Drive

Alexandria, VA 22302



Prepared by

Nancy Cole

Christopher Logan









Contents


Part A Justification

A.1 Explanation of Circumstances That Make the Collection of Information Necessary

A.2 How the Information Will Be Used, By Whom, and For What Purpose

A.3 Use of Improved Information Technology to Reduce Burden

A.4 Efforts to Identify and Avoid Duplication

A.5 Efforts to Minimize Burden on Small Businesses or Other Entities

A.6 Consequences if Data Collection Is Not Conducted or Is Conducted Less Frequently

A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

A.8 Federal Register Comments and Efforts to Consult with Persons Outside the Agency

A.9 Payments to Respondents

A.10 Assurance of Confidentiality

A.11 Questions of a Sensitive Nature

A.12 Estimates of Respondent Burden

A.13 Estimates of Other Annual Costs to Respondents

A.14 Estimates of Annualized Government Costs

A.15 Reasons for Program Changes or Adjustments

A.16 Time Schedule and Plans for Tabulation, Analysis, and Publication

A.17 Display of Expiration Date for OMB Approval

A.18 Exceptions to Certification Statement

Part B Collection of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

B.2 Information on Collection Procedures

B.3 Methods to Maximize Response Rates

B.4 Tests of Procedures

B.5 Individuals Consulted on Statistical Aspects of the Design


References



Appendix A Legislative Authority for Research


Appendix B State Agency Interview Guides


Appendix C LEA Informational Brochure


Appendix D LEA Brochure for Verification Nonresponse Data Collection


Appendix E Local Education Agency (LEA) Survey


Appendix F LEA Telephone Interview Guide


Appendix G Federal Register Notice



Part A
Justification

A.1 Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

The Food and Nutrition Service (FNS) of the U.S. Department of Agriculture (USDA) is requesting clearance for data collection for the Direct Verification Evaluation Study. FNS is responsible for the development and implementation of national policy for the National School Lunch Program (NSLP) and School Breakfast Program (SBP). These responsibilities include promulgating regulations, monitoring State operations, reviewing and reimbursing State and local expenditures, and evaluating the programs. At the State level, administration of the NSLP and SBP is the responsibility of State Child Nutrition (CN) directors. Local education agencies (LEAs) determine and verify eligibility and provide meal benefits.


Each year, LEAs are required by law to verify a sample of applications for free and reduced-price NSLP meals. A household submits a single application for all children enrolled in the school district. The conventional method of verification is to contact households and request documentation of household income eligibility. Prior to 2004, LEAs were authorized to use data from Food Stamp (FS) and Temporary Assistance for Needy Families (TANF) programs for direct verification of FS/TANF case numbers provided on NSLP applications in lieu of income information.


The Child Nutrition and WIC Reauthorization Act of 2004 (P.L. 108-265) expanded the definition of direct verification as follows:


“to verify eligibility for free or reduced price meals for approved household applications selected for verification, the local educational agency may (in accordance with criteria established by the Secretary) first obtain and use income and program participation information from a public agency administering—


  1. the food stamp program established under the Food Stamp Act of 1977 (7 U.S.C. 2011 et seq.);

  2. the food distribution program on Indian reservations established under section 4(b) of the Food Stamp Act of 1977 (7 U.S.C. 2013(b));

  3. the temporary assistance for needy families program funded under part A of title IV of the Social Security Act (42 U.S.C. 601 et seq.);

  4. the State medicaid program under title XIX of the Social Security Act (42 U.S.C. 1396 et seq.); or

  5. a similar income-tested program or other source of information, as determined by the Secretary.”


Under the policy guidance issued by FNS, information from the State Children’s Health Insurance Program (SCHIP) may be used for direct verification, provided that information on household size and income or household poverty level is used in States where the SCHIP income limit is greater than 133 percent of the Federal poverty level (SP-32-2006). In this document, the term “Medicaid” includes SCHIP.

P.L. 108-265 required an evaluation of the feasibility and effectiveness of direct verification by June 2007, and provided for expanded direct verification as follows:


“(v) EXPANDED USE OF DIRECT VERIFICATION.—If the Secretary determines that direct verification significantly decreases the portion of the verification sample that must be verified under subparagraph (G), while ensuring that adequate verification information is obtained, and can be conducted by most State agencies and local educational agencies, the Secretary may require a State agency or local educational agency to implement direct verification through 1 or more of the programs described in clause (i), as determined by the Secretary, unless the State agency or local educational agency demonstrates (under criteria established by the Secretary) that the State agency or local educational agency lacks the capacity to conduct, or is unable to implement, direct verification.”


Section 105 of P.L. 108-265 appears in Appendix A.


FNS invited States to participate in a pilot implementation of direct verification using Medicaid data (DV-M) in SY2006-07. Five States volunteered and were included in the study: Indiana, Oregon, South Carolina, Tennessee, and Washington. FNS also contracted with Abt Associates Inc. to conduct interviews with State agencies, select LEAs to include in the first year pilot, analyze data from the States, and prepare a report to Congress.


The pilot study conducted in SY2006-07 was limited in scope. Indiana and Tennessee implemented DV-M on a statewide basis. Oregon and Washington implemented DV-M for a sample of LEAs. South Carolina was unable to implement DV-M because of delays in obtaining Medicaid data. Valid measures of DV-M effectiveness were obtained only from Indiana and Tennessee, because of critical data problems in the other States during the pilot study.


To fully meet the Congressional mandate for an evaluation of DV-M implementation and effectiveness, FNS is requesting clearance for data collection during the 2007-08 school year to evaluate direct verification on a larger scale.


Study Objectives

In 2007-2008, the Direct Verification Evaluation Study will expand on the limited pilot study conducted in SY2006-07. FNS has recruited two additional States to participate in the study, along with the five States that participated in SY2006-07. (The Federal Register Notice was based on the expectation that eight States would participate in 2007-2008, but only two additional States were recruited).


The proposed study will accomplish three main objectives:


  1. Provide information on different methods of DV-M implementation in SY2007-08, the challenges of implementation, and the lessons learned from implementation experiences in seven States.


  2. Provide estimates of the effectiveness of DV-M in SY2007-08 in States with different Medicaid income eligibility limits. Measures of effectiveness will include:

    1. State-level estimates of the percentage of NSLP verification samples directly verified with Medicaid data

    2. State-level estimates of the following time and cost measures:

      1. the average time and dollar cost of direct verification activities at the local level;

      2. the average cost of verifying an NSLP application using conventional household verification; and

      3. the average cost saving attributable to direct verification.

    3. State-level estimates of measures of LEA perceptions regarding the direct verification process (e.g., the percentage of LEAs reporting DV-M as easy versus difficult, and useful versus not useful)


  3. Provide estimates of the impact of DV-M on households’ nonresponse to NSLP verification requests. This analysis will be based on retrospective administrative data. The evaluation contractor will match NSLP application data for SY2006-07 verification nonresponders with 2006 data obtained from State Medicaid Agencies.


A.2 Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

Information will be collected to assess methods of DV-M implementation and to provide estimates of DV-M effectiveness. To address these information needs, FNS proposes to conduct four data collection activities: interviews with State CN and Medicaid agencies; survey of LEAs; administrative data collection from LEAs and State agencies; and telephone interviews with LEAs.


The study will provide FNS with State-level estimates of DV-M effectiveness in seven States, based on data collected from a probability sample of LEAs in each State.


State Child Nutrition and Medicaid Agency Interviews

Interviews will be conducted with officials of State Child Nutrition and Medicaid agencies in each State in November/December 2007 to collect information about how DV-M implementation was carried out.


Interviews will be conducted by telephone unless the extent of information to be collected makes in-person interviews necessary. This approach maximizes the flexibility to schedule interviews at times convenient for the respondents. The State agency interview guides are provided in Appendix B.


LEA and State Administrative Data

Administrative data will be collected from two separately selected samples of LEAs for two separate analyses. LEAs selected for the LEA Survey (described below) will be asked to provide copies of NSLP applications that are directly verified for SY2007-08, and documentation produced by the direct verification process (such as printouts of computer screens). These documents will allow the contractor to verify the count of directly verified students and applications.1 This data collection is described in the LEA informational brochure (to be sent to LEAs during recruitment), included in Appendix C. This data collection is also described in the LEA Survey (discussed below), included in Appendix E.


A second sample of LEAs will be asked to provide administrative data from SY2006-07. These LEAs will provide copies of NSLP applications for households sampled for verification but not responding to verification requests in SY2006-07. The contractor will match these NSLP applications to administrative data from State Food Stamp (FS) and Medicaid agencies. The matches will be based on student identifying information available on the NSLP application (including student name, age, address, and telephone number), using software for probabilistic matching. Match rates will provide estimates of the percentage of verification nonresponders that could have been directly verified. Tennessee and Washington will not be included in this data collection and analysis, because DV-M was successfully implemented and used by LEAs in SY2006-07.2 The brochure with instructions for the verification nonresponse data request is included in Appendix D.


For the verification nonresponse analysis, the contractor will obtain files of food stamp and Medicaid data from State agencies. The files will include child names and identifiers (corresponding to the data on the NSLP application) for children enrolled in those programs in September 2006. The Medicaid data will include family size and income data, needed to determine NSLP eligibility level. The confidentiality and security of these data will be protected by data sharing agreements.
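To illustrate the general logic of the probabilistic matching described above, a minimal sketch in Python follows. The field names, similarity weights, and acceptance threshold are hypothetical assumptions for illustration; they do not describe the contractor's actual matching software.

```python
# Illustrative sketch only: a simplified probabilistic match between NSLP
# application records and a State Medicaid extract. Field names, weights, and
# the threshold are hypothetical, not the study's actual matching parameters.
from dataclasses import dataclass
from difflib import SequenceMatcher


@dataclass
class StudentRecord:
    name: str
    age: int
    address: str
    phone: str


def field_similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity score for two text fields."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()


def match_score(app: StudentRecord, medicaid: StudentRecord) -> float:
    """Weighted combination of field similarities (weights are illustrative)."""
    score = 0.0
    score += 0.45 * field_similarity(app.name, medicaid.name)
    score += 0.25 * field_similarity(app.address, medicaid.address)
    score += 0.15 * field_similarity(app.phone, medicaid.phone)
    score += 0.15 * (1.0 if abs(app.age - medicaid.age) <= 1 else 0.0)
    return score


def match_rate(applications, medicaid_file, threshold=0.85):
    """Share of nonresponder applications with at least one probable Medicaid match."""
    matched = sum(
        1 for app in applications
        if any(match_score(app, m) >= threshold for m in medicaid_file)
    )
    return matched / len(applications) if applications else 0.0
```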


LEA Survey

A sample of LEAs will be surveyed in each State to collect information about SY2007-08 verification activities. The survey will include two data collection forms on a total of three pages. The Direct Verification Report will collect information about the timing of verification activities, the number of applications and students in verification samples, the number of applications and students directly verified, the number of applications and students for nonresponding households, and officials’ perceptions of the direct verification process. The Time and Cost Report will collect information about staff time spent on direct verification and conventional household verification. The Time and Cost Report will also collect information about staff wages and salaries for use in computing the cost of verification.
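As a simple illustration of how staff time and wage items of the kind collected on the Time and Cost Report could be combined into the cost measures listed in Section A.1, consider the sketch below. The formula and the figures are assumptions for illustration only, not the study's actual cost model.

```python
# Illustrative arithmetic only; the study's actual cost formulas are not
# reproduced here. Inputs correspond to the kinds of items the Time and Cost
# Report is described as collecting (staff hours and hourly wages).
def cost_per_application(staff_hours: float, hourly_wage: float,
                         applications_processed: int) -> float:
    """Average labor cost of processing one application."""
    return (staff_hours * hourly_wage) / applications_processed


# Example with made-up numbers: 10 staff hours at $18/hour across 60 directly
# verified applications vs. 40 hours at $18/hour across 60 household verifications.
direct_cost = cost_per_application(10, 18.00, 60)      # $3.00 per application
household_cost = cost_per_application(40, 18.00, 60)   # $12.00 per application
saving_per_application = household_cost - direct_cost  # $9.00 per application
```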


The survey forms will be available for completion on paper or via the Web, at LEA option. LEAs are required by law to complete verification activities by November 15. The Web survey will open for responses on November 15, 2007, and the due date for survey completion will be November 30, 2007. The LEA Survey is included in Appendix E.


LEA Telephone Interviews

Telephone interviews will be conducted with LEAs at the conclusion of verification activities to collect in-depth information about the direct verification process at the local level. Abt Associates Inc. will invite all LEAs selected for the LEA Survey to participate in telephone forums. Two forums will be scheduled for each State, and up to four LEAs will be included in each forum. (If more than eight LEAs wish to participate in a State, a third forum will be scheduled.) The forums will be conducted as round-table discussions with a moderator encouraging participation by all LEAs. The guide for the LEA telephone interviews is provided as Appendix F.


A.3 Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The data collection methodology for this study was designed to minimize burden and allow for flexibility to meet the needs of different respondents. The LEA survey will be distributed to sampled LEAs by mail. LEAs will have the option to complete the survey on paper and return it by mail, or to log in to a secure Web site and complete the survey electronically. Both the mail and Web-based versions of the survey will provide respondents the opportunity to explain situations that do not fit into predefined response categories.


The contractor will enter data from the mail surveys into the Web-based system so that all responses are maintained in a single database. The Web-based survey will be programmed to check for missing data, incorrect skip patterns, and inconsistent responses.
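The following minimal sketch illustrates the kinds of edit checks described above (missing data, skip patterns, and internal consistency). The field names are hypothetical and do not correspond to the actual survey items in Appendix E.

```python
# Minimal sketch of the kinds of edit checks described above. Field names
# (uses_dv, students_sampled, students_directly_verified) are hypothetical.
def validate_response(resp: dict) -> list:
    errors = []
    # Missing-data check
    for field in ("students_sampled", "students_directly_verified"):
        if resp.get(field) is None:
            errors.append(f"{field} is missing")
    # Skip-pattern check: direct verification items apply only if the LEA used DV-M
    if resp.get("uses_dv") == "no" and resp.get("students_directly_verified"):
        errors.append("direct verification count reported but LEA indicated no DV-M use")
    # Consistency check: cannot directly verify more students than were sampled
    s, d = resp.get("students_sampled"), resp.get("students_directly_verified")
    if s is not None and d is not None and d > s:
        errors.append("students directly verified exceeds students sampled")
    return errors


# Example usage: this response fails the consistency check.
print(validate_response({"uses_dv": "yes", "students_sampled": 50,
                         "students_directly_verified": 60}))
```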


LEAs sampled for the verification nonresponse data collection will have two options for submitting the requested NSLP applications. One option will be to submit paper photocopies of applications. The other option will be to submit a printout or data file from an automated application database. LEAs will be able to choose the option that is easiest for them.


A.4 Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

This study does not duplicate prior research. Information does not currently exist to satisfy the data requirements of FNS regarding the methods of DV-M implementation and DV-M effectiveness.


One prior FNS study, Data Matching in the National School Lunch Program (Cole and Logan, 2007a), collected information about direct verification practices, although the main focus of that study was data matching for direct certification. Through a survey of State CN Agencies conducted in the fall of 2005, that study found that at least some school districts in 43 States directly verified NSLP applications containing FS/TANF case numbers, but use of computerized data matching for direct verification was rare. Only four States had an automated system for school districts to verify FS/TANF case numbers, and two additional States provided FS/TANF data to school districts for direct verification. Direct verification with Medicaid data was authorized in July 2004 and was not in use at the time of data collection for the Data Matching study.


FNS recruited States to conduct direct verification pilot tests in SY2006-07. Five States volunteered for these pilots, but only two States successfully implemented DV-M in SY2006-07: Tennessee and Washington. It is not possible to generalize results from two States to the nation as a whole. In addition, results from Tennessee represent a lower bound for the potential effectiveness of DV-M, because Tennessee has the lowest Medicaid income eligibility limit for school-age children among all States (100 percent of the federal poverty level).3 The limited information obtained from two States does not satisfy FNS data needs with respect to the mandates of P.L. 108-265.


A small number of administrative data items collected by the Direct Verification Evaluation will duplicate another data collection. The study will ask LEAs to report (a) the number of applications and students sampled for verification, (b) the number of applications and students directly verified, and (c) the number of applications and students for nonresponding households. Items (a) and (c) are reported on the Verification Summary Report (VSR) (OMB No. 0584-0026). LEAs submit the VSR to their State CN agency by March 1 of each year, and State data files (including corrected LEA data) are due to FNS by April 15. These data will not be available within the study's reporting schedule, so the study must collect them directly.


A.5 If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.

Local education agencies will be contacted for this study. The burden on LEAs will be minimized by use of short data collection forms of no more than 3 pages and the option to complete data collection via the Web.


A.6 Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

The data collection for the proposed study will be conducted only one time. If the data are not collected, FNS will not have the information it needs to address the study objectives and research questions outlined in Section A.2. For example, FNS will not be able to assess the effectiveness of NSLP direct verification, and FNS will not be able to disseminate information about DV-M implementation to assist agencies that do not currently use DV-M.




A.7 Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


There are no special circumstances requiring the collection of information in a manner inconsistent with the following components of Section 1320.5(d)(2) of the Code of Federal Regulations:


  • No study respondent will be required to report information to the agency more often than quarterly;

  • No study respondent will be required to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • No study respondent will be required to submit more than an original and two copies of any document;

  • No study respondent will be required to retain any study-specific records for more than three years;

  • No collection of information associated with the study will use a statistical data classification that has not been reviewed and approved by OMB;

  • Pledges of confidentiality will apply to personal opinions, salary information, and administrative records collected under assurances of confidentiality. The pledges of confidentiality will be consistent with the Congressional mandate for conducting the study, and supported by the data security protections described in Section A.10.

  • No collection of information associated with the study will require respondents to submit proprietary, trade secret, or other confidential information.


With regard to a statistical survey and the generalizability of its results {1320.5(d)(2)(v)}, the LEA survey will be conducted with a statistical sample of LEAs in each State. The survey is designed to produce valid and reliable results for each State.


For the proposed LEA telephone interviews, the selection of LEAs will be purposive rather than random. Data from interviews with these respondents will not be treated as statistically representative or generalizable to the universe of LEAs in each State. The interviews will provide examples of local experience with DV-M and enhance understanding of the statistical data.


A.8 If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

The Federal Register Notice regarding this study was published in the Federal Register, May 8, 2007 (Vol. 72, No. 88), and specified a 60-day comment period ending July 9, 2007.


Federal Register Comments

One comment was received in response to the Federal Register Notice. That comment, included below, supports NSLP verification efforts and did not lead to any study revisions:


“all methods of finding out the resources of the applicant must be used. we have a whole bunch of cheats and freeloaders trying to exist on american taxpayers these days. they sneak across our borders and apply for every freebie going, free health care, free education, free welfare, free school lunches.


it is clear that those who should be getting the free should be entitled and all methods of verificiation should be used. we need to deport from this country every single illegal criminal immigrant who sneaked across our border flouting our laws. they are invaders.they in fact show contempt for our laws. they need to be found out and deported to their own countries.”



Consultations with Individuals Outside the Agency

Consultations on the research design, data needs, survey content, and survey protocol have occurred. The purposes of such consultations were: to ensure the soundness of the technical approach, to avoid collection of information being gathered elsewhere or already available, to reduce survey burden, and to match data collection with agency and program needs.


Individuals who contributed to these consultations include the following employees of the contractor, Abt Associates Inc.:


Nancy Cole, Ph.D.

Project Director

Abt Associates Inc.

55 Wheeler St.

Cambridge, MA 02138

Phone: (617) 349-2820

E-mail: [email protected]


Christopher Logan

Abt Associates Inc.

55 Wheeler St.

Cambridge, MA 02138

Phone: (617) 349-2821

E-mail: [email protected]

David Hoaglin, Ph.D.

Abt Associates Inc.

55 Wheeler St.

Cambridge, MA 02138

Phone: (617) 349-2814

E-mail: [email protected]



Materials about the study and the draft survey instruments were reviewed and approved by the Abt Associates Inc. Institutional Review Board (IRB). The Abt IRB contact person is:


Marianne Beauregard

Abt Associates Inc.

55 Wheeler St.

Cambridge, MA 02138

Phone: (617) 349-2852


Materials about the study and the draft survey instruments were sent to FNS for review and comment. The FNS project officer for this study is:


Sheku Kamara, Ph.D.

Office of Analysis, Nutrition and Evaluation

3101 Park Center Dr.

Alexandria, VA 22302

Phone: (703) 305-2130


During the first year of the pilot study, the potential availability of desired information was discussed with study liaisons from State CN Programs. These discussions were used to refine the data collection plans and clarify the terminology and phrasing of survey questions. The LEA survey instruments were used by State agencies for data collection during the first year of the pilot study. The persons who reviewed and commented on instruments include:


John Todd

Financial Manager/Systems Analyst

Indiana Department of Education

151 West Ohio Street

Indianapolis, IN 46204-1905

Phone: (317) 232-0865

E-mail: [email protected]


George C. Sneller, Director

Child Nutrition Services

Office of Superintendent of Public Instruction

234 8th Avenue

Olympia, WA 98504

Phone: (360) 725-6200

E-mail: [email protected]

Vivian B. Pilant, Ph.D., R.D.

Director, Office of School Food Services and Nutrition

South Carolina Dept. of Education

1429 Senate Street

Columbia, SC 29201

Phone: (803) 734-8195

E-mail: [email protected]

Sarah White, State Director

School Nutrition Program

Tennessee Dept. of Education

1240 Foster Avenue

Nashville, TN 37243-0389

Phone: (615) 532-4714

E-mail: [email protected]



Ms. Joyce Dougherty, Director

Child Nutrition & Food Distribution

State Department of Education

Public Services Building

255 Capitol Street NE

Salem, OR 97310-0203

Telephone: 503-947-5888

E-mail: [email protected]




A.9 Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

Financial incentives will not be offered to respondents to the State interviews, LEAs that complete the survey and administrative data collection, or LEAs recruited for telephone interviews.


A.10 Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

The interviews and survey will collect two kinds of confidential data: opinions of LEA personnel, and salaries of State and LEA personnel. In addition, the study will collect confidential administrative data consisting of copies of NSLP applications that are directly verified in SY2007-08 (sample #1) and copies of NSLP applications for nonresponding households in SY2006-07 (sample #2).


The individuals and LEAs participating in this study will be assured that the information they provide will not be released in a form that identifies them. No identifying information will be attached to any reports or data supplied to USDA or any other researchers.


Abt Associates Inc. has extensive experience in data collection efforts requiring strict procedures for maintaining the confidentiality, security, and integrity of data reported by State and school district personnel. The following data handling and reporting procedures will be employed to maintain the privacy of survey participants and the security of composite electronic files.


  • All project staff, both permanent and temporary, will be required to sign a confidentiality and non-disclosure agreement. In this agreement, staff pledge to maintain the confidentiality of all information collected from respondents (including memoranda, manuals, and questionnaires) and not to disclose it to anyone other than authorized representatives of the evaluation.

  • LEAs will be provided prepaid labels and envelopes to ship confidential material to the central office by Federal Express. Regular mail will not be used to ship any material containing confidential information.

  • LEAs will be assured in writing that the confidential information contained on NSLP applications will be maintained in locked facilities and will be destroyed at the conclusion of the study. This assurance is contained in the study brochures that will be sent to LEAs with instructions for administrative data transmittal. (The brochures for samples #1 and #2 are included in Appendices C and D, respectively.)

  • LEA respondents will be assured that they will not be personally identified in any study publications.

  • State administrative records of food stamp and Medicaid recipients will be shipped by secure and traceable means, and safeguarded at Abt Associates in accordance with protocols to be established in data sharing agreements with the participating States.

  • Once in the central office, documents containing confidential information will be kept in locked file cabinets. At the close of the study, such documents will be shredded.

  • Any respondent-identifying information will be contained only in a master list to be created and protected in secure storage, to which only a limited number of project staff pledged to maintain confidentiality will have access.


In addition, the evaluation contractor has established a number of procedures to ensure the confidentiality and security of electronic data in its offices during the data collection and processing period. Standard backup procedures for the central office computer system protect project data from user error or disk or other system failure. Backups and inactive files are maintained on tape or compact disks. The system servers are maintained inside a secure locked area accessible only to authorized systems personnel. Files will be accessible only by authorized personnel who have been provided project logons and passwords. Access to any of the study files (active, backup, or inactive) on any network multi-user system will be under the central control of the database manager. The database manager will ensure that network partitions used for the study are appropriately protected from access by unauthorized users (by password access, encryption, and protected or hidden directory partitioning). All organizations using data on study participants will maintain security, virus, and firewall technology to monitor for any unauthorized access attempts and any other security breaches.



A.11 Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

No questions that are part of this study deal with sensitive topics.


A.12 Provide estimates of the hour burden of the collection of information. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.


The study will collect information through interviews with officials of State Child Nutrition Programs and State Medicaid Agencies in seven States; a survey of LEAs in seven States; administrative data collection from LEAs in seven States; and telephone forums with LEAs in seven States. Exhibit A-1 presents the burden estimates.


Exhibit A-1


Estimates of Respondent Burden



                                                      No. of        No. of responses   Hours per   Total
Data Collection Activity                              respondents   per respondent     response    hours

State Child Nutrition Agency Initial Interview              7              1              1.50       10.5
State Child Nutrition Agency Follow-up Interview            7              1              2.50       17.5
State Medicaid Agency Interview                             7              1              1.25        8.75
State Medicaid Agency Follow-up Interview                   7              1              1.25        8.75
Local Ed. Agency Administrative Data Collection           350              1              0.50      175
Local Education Agency Survey                             205              1              0.50      102.5
Local Education Agency Interview                           56              1              1.00       56

Total                                                     364                                       379
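Each entry in the total-hours column is the product of the number of respondents, the number of responses per respondent, and the hours per response. The short sketch below simply reproduces that arithmetic for the figures in Exhibit A-1.

```python
# Burden arithmetic from Exhibit A-1: total hours per instrument equal
# respondents x responses per respondent x hours per response.
instruments = [
    ("State CN Agency Initial Interview",          7,   1, 1.50),
    ("State CN Agency Follow-up Interview",        7,   1, 2.50),
    ("State Medicaid Agency Interview",            7,   1, 1.25),
    ("State Medicaid Agency Follow-up Interview",  7,   1, 1.25),
    ("LEA Administrative Data Collection",         350, 1, 0.50),
    ("LEA Survey",                                 205, 1, 0.50),
    ("LEA Interview",                              56,  1, 1.00),
]
total_hours = sum(n * r * h for _, n, r, h in instruments)
print(total_hours)  # 379.0 hours, matching the total in Exhibit A-1
```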


A.13 Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

  • The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

  • Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There are no capital or start-up costs, and no ongoing operation and maintenance costs associated with collecting the information for this study. Other than their time to participate, there are no direct monetary costs to respondents.


A.14 Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.

Estimated costs for this collection of information include the costs of (a) Federal government employee staff time and (b) contractor services for designing the data collection, collecting data, analyzing data, and reporting on the data collection.


Federal Employee Staff Time

Planning of the evaluation, involving Federal government staff time, started in March 2006. The entire study is expected to last until June 30, 2008, culminating in a final report. This amounts to 10 months (44 weeks) in calendar year 2006, 12 months (39 weeks) in calendar year 2007, and 6 months (26 weeks) in calendar year 2008. The key Federal staff member responsible for supervising the study contractor is the Project Officer (Social Science Research Analyst – GS 13), who will spend, on average, 8 hours a week (0.20 FTE) on the project. Limited consultation is expected from the FNS Child Nutrition Division program staff at an average of 2 hours a week (0.05 FTE) from March 2006 through the study’s duration. For the duration of the study, the cost to the Federal Government is $20,680 in CY 2006 and $18,330 in CY 2007, for a total cost of $39,010. The annualized costs to the Federal Government throughout the duration of the study are shown on Table 3.


Contractor Services

The estimated cost to the federal government of the LEA data collection is $390,000. This includes the cost of developing the survey instruments, providing the instruments to States for use in the first year of the pilot, providing supporting materials for OMB clearance, administering the survey, processing the administrative data, and preparing the final report.


The estimated cost to the federal government of the State and LEA interviews is $175,000. This includes the cost of developing the topic guides, conducting interviews in the first year of the pilot and testing the topic guides, conducting the interviews in the second year, and preparing the final report.


A.15 Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.

This is a new collection. As such, all of the costs itemized in Items 13 and 14 represent program changes. The reason for these changes is the passage of Public Law 108-265, which mandated that the USDA collect this information.



A.16 For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

This section describes the schedule for the project, along with plans for tabulation, analysis, and publication of study results.


Study Schedule

Exhibit A-2 lists the schedule for the entire project, including beginning and ending dates for the collection of information, completion of reports, and publication dates. The study began in June 2006 and is expected to conclude in June 2008, for a total study period of 24 months.


This OMB request is for data collection activities spanning a period of six months. LEA administrative data for SY2006-07 and SY2007-08 will be collected in October and November 2007; the LEA survey will be administered in November 2007; and State and LEA interviews are scheduled for December 2007 and January 2008. Clearance for this information collection is requested for one year.



Exhibit A-2


Schedule for Data Collection Activities and Project Deliverables


Activity                                                           Time Schedule

Assist five States in assessing implementation plans and
  evaluating results of first year of DV-M implementation         June to December 2006
Prepare Direct Verification Pilot Study: First Year Report        January to April 2007
Prepare Federal Register notice (Appendix G)                      Published May 8, 2007
Revise data collection plan and instruments                       April to May 2007
Conduct State Agency interviews                                   May to June 2007
Sample LEAs for data collection                                   June 2007
Submit information clearance package to OMB                       July 2007
Collect administrative data                                       October to December 2007
Conduct LEA survey                                                 November to December 2007
Conduct LEA telephone interviews                                  December 2007
Conduct final State Agency telephone interviews                   December 2007 to January 2008
Analyze survey data and prepare tabulations                       January to February 2008
Prepare final report                                              February to April 2008
  • Deliver first draft of final report                           March 28, 2008
  • Deliver second draft of final report                          May 9, 2008
  • Deliver final report                                          June 15, 2008
Deliver briefing                                                  June 2008
Prepare and submit data and documentation                         June 30, 2008


Analysis Plans

The main objectives of the study are (1) to increase understanding of the methods of DV-M implementation; (2) to provide estimates of the percentage of NSLP verification samples that may be directly verified with data from State Medicaid Agencies; (3) to provide estimates of the time and cost of direct verification at the local level; (4) to increase understanding of local-level acceptance of DV-M; and (5) to provide estimates of the impact of DV-M on nonresponse to verification requests.


Statistical analysis will be conducted on the LEA survey data, and descriptive statistics will be presented in graphs and tabular format. Summary tables and graphs will show the percentages of LEAs using DV-M in each State, the percentages of applications directly verified in each State, the percentages of nonresponding households for which applications could be directly verified, and LEA perceptions of direct verification (e.g., ratings of how easy and how useful it was, on a scale of 1 to 5, and whether the LEA will use it again). Tables and graphs will be accompanied by clear, non-technical discussions of their interpretation.
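As an illustration of how a State-level percentage might be estimated from the probability sample of LEAs, a minimal sketch follows. The indicator values and sampling weights are invented for illustration; the study's actual weighting and estimation procedures are not reproduced here.

```python
# Illustrative sketch of a design-weighted State-level estimate from a
# probability sample of LEAs. The variable names and weights are hypothetical.
def weighted_percentage(values, weights):
    """Weighted percentage of LEAs with value 1 for a yes/no indicator."""
    total_weight = sum(weights)
    if total_weight == 0:
        return 0.0
    return 100.0 * sum(v * w for v, w in zip(values, weights)) / total_weight


# Example: indicator of whether each sampled LEA rated DV-M as "easy" (1 = yes),
# with weights equal to the inverse of each LEA's selection probability.
rated_dvm_easy = [1, 0, 1, 1]
sampling_weights = [4.0, 4.0, 2.5, 2.5]
print(weighted_percentage(rated_dvm_easy, sampling_weights))  # about 69.2 percent
```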


Information from interviews will be presented as case studies, in narrative form, describing the development and implementation of direct verification systems. This information will be synthesized with the survey results to frame the study’s conclusions.


Publication of Study Results

The study’s findings will be presented in three products:

  • Final report for the project, including a stand-alone executive summary describing study objectives, research approach, and major findings.

  • Pamphlet, prepared for distribution to State Child Nutrition and Medicaid agencies. The pamphlet will be 8 to 10 pages and contain the same information as the stand-alone executive summary, formatted with a pamphlet layout and printed as a color glossy publication.

  • Journal article, presenting the findings of the study, and prepared for an academic audience.


A.17 If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


FNS is not seeking approval to omit the expiration date. The OMB approval number and expiration date will be displayed on all data collection instruments. A statement identifying the public reporting burden associated with each request will be included on each data collection instrument.



A.18 Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.

There will be no exception to the certification statement identified in Item 19 of OMB Form 83-I.

1 LEAs may directly verify all students listed on an application if they match any students on the application to Medicaid data. However, LEAs may count only those students who were actually matched to Medicaid data. Administrative data are being collected to ensure accurate counts of students directly verified.

2 Although Washington did not implement DV-M statewide in 2006-2007, a substantial number of large LEAs used DV-M and thus would be ineligible for this data collection. As a result, the pool of LEAs eligible for sampling is not representative of all LEAs in the State. While Indiana made DV-M available statewide in 2006-2007, LEA participation was low enough that a sufficient pool of eligible LEAs is available.

3 In 19 States, the Title XIX Medicaid income limit for school-aged children is 100 percent of the Federal poverty level, but the income limit for the separate SCHIP program is 140 percent or greater. If these States used only Medicaid data and not SCHIP data for direct verification, the expected results would be similar to those from Tennessee.

