CACFP Improper Payments Data Collection Pilot Project

CACFP FDCH Pretest Evaluation

OMB: 0584-0549

APPENDIX G




MEMORANDUM

P.O. Box 2393
Princeton, NJ 08543-2393
Telephone (609) 799-3535
Fax (609) 799-0005
www.mathematica-mpr.com



TO: Fred Lesnett, FNS



FROM: Rhoda Cohen, Lara Hulsey, and Barbara Harris

DATE: 8/17/2007

CACFP-008


SUBJECT: CACFP FDCH Pretest Evaluation




I. INTRODUCTION

The Child and Adult Care Food Program (CACFP) subsidizes nutritious meals served to children and older adults in day care facilities, emergency shelters, and after-school programs. To assist in meeting legislative requirements to estimate the annual amount of erroneous payments, the U.S. Department of Agriculture (USDA), Food and Nutrition Service (FNS) contracted with Mathematica Policy Research, Inc. (MPR) to explore the feasibility and validity of four methods for validating the meal claims submitted by Family Day Care Homes (FDCHs) in the CACFP. This memorandum discusses the experiences and results of a pretest of the methods and makes recommendations for the upcoming pilot.



A. Background

Administered by FNS, the CACFP plays an important role in providing children and the elderly with access to adequate food, improving the quality of day care, and making it more affordable for low-income families. The program subsidizes nutritious meals and snacks served to children and older adults in participating day care facilities and to children in emergency shelters and eligible after-school programs. The providers of care are reimbursed at a fixed rate for each qualifying meal they serve to program participants.


In recent years, there have been concerns about program oversight and accountability in the family day care component of the CACFP. The integrity of the program depends in large part on the accuracy of meal claims submitted for reimbursement and the effectiveness of procedures for verifying this information. Evidence from state and federal program reviews, audits, and investigations has suggested a significant number of inaccurate claims and other problems resulting in improper payments (U.S. Department of Agriculture 1999). This evidence has raised concerns at USDA, in Congress, and among program participants and other stakeholders.


Under the Improper Payments Information Act (IPIA) of 2002 (Public Law 107-300), USDA must identify and reduce improper or erroneous payments in various food and nutrition programs, including the CACFP. Erroneous payments in the CACFP arise when program sponsors and individual providers submit inaccurate meal counts and claims for meal and snack reimbursements. Erroneous payments can be caused by misclassification of the eligibility status of providers and the children in their care due to administrative errors or misreporting. Payment errors can also result when program sponsors claim reimbursement for unallowable administrative costs.


To comply with the requirements of the IPIA legislation, USDA will need to estimate the annual amount of erroneous payments in the CACFP to determine if the amount is significant. A full assessment of the rate of erroneous payments is a complex undertaking, because reimbursement and eligibility requirements vary for different components of the program. FNS recently initiated two CACFP program assessment projects that will inform the development of strategies for measuring (1) erroneous payments to sponsoring organizations, and (2) erroneous payments that result from misclassification of eligibility status (Macaluso 2006; Smith 2006). In addition, FNS contracted with MPR to conduct the CACFP Improper Payments Data Collection Pilot Project.



B. The Pilot Study and Research Questions

The pilot study will contribute to the estimate of erroneous payments by evaluating different methods for estimating the true number of reimbursable meals provided to participating children by CACFP FDCH providers. The true number of meals can then be compared with the number of meals claimed by providers as part of the calculation of erroneous payments. Upon the conclusion of this study, FNS will undertake a follow-up evaluation that utilizes the recommended method to produce a national estimate of erroneous payments in the CACFP, to meet the reporting requirements of the IPIA.


Specifically, the pilot study will evaluate the validity, cost, and feasibility of implementing four different data collection methods for validating meal reimbursement claims, to identify a preferred method. For each of the methods tested, the study will address the following research questions:


  • What are the strengths of the method for validating the meal reimbursement claims submitted by FDCHs?

  • What are the weaknesses of the method for validating such claims? What, if any, steps could be taken to overcome these weaknesses?

  • What is the level of confidence that the estimates of erroneous payments developed from application of the method will meet the requirements of the IPIA?

  • What is the feasibility of administering the method on a national level? Could the method be administered on a national level at this time? What factors or events must be present for the projected level of feasibility to be met?

  • What is the potential cost of implementing the data collection method nationwide?

C. The Four Methods

The methods are designed to use different approaches to estimating the true number of reimbursable meals served to children. The goal is to identify a method that permits a valid comparison between the true number of reimbursable meals served and the number claimed for reimbursement. Discrepancies between the true number and the number claimed would be indicative of an erroneous payment.


  • Method 1: Parent/Guardian Recall of Children’s Attendance. In Method 1, parents or guardians are surveyed about their recollections of their children’s attendance at the FDCH during a target week. Their child’s attendance at an FDCH is considered an indication that a reimbursable meal or snack was received by the child. In addition, the survey asks parents which meals and snacks their child received at the FDCH each day during the target week.

  • Method 2: Sign-In/Sign-Out Logs. This approach collects data from sign-in/sign-out (SISO) logs maintained at FDCHs that track, by day, the time each child arrives at and departs from the FDCH. A child’s attendance during a mealtime will be considered an indication that the child received a reimbursable meal or snack. The attendance data will be used to generate an estimate of the number of meals served by type.

  • Method 3: Mixed Approach: Sign-In/Sign-Out Logs and Parental/Guardian Surveys. This method combines the use of parent recall with the SISO logs to determine whether more complete and accurate attendance data can be obtained by combining the two sources.

  • Method 4: Observational Data. This approach collects observational data on the specific meals and snacks provided to children at FDCHs, along with data on the characteristics of sponsors, FDCHs, and participating children. This data will be collected for a sample of FDCHs observed across their scheduled breakfast, lunch, supper, and snack times, and used to develop statistical models to project estimates of the actual number of meals served.

To estimate erroneous payments, the estimate of the number of meals served, by type, from the selected method, would be compared to the number of meals claimed for reimbursement by FDCHs. The goal of the current study is not to compute estimates of erroneous payments, but to evaluate the feasibility, validity, and cost of each method for measuring the number of meals served in the CACFP. The upcoming pilot test will fully assess each method. The recently completed pretest, while involving inadequate sample sizes for statistical tests, enables us to examine the feasibility and cost of each method and make some preliminary comments about their validity. Section II of this memorandum describes MPR’s experiences in collecting the four types of data, and Section III discusses findings based on the limited pretest data and makes recommendations for the pilot.
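For illustration only, the sketch below shows the kind of comparison this paragraph describes: an estimate of meals served, by type, set against the number claimed, with the difference valued at a per-meal rate. It is a minimal Python sketch; the reimbursement rates, meal counts, and function name are hypothetical placeholders, not CACFP figures or the study's actual computation.

    # Illustrative sketch only: compare meals claimed with an estimate of
    # meals served and value the difference at a per-meal rate. The rates
    # below are placeholders, not actual CACFP reimbursement rates.
    HYPOTHETICAL_RATES = {"breakfast": 1.00, "lunch": 2.00, "snack": 0.60}

    def claim_discrepancy(claimed, estimated_served):
        """Return meal-count differences and their dollar value by meal type."""
        result = {}
        for meal_type, rate in HYPOTHETICAL_RATES.items():
            diff = claimed.get(meal_type, 0) - estimated_served.get(meal_type, 0)
            result[meal_type] = {"meal_difference": diff,
                                 "dollar_value": round(diff * rate, 2)}
        return result

    # Example: an FDCH claims 25 lunches in the target week, but the selected
    # method estimates that 22 lunches were served.
    print(claim_discrepancy({"lunch": 25}, {"lunch": 22}))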



II. METHODS USED DURING PRETEST

This section describes the data collection procedures used in the pretest and MPR’s experiences implementing each method. Together with Section III, “Analysis and Recommendations,” it constitutes MPR’s proposed plans for conducting the pilot test. Subsection A reviews the methods tested in the pretest and the planned distribution of those methods across the family day care homes (FDCHs) participating in the pretest. Subsection B details MPR’s experiences in identifying pretest states, sponsors, and FDCHs, and implementing each method.



A. Data Collection Procedures

In the pretest, conducted between June 7 and July 25, 2007, MPR tested four methods, separately and in combination, to determine which have the greatest chance of success for validating meal reimbursement claims submitted by FDCHs. Each method is described above in Section I.C.


At the orientation meeting on January 12, 2007, MPR presented its plan for the pretest. The plan specified the number of FDCHs participating in the pretest and the methods to be employed with each home. At the meeting, the consensus was that MPR, in cooperation with FNS, would identify sponsors and FDCHs that used SISO logs, as well as other standard forms, to participate in the pretest. At that meeting, we suggested the distribution of methods as listed in Table II.1. This distribution of methods across the FDCHs would have resulted in parent recalls in five FDCHs, SISO logs in eight FDCHs, and on-site observations in eight FDCHs.


TABLE II.1

PLANNED DISTRIBUTION OF METHODS ACROSS FAMILY DAY CARE HOMES

Method             Description                                                    Number of FDCHs for Testing Each
                                                                                  Method or Combination of Methods
Method 1           Parent recalls (N = 2)                                         1
Method 2           SISO logs                                                      2
Method 3           Parent recalls (N = 3) and SISO logs                           2
Method 4           On-site observations                                           3
Methods 1 and 4    Parent recalls (N = 2) and on-site observations                1
Methods 2 and 4    SISO logs and on-site observations                             3
Methods 3 and 4    Parent recalls (N = 2), SISO logs, and on-site observations    1


For the pretest, MPR selected a convenience sample of states, sponsors, and FDCHs rather than selecting a random sample. Given the restrictions on the number of respondents that we faced during the pretest phase, we purposively selected states, sponsors, and FDCHs to capture variability in the types of FDCHs and the types of data that can be used to validate meal reimbursement claims.



1. Identifying Pretest States and Sponsors

MPR selected two states, New Jersey and New York, to include in the pretest because they had sponsors using SISO logs that could be used to test Methods 2 and 3. In cooperation with our contact in each state, MPR selected one sponsor on the basis of its willingness to cooperate, location, size, and availability of data. It was important to select sponsors willing to cooperate with MPR without conditions or requirements for their participation. We recognized that the pretest was to be completed on a brief schedule, and that valuable data collection time could be lost if lengthy negotiations were needed to enlist a sponsor’s cooperation. MPR also considered the locations of the sponsor’s FDCHs in the selection process. To minimize data collection costs, especially for evaluating Method 4, MPR selected FDCHs within 50 miles of the Princeton, New Jersey office. In addition, MPR considered sponsors’ size. Sponsors with a large number of FDCHs and heavy administrative burdens might be less able to cooperate with MPR on the evaluation schedule, even if they were willing to do so.


Once a state contact had been identified by FNS staff, MPR got in touch with that person to obtain a list of CACFP sponsors. This process appeared to work well. However, it was apparent that the state contacts did not fully understand initially what types of activities would take place during the project. MPR senior project staff provided that information.


MPR had discussions with state contacts to select appropriate sponsors. After selecting the sponsors, MPR contacted each one to describe the study and request a list of FDCHs. Both sponsors responded quickly to the request and plans for the pretest seemed to be on target. MPR then provided detailed information to the designated contact person at each sponsor about the project activities. The executive officer at one sponsor needed reassurance that the data collected would not impact the sponsor and that MPR would work to minimize burden on the sponsor and the FDCHs.


Following receipt of the detailed information about project activities, the other sponsor contacted MPR with a list of six conditions for their continued cooperation. The sponsor was adamant that the conditions be met before further discussions could take place. The conditions were as follows:


  1. A letter from MPR to the sponsor describing the study and listing all the activities that would take place regarding the FDCHs, parents, and children and including the items required from the sponsor.

  2. A letter from MPR to the FDCHs providing full details about every activity and data collection effort to be carried out with the providers and the parents.

  3. A letter to all parents of children in sampled FDCHs specifying all activities from observation to parent interviews, and including a request for permission for the sponsor and/or FDCH to release contact information. The sponsor required active consent from the parents. The letter would be given to the parents in duplicate by the FDCH provider and then be signed by the parent(s) giving permission for their contact information to be provided to MPR. The FDCH would be responsible for distributing the letters and collecting signed parental consent forms.

  4. A criminal background check for each MPR staff member who would visit an FDCH.

  5. Reimbursement to the sponsor for hourly costs at the overtime rate and travel time for sponsoring agency workers who would accompany MPR observers.

  6. A “hold harmless” document from MPR for the sponsoring agency including a certificate of insurance and liability.

MPR contacted FNS and reviewed these conditions. After thorough consideration, FNS determined that MPR should respond to the requests for criminal background checks and the certificate of insurance and liability. MPR provided evidence of criminal background checks and insurance liability as requested by the sponsor. The sponsor then agreed to cooperate fully.



2. Selecting FDCHs

MPR used only two selection criteria to identify FDCHs for the pretest. First, each selected FDCH needed a sufficient number of parents in order to conduct the recall survey with up to three willing parents per FDCH for Method 1 and Method 3. Second, a sufficient number of FDCHs needed to use the SISO logs required to evaluate Methods 2 and 3.


MPR planned to combine the in-home observation with one of the three sponsor-required monitoring visits to FDCHs, which would effectively restrict the sample of eligible providers to those who had not yet been visited three times in the past year. The sponsors did not seem to be persuaded that this was an option for them and therefore did not limit MPR’s selection of FDCHs to those needing a sponsor-required monitoring visit.


To further assist in the selection of FDCHs, MPR asked sponsors which homes under their supervision used any kind of daily SISO log. Although the state contacts had indicated that the selected sponsors used SISO logs in their FDCHs, MPR later determined that such logs were not required by either sponsor. However, both sponsors identified FDCHs they believed were using SISO logs. The FDCHs were then assigned to one or more of the pretest methods.


To test each of the methods, MPR needed enrollment forms and meal claims for a three-month period. For two of the methods, SISO logs were also required. Before beginning the data collection, MPR asked the sponsors to provide copies of enrollment forms for each FDCH selected for the pretest. One sponsor provided the forms within two weeks of the initial request. The second sponsor provided the enrollment forms for some of the FDCHs within three weeks of the initial request. MPR placed reminder calls to the provider to obtain the missing enrollment forms. Obtaining enrollment forms from the sponsor was an efficient method of collecting names of children, ages, scheduled attendance and meals, and contact information for parents and guardians. One sponsor required their monitor to accompany MPR staff during all on-site visits. The second sponsor wanted to notify the FDCHs and obtain permission for MPR staff to visit the homes without a monitor from the sponsoring agency.



B. Experiences Implementing Each Method

Once FDCHs had been selected, they were assigned to one of the seven approaches being tested. MPR set a data collection schedule for each FDCH participating in the pretest, including proposed days of the week and times for observation. When one or more methods were combined, data collection involved a series of activities that had to be carried out in a specific sequence. If an activity could not be accomplished within the scheduled time frame, subsequent activities associated with that method or combination of methods could not occur as scheduled. Postponed observations seriously impacted the schedule. The MPR contact person at one sponsor had extended vacation periods and did not make arrangements for MPR to gain access to the FDCHs before leaving for vacation. Her assistant did not have the authority to make those arrangements. The other sponsor had assigned a part-time monitor to accompany MPR staff and that constrained the scheduling process. The monitor’s limited and inflexible hours resulted in many changes to the planned schedule. Table II.2 summarizes the administration of various methods that were used during the pretest. For each method, the table indicates the number of FDCHs for which SISO logs were collected and observations conducted, and the number of parents interviewed. For the combined methods, both types of data collection took place within the same FDCH. Each type of method being tested was limited to fewer than 10 respondents or observations.


TABLE II.2

PRETEST ADMINISTRATION OF FOUR METHODS

Pretest Method     Description                                            Number of FDCHs    Number of FDCHs    Parent Recalls
                                                                          Using SISO Logs    Being Observed     (Telephone Interviews)
Method 1           Parent recalls                                         0                  0                  7
Method 2           SISO logs                                              0                  0                  0
Method 3           Parent recalls and SISO logs                           0                  0                  0
Method 4           On-site observations                                   0                  2                  0
Methods 1 and 4    Parent recalls and on-site observations                0                  2                  2
Methods 2 and 4    SISO logs and on-site observations                     5                  5                  0
Methods 3 and 4    Parent recalls, SISO logs, and on-site observations    0                  0                  0
Total                                                                     5                  9                  9


With so few FDCHs using SISO logs, the distribution of methods across FDCHs had to be modified during the pretest data collection. Method 3 was not tested in any FDCHs, and Method 2 was tested only in combination with Method 4. One fewer FDCH was included in the pretest than originally planned, because of lack of SISO logs. Table II.3 shows the differences between the design used in the pretest and that in the original plan.


TABLE II.3

DISTRIBUTION OF METHODS IN ACTUAL PRETEST AND ORIGINAL PLAN

Method             Description                                            Number Used for Testing           Original Number Planned for
                                                                          Each Method in Pretest            Testing Each Method
Method 1           Parent recalls                                         3 FDCHs with 7 parent recalls     1 FDCH with 2 parent recalls
Method 2           SISO logs                                              0                                 2 FDCHs
Method 3           Parent recalls and SISO logs                           0                                 2 FDCHs with 3 parent recalls
Method 4           On-site observations                                   2 FDCHs                           3 FDCHs
Methods 1 and 4    Parent recalls and on-site observations                2 FDCHs with 2 parent recalls     1 FDCH with 2 parent recalls
Methods 2 and 4    SISO logs and on-site observations                     5 FDCHs                           3 FDCHs
Methods 3 and 4    Parent recalls, SISO logs, and on-site observations    0                                 1 FDCH with 2 parent recalls
Total                                                                     12 FDCHs with 9 parent recalls    13 FDCHs with 9 parent recalls


In the following sections we review each method, noting what worked well and what did not work as planned.



1. Parent Recalls

Obtaining enrollment forms from sponsors prior to the start of pretest data collection worked well and ensured that parent contact information was available. There were no obstacles to obtaining the enrollment forms. Since the recall period was the prior week, MPR staff had limited days (Sunday through Wednesday) in which to contact parents and complete telephone interviews. MPR staff called during daytime and evening hours and received 100 percent cooperation from the parents who were contacted. MPR attempted to contact 18 parents in the process of completing the 9 interviews. On average, fewer than two attempts were made before reaching the parents. Very few parents asked the interviewer to call back at another time to complete the interview. There were no refusals and no break offs during the interviews.


The interviews were planned to last no more than 15 minutes, and ranged in time from 6 to 15 minutes (see Appendix A for the parent survey instrument). With an average time of 8.9 minutes, the interview length met our expectations. As described below in Section III, parents were able to answer the questions and there were no major problems with item nonresponse. The debriefing questions at the end of the parent interview revealed that all respondents found the questions easy to answer; they expressed confidence in their responses and indicated their willingness to respond to two to five additional minutes of questions. This suggests that a few more questions could be added to the parent interview without jeopardizing cooperation.


MPR intended the parent questionnaire to be administered without the knowledge of the FDCH provider, to avoid any possible impact of the provider on parents’ cooperation or responses. The sponsors were made aware of the potential for parent interviews, but MPR did not identify the FDCHs selected for pretesting Method 1. One parent who completed an interview became suspicious of MPR’s intent and notified the provider who, in turn, contacted the sponsor. The provider was concerned that MPR was conducting an audit and wanted the option of notifying parents prior to interviews. This experience raises the issue of what types of information should be provided to FDCHs affiliated with sponsors selected for the pilot study.



2. Sign-In/Sign-Out Logs

Although sponsors indicated many FDCHs used SISO logs, a limited number of homes actually completed them. For the pretest, MPR had expected eight FDCHs to use SISO logs and found that only five of the eight actually used them. At one sponsor, some FDCHs used SISO logs in addition to the meal claims forms signed by parents. At the other sponsor, a county-level community action program (CAP) asked FDCHs receiving a subsidy from the CAP to use some type of SISO log. The CACFP sponsor was not involved in the development, training, or monitoring of those SISO logs. At least one provider completed the SISO log as a tool to record daily attendance. Another provider used the SISO log to note when a child was absent.


MPR determined that SISO logs could only be obtained during the on-site visit because FDCHs were not required to submit them to their sponsors. The fact that so few FDCHs were using SISO logs meant that it was not possible for MPR to test Method 3, the combination of parent recalls and the SISO log. Because MPR could only obtain copies of the SISO logs when making on-site visits, MPR could only test Method 2 in combination with an on-site observation. There were several problems in obtaining SISO logs for the target week, even from FDCHs that were using the logs. In some cases, the logs were single sheets of paper and in others they were maintained in a bound book. In both cases, it was necessary to photocopy the pages. Illegibility was frequently a problem, and it was often unclear who signed the child in and out. Many parent signatures were missing. The greatest concerns were lack of uniformity of information on the logs and lack of uniformity in how FDCHs enforced their use. Observations suggest that unless the provider reminds the parents to sign in, parents may leave the FDCH without doing so.



3. On-Site Observations

Originally, MPR planned to conduct unannounced observation visits accompanied by the sponsor. The visits were planned to occur at meal times, based on information provided by the sponsor about the approximate times of meals. MPR staff would have liked to observe two eating occasions at each FDCH. MPR had suggested that sponsors use the visit as one of the required sponsor monitoring visits, provided the activities associated with the sponsor’s monitoring did not contaminate the pretest data collection effort. Contrary to the study’s plans, both sponsors notified the FDCHs in advance of the observation visits. Providers from both sponsors appeared to make extraordinary preparations for the MPR visit. These preparations included having additional staff present, equipping the children’s bathroom with special towels, decorating the space with special flowers, and providing special plates for the children’s meals.


Although one sponsor did not require the MPR staff to be accompanied, the limited availability of the monitor from the second sponsor often interfered with MPR’s ability to observe meal service and did not allow for observation of more than one eating occasion per FDCH. In most cases, the meal times that the sponsor indicated were only approximate and the FDCH provider had great flexibility in meal service times. This was another confounding factor in MPR’s ability to observe more than one eating occasion. When MPR was able to arrange visits to two FDCHs on the same day, the ability to observe more than one eating occasion at each FDCH was also limited. Unless the two FDCHs were located within a few minutes’ drive of one another, the observers were unable to time their visits to coordinate with service times of the eating occasions.


With one exception, providers were comfortable with the presence of the MPR observers. The exception involved a non-English-speaking provider who did not fully understand what the observer was doing; the monitor was unable to communicate the purpose of the visit.


The visits were mostly observational (Appendix B); however, it was also useful for MPR observers to ask providers a few questions to better understand the context for providing meals (Appendix C). MPR planned for the interview portion of the visit to be limited to 15 minutes. The interviews ranged from 5 to 11 minutes, and averaged 8.2 minutes in length, well within the planned 15 minutes. Many providers had limited time, and this appeared to be an ideal length for the interview.


The provider interview could occur at any time during the visit. We asked providers to estimate how often (on average) they adhered to preplanned meals (as opposed to deciding closer to the day or time of the meal), and how often they typically filled out required forms (after each meal, at the end of the day, at the end of each week, or at the end of each month). At the FDCHs being observed and supplying SISO logs, observers asked questions about compliance with the form (How did they present the request to parents? Did parents remember to fill it in? How often did the provider fill in missing information?). When the monitor from the second sponsor insisted on being present during the provider interview, her presence may have influenced the provider’s responses.


The management time associated with scheduling the observation visits exceeded MPR’s expectations. Considerable time was spent arranging and rearranging schedules and observer availability to accommodate changing needs of providers and the monitor. MPR needed to train four observers to accommodate the level of flexibility that the sponsors required.


In addition to the two sponsors discussed above, MPR contacted sponsors in Colorado, Massachusetts, Kansas, and Pennsylvania to obtain hard copy administrative forms, to determine the variation in the formats and the range of information available from these sources. Seven sponsors provided copies of enrollment forms, SISO logs (when their use by the sponsor was not limited to corrective action), and meal claims forms. One sponsor used an online database to collect this information. Two other sponsors that provided hard copy forms also used an online system for some of their FDCHs.


The core data elements—child’s name, date of birth, parent’s name, and telephone number—were consistently included on all sample enrollment forms. The data elements on the SISO logs were inconsistent across sponsors. One of the sponsors, located on a military base, provided the following comments about SISO logs:


“Sign In sheets are my pet peeve. EVERY provider everywhere should have to do this! If this were a USDA requirement it might eliminate a thing or two for the sponsorships. And add a level of professionalism to providers’ recordkeeping. Since they use time/space percentage for their taxes, they have to be able to proof (sp) who was there daily, and when. Or, if they don’t record (the) cost of food based on receipts and claim annual cost of food based on actual meals served based on attendance, the sign in sheets again document all of that.”


There were some variations across sponsors in the presentation of data elements on the meal claims forms. The provider’s name, child’s name, and meal or snack type were found on all forms. However, total meals claimed were aggregated over different periods (daily, weekly, semimonthly, or monthly), and totals for each child by meal type were not always provided. Tables II.4-II.6 summarize the data elements included on each type of form.


TABLE II.4
DATA ELEMENTS ON CACFP ENROLLMENT FORMS

Nine sponsors provided enrollment forms: Colorado Sponsors 1 and 2 (CO-1, CO-2), Kansas Sponsors 1 and 2 (KS-1, KS-2), Massachusetts Sponsor 1 (MA-1), New Jersey Sponsor 1 (NJ-1), New York Sponsor 1 (NY-1), and Pennsylvania Sponsors 1 and 2 (PA-1, PA-2). Each entry lists the sponsors whose enrollment form includes the data element.

Child Information
  Child's first name: all nine sponsors
  Child's last name: all nine sponsors
  Date of birth: all nine sponsors
  Age: MA-1
  Gender: MA-1
  Child's relationship to provider: KS-1
  Racial/ethnic identity of child: CO-1, CO-2, KS-1, KS-2, NY-1

Parent Information
  Mother's name: CO-1, CO-2, KS-1, KS-2, MA-1, NY-1, PA-1, PA-2
  Address: all nine sponsors
  Home phone - mother: all nine sponsors
  Work phone - mother: all nine sponsors
  Cell phone - mother: CO-1, CO-2, MA-1, NY-1
  Email address: CO-1, CO-2
  Father's name: MA-1
  Address: MA-1
  Home phone - father: MA-1
  Work phone - father: MA-1
  Cell phone - father: MA-1

Schedule and Meal Information
  Usual days in care: CO-1, CO-2, KS-1, KS-2, MA-1, NJ-1, PA-1, PA-2
  Usual hours in care by day: CO-1, CO-2, MA-1, NY-1, PA-1, PA-2
  Usual hours daily: KS-1, KS-2, NJ-1
  Usual meals to be received by day: CO-1, CO-2, MA-1
  Usual meals to be received daily: KS-1, KS-2, NJ-1, NY-1, PA-1, PA-2
  Schedule varies checkbox: KS-2
  Provision of meals for infant by parent: CO-1, CO-2, MA-1
  Provision of meals for infant by provider: CO-1, CO-2, KS-2, MA-1
  Provider offers formula (brand/breastmilk): KS-1, KS-2
  Age at which infant ready for solid foods: KS-2
  Accept/decline solid food for infant: KS-2

Other information
  Name of provider: CO-1, CO-2, KS-2, MA-1, NJ-1, NY-1, PA-1, PA-2
  Provider's address: NJ-1, PA-1, PA-2
  Provider's signature: PA-1, PA-2
  Parent's signature: all nine sponsors
  Relationship to child: MA-1
  Date of signature: KS-2, MA-1, NJ-1
  Name of other FDCH provider: MA-1
  Provider's license number: CO-1, CO-2
  County of provider: CO-1, CO-2
  Sponsor's name: CO-1, CO-2, PA-1, PA-2
  Other siblings in care?: CO-1, CO-2
  School age/kindergarten?: CO-1, CO-2
  Grade in school: CO-1, CO-2
  Hours in preschool/school: CO-1, CO-2, KS-2
  Days in preschool/school: CO-1, CO-2, KS-2
  Year-round school checkbox: none
  Name of school/number: CO-1, CO-2
  School district: CO-1, CO-2
  New child/updated information check: KS-1, KS-2
  Allows for multiple children in household: KS-1, NY-1
  First day on menu/Initial enrollment date: KS-1, KS-2, NY-1, PA-1, PA-2
  Preferred time and method for parental contact: PA-1, PA-2



TABLE II.5
DATA ELEMENTS ON SIGN-IN/SIGN-OUT LOGS

Sponsor codes are the same as in Table II.4. The Kansas, Massachusetts, and Pennsylvania sponsors (KS-1, KS-2, MA-1, PA-1, PA-2) had no SISO log. For the New Jersey and New York sponsors (NJ-1, NY-1), information, when available, varied from one FDCH to another. The Colorado sponsors' SISO logs included the following data elements:

  Child's name: CO-1, CO-2
  Separate page/section for each child: CO-2
  Provider's name: CO-1, CO-2
  Month and year: CO-2
  Time in a.m./p.m.: CO-1, CO-2
  Time out a.m./p.m.: CO-1, CO-2
  Parent's initials daily: CO-2
  Parent's signature daily: CO-1
  Parent's signature monthly: CO-2



TABLE II.6
DATA ELEMENTS ON CACFP MEAL CLAIMS FORMS

Sponsor codes are the same as in Table II.4. Each entry lists the sponsors whose meal claims form includes the data element.

  Name of form:
    CO-1: Daily Bimonthly Meal Count Record
    CO-2: Monthly Meal Count Record
    KS-1: Child and Adult Care Food Program Menu/Attendance Form
    KS-2: Menu/Attendance
    MA-1: Provider Meal Attendance
    NJ-1: 2007 Child and Adult Care Food Program Family Day Care Attendance and Meal Count Record
    NY-1: Not Specified
    PA-1: Meal Participation and Attendance
    PA-2: Not Specified
  Provider's name: all nine sponsors
  License number: CO-1, CO-2, NJ-1
  Month and year covered by form: CO-1, CO-2, KS-1, MA-1, NJ-1, PA-1, PA-2
  Month and days covered by form: NY-1
  Number of days covered by form: CO-1: 15; CO-2: 31; KS-1: 5; KS-2: 7; MA-1: up to 25 days; NJ-1: 31; NY-1: 5; PA-1: 5; PA-2: 31
  Child's name: all nine sponsors
  Child's age: CO-1, CO-2, KS-1, KS-2, MA-1, NJ-1, NY-1, PA-2
  Provider's own child?: CO-2
  Part-time or full-time participation: none
  Specified meals - B/L/D: CO-1, CO-2, KS-1, KS-2, MA-1, NJ-1, NY-1, PA-1
  Specified snacks - am/pm/evening: CO-1, CO-2, KS-1, KS-2, MA-1, NJ-1, NY-1, PA-1
  Child's meal totals by meal type: CO-2, MA-1, NJ-1, NY-1, PA-2
  Child's snack totals by snack type: CO-2, MA-1, NJ-1, NY-1, PA-2
  Monthly aggregate meal totals by meal type: CO-1, CO-2, MA-1, NJ-1
  Monthly aggregate snack totals by snack type: CO-2, MA-1, NJ-1
  Semimonthly aggregate meal totals by meal type: PA-2
  Semimonthly aggregate snack totals by snack type: PA-2
  Weekly aggregate meal totals by meal type: KS-1, KS-2, NY-1, PA-1
  Weekly aggregate snack totals by snack type: KS-1, NY-1, PA-1
  Daily aggregate meal totals by meal type: KS-2, PA-1
  Daily aggregate snack totals by snack type: KS-2, PA-1
  Provider's signature: CO-1, CO-2, KS-2, NJ-1, NY-1, PA-1
  Date of signature: CO-1, CO-2, KS-2, NJ-1, NY-1
  Changes to enrollment: MA-1
  Daily menu: KS-1, KS-2, NY-1
  License capacity: KS-2, NY-1
  Tier 1 or Tier 2: NJ-1, PA-1, PA-2
  Child's time of arrival and departure: NJ-1, PA-2
  Total number attending during period: none
  Parent's initials: NY-1


III. ANALYSIS AND RECOMMENDATIONS


The full pilot will assess the feasibility, validity, and cost of different methods of estimating the number of meals served by the CACFP. Before proceeding with the pilot, however, we conducted a pretest to identify issues in the data collection methods and determine which methods have the greatest chances of success. The pretest analysis is intended to give a more qualitative than quantitative review of each method. The pretest includes no more than nine data points for each type of data collection, which provides a limited base from which to assess the potential strengths and weaknesses of a method. In addition, sponsors and FDCHs for the pretest were carefully selected on the basis of location, cooperation, and their ability to work on the pretest schedule, so the pretest participants are not statistically representative of all sponsors or FDCH providers. In particular, it is possible that the pretest participants are more organized and cooperative than the average sponsor or FDCH.



A. Perceived Strengths and Weaknesses of Each Method

Data Quality

Although the sample sizes are insufficient to conduct formal statistical tests, the pretest facilitates qualitative analyses to assess the quality of data collected through each tested method. In this section, we first address the extent to which parents, providers, or sponsors could consistently provide data for each method; we then provide summary statistics and tabulations for data collected. Finally, we make preliminary comments on the validity of each method, where possible, by comparing data collected using each method to data from direct observations.



1. Summary of Missing Data for Each Method

The amount of missing data found in the pretest is one indicator of the level of difficulty to expect in collecting similar data in the pilot and an eventual larger evaluation. While small amounts of missing data are not a great concern, analyses based on the remaining data may be unreliable if a high percentage of data is missing. The cases with missing data may be systematically different from those without missing data. There are two different types of missing data: (1) unit nonresponse, in which all data is missing for a case (such as a parent interview that was not conducted, a SISO log that could not be collected, or a meal observation that did not take place); and (2) item nonresponse, in which some data was collected for the case, but specific items were missing (such as a parent unable to answer certain interview questions, or an incomplete SISO log).


Unit Nonresponse. The pretest relied on purposive samples for each type of data collection, and the goal was to complete a certain number of cases, rather than to reach a target response rate, so response rates were not tracked. For each method, there were some cases in which we attempted to collect data but did not succeed. The obstacles encountered in the parent interviews and observations can be overcome by persistence, but those met in collecting the logs may be insurmountable. For the parent interviews, there were cases in which we attempted to survey parents and did not reach them or find a convenient time to schedule an interview, but no parent refused to participate. We originally had planned to observe two meals at each selected FDCH, but completed only one observation for most homes due to scheduling difficulties.


Missing SISO logs were a more serious issue. Because not all states or sponsors require FDCHs to maintain such logs, we knew from the beginning that they would not be available for every home. However, even in cases where sponsors reported that FDCHs kept SISO logs, we sometimes found that they did not, or that the document they were calling a log did not meet minimal requirements for our data collection. As a result of missing logs, we could not collect the data needed for Method 3 for either of the FDCHs in which we intended to test that method.


Item Nonresponse. In addition to cases for which we were unable to collect data at all, some cases were missing certain items. Item nonresponse was minimal for most methods but more problematic in the SISO log data.


  • Parent Interviews: Although all parents who completed interviews were able to answer questions about their child’s daily attendance and receipt of most meals during the target week, two of the nine parents were unable to recall some specific information: one did not know whether the child received breakfast at the FDCH on any day, and the other was unable to recall whether the child received an afternoon snack on one specific day during the target week. These missing data comprise only 2 percent of the 225 meal receipt data points collected in the pretest. There was no missing data on attendance during the target week.

  • SISO Logs: Even when FDCHs kept SISO logs, these logs were often missing some details. Although the SISO log forms we collected were all designed to provide information on the specific times each child arrived and departed the FDCH, this information was not always recorded for all attendees.1 In addition, because the logs were hand-written, there may have been some cases in which the information was filled in but was not legible enough to be used. These issues resulted in missing data on the precise hours children attended for 42 of the 312 daily log entries we analyzed, or 13 percent of pretest data points. Missing information on the hours children attended the FDCHs prevents accurate assessment of which meals they received there, as discussed below.

In addition, on some FDCHs’ SISO logs, there is no way to determine from the logs themselves whether data on individual children’s attendance on a specific day are missing. In other words, if a child attended the FDCH but was not entered onto the log for that day, this would be missing data, but it would be indistinguishable from the situation in which the child did not attend that day. The only way the pretest could assess this is by comparing the log data to the observational data at an aggregate level (discussed later), and that comparison suggests that there may be some missing data on whether children attended at all.

  • Observations Data: The only missing variable in the observations data is the time the observer left the FDCH. This variable is missing in 3 of the 11 observations, or 27 percent of data points. However, this piece of information is not critical to the assessment, because the variable for the type of meal observed is never missing. In addition, because the observation forms are completed by MPR staff, the missing data could be decreased through training.

  • Administrative Claims Data: There were no individual items missing in the administrative claims data.

2. Descriptive Statistics for Each Method

In this section, we provide summary statistics and tabulations for the data collected for each method in the pretest, to facilitate informal comparisons and provide a preview of the type and quality of data attainable through each method.


Method 1. The parent survey provides information on attendance and the number of meals received, by type (breakfast, morning snack, lunch, afternoon snack, other meal), for each day during a target week. Table III.1 presents this information for the 9 children—at 5 different FDCHs—that were included in the pretest. Sampled children attended their FDCH 3.7 days, on average, during the target week.


For each meal type, the average number of times a child received that meal during the target week is lower than the average number of days the child attended the FDCH. The difference is smallest for afternoon snacks (reportedly received 3.6 times during the week, on average) and largest for breakfast (received just 1.8 times). The differences between attendance and meal receipt may be due in part to FDCHs serving only certain types of meals—omitting breakfast, for example—and in part to children attending the home for only part of the day, and thus not being present at all meal times. These differences suggest caution when considering using attendance as a proxy for meal receipt, because this may result in an overestimate, which is more pronounced for less-commonly received meals, such as breakfast. To convert attendance data into reliable estimates of the number of meals received would require accurate information that was not collected in the Method 1 pretest on (1) which meals the child’s FDCH serves each day, and at what times, and (2) what time the child arrives at and departs from the FDCH.



Table III.1

Parent Interview Data

                                                 Average Across Interviews

Number of Meals Received During Target Week
  Breakfasts                                     1.8
  Morning snacks                                 3.3
  Lunches                                        3.3
  Afternoon snacks                               3.6
  Other meals (a)                                2.0
  All Meals                                      14.0

Number of Days Attended During Target Week       3.7

(a) Other meals include supper, early morning snacks, and evening snacks. Some parents reported that their children received more than one “other meal.”


For each FDCH in the pilot, we will randomly sample approximately five children from among all those enrolled and conduct interviews with their parents. The parent survey data will then be weighted to generate estimates of the number of each type of meal each FDCH served during the target week.2 These estimates will be compared to the number of meals, by type, claimed for reimbursement by the FDCH to test the validity of Method 1. However, because the pretest surveyed only one or two purposively selected parents at each FDCH, sampling error would render such weighted averages unreliable. For example, one sampled child in the pretest was absent from the FDCH every day during the target week, which would lower the average for that home dramatically because only two parents from the FDCH were interviewed. Sampling error may be an issue in the pilot as well, but will not be as severe due to the larger sample sizes for the pilot. In the analysis of pilot study data, we will account for this form of sampling error in comparing our estimate of the true number of meals served to the number of meals claimed.
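As a minimal sketch of the weighting idea described above, the Python fragment below scales the average number of meals reported by the sampled parents up to an FDCH-level total, assuming children are sampled with equal probability from the enrollment list. The function name and figures are illustrative assumptions only; the pilot's actual weights may be constructed differently.

    # Minimal sketch of weighting sampled parents' reports up to an FDCH-level
    # estimate, assuming children are sampled with equal probability from the
    # enrollment list. All names and numbers are illustrative.

    def fdch_meal_estimate(reported_meals, n_enrolled):
        """Estimate total meals of one type an FDCH served in the target week.

        reported_meals -- meals of this type each sampled child reportedly received
        n_enrolled     -- number of children enrolled at the FDCH
        """
        sample_mean = sum(reported_meals) / len(reported_meals)
        # Each sampled child stands in for n_enrolled / len(reported_meals) children.
        return sample_mean * n_enrolled

    # Example: five sampled children reported 5, 4, 0, 5, and 3 lunches during
    # the target week at an FDCH with 8 enrolled children.
    print(fdch_meal_estimate([5, 4, 0, 5, 3], 8))  # 27.2 estimated lunches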


Method 2. The SISO logs kept by some FDCH’s provided a list of every child who attended the home on a given day, and, in many cases, the times that the child arrived at and departed from the FDCH. Table III.2 shows daily attendance based on the log data; one column presents simple averages across log days, while another column provides a weighted daily average in which each of the five FDCHs that provided logs for the pretest is weighted equally. About six children, on average, attended each of the FDCHs in the sample each day. Among those with valid sign-in and sign-out times, nearly three-fourths of these children attended all day, while the remainder attended only part of the day. For those children, data from the logs on the specific times of arrival at and departure from the FDCH can be used—along with information that was not systematically collected in the pretest on which meals the FDCH serves, and at what times—to determine which meals the child received that day. However, in some cases, the logs did not contain adequate information to determine whether a child attended all or part of a day. The lack of this information makes it impossible to determine accurately which meals the child received that day.



Table III.2

Sign-in/Sign-out Log Data

                                 Average Across        Average Across
                                 Log Days (a)          FDCHs (b)

Number of Children Attending
  All day                        3.7                   3.9
  Part of the day                1.3                   1.5
  Unknown duration               0.8                   0.8
  Total                          5.8                   6.2


(a) In the averages in this column, each of the 59 logged days is given equal weight.


(b) In the averages in this column, each of the five FDCHs for which SISO logs were collected is given equal weight, regardless of the number of days included in the logs.
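The sketch below illustrates, under stated assumptions, the conversion discussed for Method 2 above: combining a child's sign-in and sign-out times with the FDCH's meal schedule (information the pretest did not systematically collect) to infer which meals the child was present for. The times, the meal schedule, and the function name are hypothetical, and presence at a mealtime is only a proxy for receiving the meal.

    # Sketch of inferring meal receipt from a SISO log entry and the FDCH's
    # meal schedule. The schedule and times are hypothetical, and presence at
    # a mealtime is treated only as a proxy for receiving that meal.
    from datetime import time

    def meals_present_for(arrival, departure, meal_times):
        """Return the meal types whose scheduled time falls within the child's stay."""
        return [meal for meal, t in meal_times.items() if arrival <= t <= departure]

    example_schedule = {"breakfast": time(8, 0),
                        "lunch": time(12, 0),
                        "afternoon snack": time(15, 0)}

    # A child signed in at 9:30 a.m. and out at 1:00 p.m. is counted as present
    # for lunch only.
    print(meals_present_for(time(9, 30), time(13, 0), example_schedule))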


Method 3. We cannot present data for Method 3 because no FDCH provided both parent interview data and SISO logs for the pretest. As discussed above, we discovered that logs were not kept by the FDCHs in which we had planned to test Method 3. If Method 3 is included in the pilot test, we will need to find ways of more accurately determining which FDCHs use the logs. Then, we will combine parent interview and SISO log data for those FDCHs that provide both types of data. There are several different ways to combine the two types of data: we could rely on the survey data for estimates of meals received for children whose parents complete a parent interview and only use the log data in cases where parent interviews are missing; we could do the opposite (relying primarily on the logs and using the parent data only when the log data is missing); or we could develop criteria to determine which source to use on a case-by-case basis. Experiences in the pretest suggest that relying primarily on the parent interview data may be the best choice. However, if we proceed with Method 3 in the pilot, we will gauge the sensitivity of the results to different ways of combining the data from the two sources.
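As a minimal sketch of the first combination rule mentioned above, the fragment below prefers the parent interview count for a child and falls back to the SISO-log-based count when no interview is available. The inputs are hypothetical weekly meal counts; the actual rule, if Method 3 is piloted, would be chosen after testing the sensitivity of results to different combinations.

    # Sketch of the first combination rule described above: prefer the parent
    # interview count and fall back to the SISO-log-based count when the
    # interview is missing. Inputs are hypothetical weekly meal counts; None
    # marks a missing source.

    def combined_meal_count(parent_count, log_count):
        """Pick a child's weekly meal count, preferring the parent interview."""
        return parent_count if parent_count is not None else log_count

    children = [
        {"parent": 14, "log": 12},    # both sources: the interview is used
        {"parent": None, "log": 10},  # no interview: fall back to the log
    ]
    print([combined_meal_count(c["parent"], c["log"]) for c in children])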


In combining the two types of data, we will encounter many of the same issues as using the SISO log data alone (Method 2). Because the SISO logs do not indicate which meals children received, attendance during the day is used as a proxy for meal receipt, as discussed above. However, converting attendance data into reliable estimates of the number of meals received requires accurate information on what time the child arrives at and departs from the FDCH—which is sometimes missing from the log data—and on which meals the child’s FDCH serves, and at what times each day—which will need to be collected.


Method 4. The observations data provides the number of meals MPR field staff observed being served to children at FDCHs and is the basis for validating the other methods, as described in the next section. In addition, the observations data from the pilot will be used to develop a model to predict the number of meals served by FDCHs.


Table III.3 presents the daily number of attendees—at two points in time—and meals served for 11 meal observations at nine FDCHs. The number of children in attendance when the observer arrived at the FDCH was 6.1 on average, while the number in attendance at the observer’s departure averaged 6.6 children. The number of meals served fell between these two, at an average of 6.3 meals. The changing number of children during the time the observer was in the FDCH demonstrates the importance of having accurate data on both mealtimes and the specific times children are in the home when using attendance as a proxy for meal receipt. In addition to changes in the number of children present during the day, it is possible for a child to be present but not receive a meal; during one meal observation, for example, two children were sleeping and thus did not receive the meal served to the others.


While the pretest data does not provide an adequate number of observations to create a model, we will use data from the pilot, along with administrative data, to develop a statistical model to predict the number of meals served based on a sample of FDCHs observed across their scheduled breakfast, lunch, supper, and snack times. This model would include characteristics of the sponsor (such as location, the number of providers under the sponsor’s supervision, and the tier status of these providers), the FDCH (including location, characteristics of the operator, the number of enrolled children, and average characteristics of these children), and the child (including age, race, gender, and family income level).
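One possible specification consistent with the description above is sketched below: a Poisson regression of observed meal counts on sponsor, FDCH, and child characteristics, fit with the statsmodels package. The variable names, the Poisson functional form, and the use of statsmodels are illustrative assumptions, not the study's final model, which will be developed from the pilot data.

    # Sketch of one possible model: a Poisson regression of observed daily meal
    # counts on sponsor, FDCH, and child characteristics. Column names below
    # are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_meal_model(observations: pd.DataFrame):
        """Fit a count model of meals served per observed FDCH-day.

        `observations` is assumed to have one row per observed FDCH-day with
        hypothetical columns: meals_served, sponsor_size, tier, n_enrolled,
        mean_child_age, and urban.
        """
        formula = ("meals_served ~ sponsor_size + C(tier) + n_enrolled"
                   " + mean_child_age + C(urban)")
        return smf.poisson(formula, data=observations).fit()

    # With pilot data loaded into a DataFrame, the fitted model's .predict()
    # would give the expected number of meals served for any FDCH profile.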



Table III.3

Observational Data

                                 Average Across Observations

Number of Meals Served
  Breakfasts                     0.6
  Lunches                        2.2
  Snacks                         3.5
  All Meals                      6.3

Number of Children Attending
  At observer’s arrival          6.1
  At observer’s departure        6.6


3. Performance of Data for Each Method to Validate Meal Claims

Calculating erroneous payments in the CACFP requires comparing the meal reimbursement claims submitted by FDCHs with the “true” count of meals and snacks that providers should have claimed. The data from direct observations of meal times at the FDCHs provide the most reliable and accurate assessment of the number of meals served to children and thus are the basis for validating the other methods. In this section, we comment on using the data collected to explore the validity of estimates from each method, assuming that data collected from the in-person observations is the “true” meal count. Because the sample sizes in this pretest are limited, we cannot formally evaluate the validity of each method in the pretest. In the pilot test, we will conduct such an evaluation for at least some of the methods.


  • Method 1: Validating Method 1 involves comparing observational data on the number and types of meals served to children to parent survey data on their children’s attendance at the time of the observation. The pretest data does not permit direct comparison of parent survey data to observations data. Although we have both data sources for two FDCHs, the observations data does not provide detail on individual children, so we cannot determine whether the observation of specific children is consistent with data from the interview of their parents.3 Instead, we would weight the parent survey data on sampled children to represent all children who attend the FDCH and compare that FDCH-level measure to the observations data.

However, as noted in the previous section, this is not possible in the pretest data due to the small sample sizes—we have both parent interview data and meal observation data for only two children at two FDCHs. Weighting those observations to create FDCH-level estimates would involve considerable sampling error. For example, one child was absent from the FDCH on the day of the observation; because that child was the only one sampled from that FDCH, weighting that observation would result in an estimate of no meals served at that home on that day. Thus, the pretest sample size is too low to draw any conclusions about the validity of Method 1 in estimating the number of meals served.

  • Method 2: To validate Method 2, we collected attendance recorded on SISO logs and used each child’s attendance as an indication that a reimbursable meal was received. We compared these estimates of the number of meals served to the number of meals served during an observation, for the six observations in four different FDCHs for which we have both types of data.4

The SISO logs were not consistent with the observation data for any of the meals that were observed. Most of the logs differed from the observed count by only one child, but in one case the log data indicated two fewer children than the observation data (three and five, respectively). This suggests that the validity of Method 2 in estimating the number of meals served is low.5

In one of the six observations for which we also had SISO log data, the log did not provide the specific hours children attended on the target day, so we could not positively determine whether the children were present during the observed meal. In this situation, we assumed the children attended the FDCH for the full day of the observation and did receive the meal. In the pretest data, the effect of this assumption could result in an underestimate of the difference between log and observation data, because the difference at the FDCH where the log did not provide information on times would have increased if some children were not present for the observed meal.

  • Method 3: To validate Method 3, we would compare the combined parent interview and SISO log data to the observational data on the number and types of meals served. We cannot do this in the pretest data, however, because, as noted in the previous section, there are no FDCHs for which we have both parent interview and SISO log data.

  • Method 4: The validity of Method 4 could not be addressed in this pretest memo due to the insufficient data points for developing a statistical model. This will be addressed in the pilot.

Feasibility of Implementing the Method on a Larger Scale

Method 1 would be the most feasible to implement on a larger scale—any number of children could be selected randomly from FDCH enrollment forms and their parents contacted to complete interviews. Method 4 also could be implemented on a larger scale, with additional observation visits made to more FDCHs. For these two methods, expanding the data collection and using a random sample may require more effort than for the convenience sample used in the pretest, but the increased logistical complications are manageable. Methods 2 and 3 would be most difficult to implement on a large scale due to the limited number of FDCHs that use SISO logs and the challenge of accurately determining which homes actually use the logs.


Relative Cost of Implementing Each Method

Method 2 would be the least expensive to implement if sponsors were willing to assist in collecting the logs from the FDCHs.6 Method 1 would be somewhat more expensive to implement because it requires a separate interview for the parent of each sampled child. Method 3 would be more expensive than both Methods 1 and 2 because it combines the two types of data collection. Method 4, which requires field staff to visit FDCHs when meals are being served, would be the most expensive to implement. The logistics of planning such visits are complicated by two factors: (1) the actual time a meal is served on a particular day often deviates from the scheduled time, and (2) sponsors are likely to require a monitor to accompany the field staff to the FDCHs, adding another person whose schedule must be accommodated.


B. Recommendations for the Pilot Test

Based on the pretest experience, we can assess which of the four proposed methods have the potential to be implemented on a national scale, show promise in validating meal reimbursement claims, and can be implemented within the pilot test’s budget constraints. This section first summarizes our assessment of each of the four methods, including our recommendations for which should be tested in the pilot, and then offers additional recommendations for changes to specific data collection procedures.


Methods to Include in the Pilot. Although the pretest sample size was too small to draw any strong conclusions about the validity of the tested data collection methods in estimating the number of meals served, the pretest data collection experience and preliminary analysis have been informative regarding which methods show the most promise.


  • Method 1: The parent interview data has the advantage of providing information not only on children’s attendance but also on which meals they received each day. Parents interviewed for the pretest were able to report on their child’s daily attendance and receipt of each type of meal, although we were unable to measure the validity of the data because of insufficient sample sizes at each FDCH. This will be corrected in the pilot by taking random samples of approximately five parents per FDCH and weighting up to home-level estimates (a sketch of this computation appears below, after these assessments).

  • Method 2: The relatively high level of missing data in the SISO logs (entire logs missing in FDCHs that do not keep them, and individual items missing in those that do) makes Method 2 less useful for providing reliable estimates of the number of meals served. In addition, comparing estimated meal receipt from this method with the direct observations indicated that the validity of Method 2 is low.

  • Method 3: The data collection issues affecting Method 2 are also relevant to Method 3, because it combines Methods 1 and 2. Although we were unable to measure the validity of Method 3 in the pretest, the low validity of Method 2 suggests that combining it with Method 1 data would not be worth the additional effort required.

  • Method 4: The direct observation data collected for Method 4 is complete and considered the best assessment of the meals children actually received at FDCHs. Although we were not able to test the model development with the pretest data, the observation data collected for Method 4 is necessary to validate the other methods.

Based on these assessments, we recommend that Methods 1 and 4 be included in the pilot. Both are feasible to implement on a larger scale and can be piloted within the budget. Although the validity of these two methods could not be assessed in the pretest, each method produced fairly complete data whose validity could be tested in the pilot.
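
For reference, the following is a minimal sketch (in Python) of the home-level estimate described in footnote 2 for Method 1. The function name and the numbers in the example are illustrative only. The estimated weekly total of a meal type at an FDCH is the average weekly number of that meal reported per responding child, multiplied by the total number of children attending the home.

  def estimated_weekly_total(reported_meals_per_child, total_children_attending):
      # reported_meals_per_child: weekly counts of one meal type, one value per
      # responding parent at the FDCH.
      # total_children_attending: total number of children attending the FDCH.
      if not reported_meals_per_child:
          raise ValueError("no parent reports available for this FDCH")
      average_per_child = sum(reported_meals_per_child) / len(reported_meals_per_child)
      return average_per_child * total_children_attending

  # Example with made-up numbers: five sampled parents report 5, 5, 4, 5, and 3
  # lunches in the week, and 8 children attend the home: (22 / 5) x 8 = 35.2
  # estimated weekly lunches for that FDCH.
  print(estimated_weekly_total([5, 5, 4, 5, 3], 8))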


Data Collection Process Issues. Lessons learned from the pretest also will inform the specific data collection procedures to be used for each method included in the pilot.


  • Parent Interviews (Methods 1 and 3): Although the pretest sample size was too small to allow us to test the validity of parent recall of attendance against data on meals received, we did find enough difference between reported attendance and reported meal receipt to make it worthwhile to include questions on meal receipt in the pilot parent interview.

  • SISO Logs (Methods 2 and 3): If collection of the SISO logs is included in the pilot test, two changes would be necessary to enable more precise estimates of which meals each child received. First, to the extent that the logs include information on the specific hours each child attended, this detailed data would need to be entered into a data file so that we could determine which children were present during each meal time (a sketch of this matching follows this list).7 Second, data on which meals each FDCH typically serves, and at what times, would need to be systematically collected.

  • Observations (Method 4): The usefulness of the observation data would be improved if it were possible to match the specific children observed during the mealtime to information on those children in the parent interview and SISO log data. This would require collecting the name of each child at the observation, which might raise confidentiality concerns. Both first and last names would be required, which could be considered sensitive information and thus might not be practical to collect.

  • All Methods: Pretest experiences suggest that data collection may require more effort than originally projected. In addition to the time and effort of field staff, an unexpected amount of senior staff time was necessary to negotiate with each of the sponsors, including the assignment of a monitor to accompany field staff on observations. Scheduling of each type of data collection was also more complicated than anticipated. Because scheduling must remain flexible, requests for administrative claims data should be made after the other data has been collected, to ensure that the claims data covers the same months as the other data. However, enrollment forms must still be collected before any other data collection activities can begin.
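
For illustration, the following is a minimal sketch (in Python) of the SISO log change proposed above, assuming arrival and departure times have been keyed into a data file; the record layout, names, and times are illustrative. A child is counted as present for a meal if the scheduled meal time falls within the child’s recorded arrival and departure times.

  from datetime import datetime

  def to_time(hhmm):
      # Convert an "HH:MM" string to a time object for comparison.
      return datetime.strptime(hhmm, "%H:%M").time()

  def present_for_meal(arrival, departure, meal_time):
      # A child is counted as present if the meal time falls within the
      # recorded arrival-departure interval.
      return to_time(arrival) <= to_time(meal_time) <= to_time(departure)

  def children_present(log_entries, meal_time):
      # Count the children whose logged hours cover the scheduled meal time.
      return sum(1 for entry in log_entries
                 if present_for_meal(entry["arrival"], entry["departure"], meal_time))

  # Example with made-up entries: two of the three children were logged as
  # present at a 12:00 lunch.
  log = [{"child": "A", "arrival": "07:30", "departure": "17:00"},
         {"child": "B", "arrival": "12:30", "departure": "17:30"},
         {"child": "C", "arrival": "08:00", "departure": "13:00"}]
  print(children_present(log, "12:00"))   # prints 2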

Next Steps. After the project officer and other FNS staff review the recommendations presented in this memo, we will schedule a meeting between FNS and MPR staff to discuss which methods should be included in the pilot. These decisions will determine the approach to present in the OMB clearance package.





REFERENCES

Macaluso, Ted. “The CACFP and Program Assessment.” Presentation at National CACFP Association Conference, Orlando, FL, March 20, 2006. Available online at [www.cacfp.com/CONF06/Program%20Evaluation%20-%20OANE.ppt].

Smith, Barbara J. “Child Care Assessment Project (CCAP).” Presentation at National CACFP Association Conference, Orlando, FL, March 20, 2006. Available online at [www.cacfp.com/CONF06/Program%20Evaluation%20-%20CCAP%20-%20FNS.ppt].

U.S. Department of Agriculture, Office of Inspector General. “Food and Nutrition Service, Child and Adult Care Food Program National Report on Program Abuses.” Presidential Initiative: Operation Kiddie Care, Audit Report No. 27601-7-SF. August 23, 1999.


1 In many of these cases, the log indicated the time that the child arrived in the morning but did not indicate the time the child departed the FDCH. In some cases, neither the arrival nor departure time was recorded.

2 Each estimated weekly total will be computed by first calculating, for each FDCH, the average number of weekly meals of that type served per child, as reported by responding parents. We will then multiply each FDCH’s average by the total number of children attending that FDCH.

3 We could explore the possibility of collecting the names of students at observations in the pilot, but confidentiality concerns may prevent this type of data collection.

4 We had both types of data for a fifth home, but not for the same day.

5 Possible reasons for the unreliability of SISO logs are suggested in Section II, including inconsistencies in who recorded data in the logs, incompleteness, and illegibility.

6 If MPR staff must visit each home to collect the logs, however, that would increase costs.

7 In the pretest, data on the exact arrival and departure times from the logs were not entered into data files. Instead, we conducted most analyses relying on an assessment of whether each child was present for all or just part of the day. We looked at exact arrival and departure times only for the cases matched to the observation data.

