
SUPPORTING STATEMENT - PART B for

OMB Control Number 0584-NEW

Child and Adult Care Food Program (CACFP) Family Day Care Homes Meal Claims Feasibility Study



Chanchalat Chanhatasilpa, Ph.D.

Social Science Research Analyst

Special Nutrition Evaluation Branch (SNEB)

Office of Policy Support (OPS)

USDA, Food and Nutrition Service

3101 Park Center Drive

Alexandria, Virginia 22302

Email: [email protected]

Phone: (703)305-2115




Contents


Section B

B1. Respondent Universe and Sampling Methods

B2. Procedures for the Collection of Information

B3. Methods to Maximize the Response Rates and to Deal with Nonresponse

B4. Test of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects & Individuals Collecting and/or Analyzing Data



Exhibits


Exhibit b-1. Power as a Function of Number of Providers

Exhibit b-2. Strategies to Cope with Possible Missing Data


Appendices


B-1. Recruitment Letters

B-1-1. Recruitment Letter to Regional Office

B-1-2. Recruitment Letter to State Agency

B-1-3. Recruitment Letter to CACFP Sponsor

B-1-4. Recruitment Letter to FDCH Provider

B-1-5. Recruitment Letter to Parent

B-1-6. Recruitment Letter to FDCH Provider in Spanish

B-1-7. Recruitment Letter to Parent in Spanish

B-2. Telephone Script for Recruitment and Data Request

B-2-1. Telephone Follow-up Script to Recruit State Agency

B-2-2. Telephone Script to Request Provider Records from the Sponsor

B-2-3. Telephone Script to Request Administrative Data from Sponsor

B-2-4. Telephone Script to Request Meal Claim Data from Sponsor

B-2-5. Telephone Script to Recruit FDCH Provider

B-2-6. Telephone Script to Recruit FDCH Provider in Spanish

B-2-7. Telephone Script to Recruit Parent

B-2-8. Telephone Script to Recruit Parent in Spanish

B-3. Glossy Fact Sheets

B-3-1. Glossy Fact Sheet for CACFP Sponsor and FDCH Provider

B-3-2. Glossy Fact Sheet for Parent

B-3-3. Glossy Fact Sheet for CACFP Sponsor and FDCH Providers in Spanish

B-3-4. Glossy Fact Sheet for Parent in Spanish

B-4. Consent Letters

B-4-1. Consent Letter for Provider

B-4-2. Consent Letter for Parent

B-4-3. Consent Letter for Provider in Spanish

B-4-4. Consent Letter for Parent in Spanish  

B-5. NASS Review Comments



Section A

A-1. Improper Payments Elimination and Recovery Act of 2010 (IPERA)

A-2. Improper Payments Elimination and Recovery Improvement Act of 2012 (IPERIA)

A-3. Study Objectives and Research Questions

A-4. Summary Report of a Pilot Study

A-5. Public Comment

A-5a. Public Comment #1

A-5b. Public Comment #2

A-5c. Public Comment #3

A-6. Response to Public Comment

A-6a. Response to Comment #1

A-6b. Response to Comment #2

A-6c. Response to Comment #3

A-7. Provider User Guide for Meal Service Reporting System (MSRS)

A-8. Parent User Guide for Child Attendance Reporting System (CARS)

A-9. Provider User Guide for Meal Service Reporting System (MSRS) in Spanish

A-10. Parent User Guide for Child Attendance Reporting System (CARS) in Spanish

A-11. Estimated Burden of the CACFP Meal Claim Feasibility Study (Excel)




B1. Respondent Universe and Sampling Methods


Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The respondent universe includes State agencies, family day care home (FDCH) providers participating in CACFP, and parents whose children are enrolled in these FDCHs. The study sample is designed to test the reliability and feasibility of data collected using two customized systems, the Meal Service Reporting System (MSRS) and the Child Attendance Reporting System (CARS), to estimate the scope of improper payments for CACFP meal claims. As a feasibility study, the study sample is not intended to represent CACFP participants at the national level. Rather, it is intended to assess the ability of providers and parents to use the two systems regularly and consistently, and determine whether the CARS and MSRS-derived data can dependably and precisely estimate improper payments. Specifically, the study will include 150 FDCH providers and 600 parents whose children are enrolled in those FDCHs. These FDCHs are supervised by 15 sponsoring organizations in two States.


Participation in the study is voluntary; however, sponsors and FDCH providers are strongly encouraged to participate per Section 305 of the Healthy, Hunger-Free Kids Act of 2010. We expect a 75 percent response rate1 from the State agencies, an 80 percent response rate from sponsoring organizations, an 86 percent response rate from providers, and an 80 percent response rate from parents, for an overall response rate of 80 percent for the entire collection.


Within each sponsoring organization, we will sample 20 providers, assigning 10 to the study group and 10 to the control group. We will include all children attending the selected FDCHs, except the providers’ own children. Assuming an average of five families per provider (according to recent administrative data), we will contact an initial sample of 750 parents for the study group (5 parents x 10 providers in the study group x 15 sponsors) and expect 600 of them (80 percent) to agree to participate and provide responses during the study month. The control group, which represents the business-as-usual condition, will consist of a similar number of providers and families; these providers and families will not be aware of the study and will not actively participate in it. The researchers will use extant administrative records, including child enrollment and meal claim records, for the control group in the two pre-study months and the one study month, as well as similar administrative data for the study group. These administrative data will show how meal claims in the study group change before and after the introduction of MSRS and CARS. By comparing the changes in the study group with the changes in the control group across the pre-study and study months, we can also account for the influence of external factors, that is, estimate what would have happened without MSRS and CARS.


During a pilot test in September 2015 (see section B.4 for more detail), providers using MSRS had an 85 percent response rate and parents using CARS had a 93 percent response rate. For the feasibility study, the contractor, Manhattan Strategy Group (MSG), expects providers to have similar response rates because their participation is required, but expects a reduction in the parents’ response rate to about 80 percent. The contractor will send recruitment materials to explain the importance of provider participation in the study, conduct follow-up calls with them, and set up a study hotline to answer their questions. We will also work with CACFP-sponsoring organizations to convey the importance of this study for CACFP to FDCH providers. We will engage providers to request their help in recruiting parents, asking them to remind parents of the importance of their participation during the study month. Providers and parents will receive incentives to cover incurred costs associated with the study, as explained in section A.9.
We describe strategies to maximize response rates in section B.3.


B2. Procedures for the Collection of Information


Describe the procedures for the collection of information including:



  • Statistical methodology for stratification and sample selection


MSG will recruit a sample of providers and parents to test the feasibility of using MSRS and CARS, and the resulting ability to use the data for assessing improper payments. However, as a feasibility study, the sample need not be representative of the population of FDCH providers in CACFP or parents whose children are receiving CACFP benefits. Providers and parents will participate in the study voluntarily.


Providers and parents will be clustered by State and sponsoring organizations that supervise the FDCHs of participating providers. The recruitment steps follow.


  1. Select States: The contractor will recommend two States to FNS to participate in the feasibility study. One of the States should be a large State as measured by CACFP program participation of FDCHs. FNS will then provide insights into conditions that might warrant changes, such as unique issues with these States that might affect their participation. Upon FNS’s approval of the proposed States, the contractor will contact the FNS Regional Offices (FNSROs) to facilitate contact with the two selected States.

  2. Select Sponsors: We will then purposively select 15 sponsors from the participating States to create a sample that includes sponsors of varying sizes and geographic locations within each State. We will use administrative records from the participating States to categorize sponsors by size and geographic location, and randomly select a proportional number of sponsors in each category (a brief selection sketch follows this list).

  3. Select Providers: Within each selected sponsor, the contractor will randomly select 10 providers and assign them to the study group, yielding a total of 150 providers as indicated in the burden table (Exhibit a-1 in Part A). We will also randomly select the same number of providers for the control group. We will send the consent form in the study package to the providers in the study group (Appendix B-4-1 and B-4-3), along with a stamped envelope for them to return the forms. There is no study burden on the providers in the control group, and they will not be informed about this study.

  4. Identify Parents: Parents whose child(ren) receive care from the participating providers in the study group will be automatically eligible to participate in the study, although they will also be given the opportunity to opt out. Providers will be asked to encourage parents to participate. Parents may provide oral consent in the follow-up calls or express their desire not to participate. We will send the consent form in the study package (Appendix B-4-2 and B-4-4), along with a stamped envelope for parents to return their consent forms.
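
To make step 2 concrete, a minimal sketch of the proportional, category-based sponsor selection follows. The field names ("size", "region"), the grouping keys, and the seed are illustrative assumptions; the actual selection will use whatever categories are available in the State administrative records.

```python
# A minimal sketch of step 2 (proportional sponsor selection), assuming each
# sponsor record carries illustrative "size" and "region" category fields.
import random
from collections import defaultdict

def select_sponsors(sponsors, n_total=15, seed=1):
    """sponsors: list of dicts such as {"id": "S01", "size": "large", "region": "north"}."""
    random.seed(seed)
    strata = defaultdict(list)
    for sponsor in sponsors:
        strata[(sponsor["size"], sponsor["region"])].append(sponsor)

    selected = []
    for members in strata.values():
        # Allocate slots in proportion to each category's share of all sponsors.
        n_slots = round(n_total * len(members) / len(sponsors))
        selected.extend(random.sample(members, min(n_slots, len(members))))
    # Proportional rounding can leave the total slightly off target; trim (or top up) to n_total.
    return selected[:n_total]
```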





  • Estimation procedure


The erroneous payment dollar amount will be calculated from descriptive statistics for the improper claims that providers in the study group make during the two pre-study months. The purpose of these estimates is to determine the feasibility of deriving such estimates from the data collected in this study. The calculation proceeds in five steps; a computational sketch follows the list.

  1. For each group (study or control group), sum meal claim numbers for the two pre-study months and the one study month to produce three monthly total numbers of claims.

  2. For each group, subtract the study month total claim numbers from each of the two pre-study months total claim numbers to obtain two sets of between-month differences (study month vs. pre-study month 1 and study month vs. pre-study month 2).

  3. Subtract each set of the control group’s between-month differences from the corresponding set of the study group’s between-month differences, to obtain two sets of difference-in-differences estimates (group difference in differences of study month vs. pre-study month 1 and of study month vs. pre-study month 2).

  4. Average the two sets of the difference-in-differences estimates to obtain the mean monthly over-claim numbers.

  5. Multiply the appropriate per-meal rate by the over-claim number in each of the three meal categories to obtain the monthly erroneous payment dollar amount in each category; and sum the results to obtain the monthly erroneous payment dollar amount for the sample.
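
The sketch below walks through steps 1 through 5 for a single month of claims. The dictionary layout, the three category names, and the per-meal reimbursement rates are illustrative placeholders, not actual program values.

```python
# A minimal sketch of steps 1-5; the rates and category names below are
# hypothetical placeholders, not official CACFP reimbursement rates.
PER_MEAL_RATE = {"breakfast": 1.30, "lunch_supper": 2.45, "snack": 0.75}

def monthly_erroneous_payment(study, control, rates=PER_MEAL_RATE):
    """study/control: {category: {"pre1": n, "pre2": n, "study": n}} monthly claim totals (step 1)."""
    total_dollars = 0.0
    for category, rate in rates.items():
        s, c = study[category], control[category]
        # Step 2: between-month differences within each group.
        # Step 3: subtract the control group's differences from the study group's.
        did_pre1 = (s["pre1"] - s["study"]) - (c["pre1"] - c["study"])
        did_pre2 = (s["pre2"] - s["study"]) - (c["pre2"] - c["study"])
        # Step 4: average the two difference-in-differences estimates.
        mean_over_claims = (did_pre1 + did_pre2) / 2
        # Step 5: convert over-claimed meals to dollars and sum across categories.
        total_dollars += rate * mean_over_claims
    return total_dollars
```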

The purpose of this study is to design and test a method to accurately estimate the rate of erroneous payments and to provide a feasibility analysis based on the test results. The study is not designed to generate a national estimate of erroneous payments; therefore, the States selected for this study are not necessarily representative of all States. Similarly, the estimate of erroneous payments in the two participating States will not represent that of all States, because payment errors may vary among States.


  • Degree of accuracy needed for the purpose described in the justification,


We conducted a power analysis to determine the sample size for testing the effectiveness of MSRS and CARS. The power analysis revealed the minimum number of participants the study would require for an 80 percent chance of correctly rejecting the null hypothesis, that is, that the data collection methods/tools (MSRS and CARS) are not effective in validating meal claims because the group differences in meal claims do not demonstrate a change of at least 0.35 standard deviation between the pre-study and study months. We specify this moderate effect size based on an accepted social science research convention.2 Assuming providers in the study group and the control group have similar meal claim patterns prior to the study months and that such patterns are stable over time, if the meal claims submitted by providers in the study group during the study month yield a smaller amount of reimbursement (i.e., at least 0.35 standard deviation less than the control group), we would conclude that data collected from MSRS and CARS can be used to generate proxy measures of meals actually served (i.e., the “gold standard”). We will then use the proxy measures to calculate the improper claims submitted in the previous months by providers in the study group. Detecting meal claim differences between the pre-study and study months smaller than 0.35 standard deviation, if such differences exist, would require a larger sample.


Exhibit b-1 shows the relation between the number of providers and the level of statistical power we can achieve. We assume a conventional type I error, α=.05; a minimum detectable effect size, δ=.35; and a proportion of variance explained by the model, R squared=.10. To achieve power at or above the conventional .80 level, the study needs at least 230 providers across the study and control groups combined; that is, the horizontal line for .80 power (y-axis) intersects the vertical lines for approximately 230 or more providers (x-axis). Our sample includes 300 providers, 150 in the study group and 150 in the control group, which provides a sufficient sample to detect the group difference if one exists.


Exhibit b-1. Power as a Function of Number of Providers
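
In place of the figure, the sketch below reproduces the shape of the power curve with a standard normal-approximation formula for a covariate-adjusted two-group comparison. The function and the approximation are our illustration, not necessarily the exact software used to produce Exhibit b-1.

```python
# A minimal sketch of the power curve in Exhibit b-1, using a normal
# approximation for a two-group comparison with covariates explaining R^2 = .10.
from scipy.stats import norm

ALPHA = 0.05      # two-sided type I error
DELTA = 0.35      # minimum detectable effect size, in standard deviations
R_SQUARED = 0.10  # variance explained by the model

def power_for_total_providers(n_total):
    """Approximate power for a balanced split of n_total providers into two groups."""
    n_per_group = n_total / 2
    # Covariate adjustment shrinks the residual variance by (1 - R^2).
    se = ((1 - R_SQUARED) * 2 / n_per_group) ** 0.5
    return 1 - norm.cdf(norm.ppf(1 - ALPHA / 2) - DELTA / se)

for n in (150, 200, 230, 300):
    print(n, round(power_for_total_providers(n), 2))
# Under this approximation, power reaches about .80 at roughly 230 providers
# and about .89 at the planned 300 providers.
```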





  • Unusual problems requiring specialized sampling procedures


There are no unusual problems that require specialized sampling procedures.



  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden


The study group will use MSRS and CARS daily (on business days) throughout the study month.


B3. Methods to Maximize the Response Rates and to Deal with Nonresponse


Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


MSG will maximize the response rates through a combination of thorough project staff training, sound recruitment materials, comprehensive data collection monitoring, and ongoing communication with participants during the study month.


A senior project staff member will train the project team to provide technical assistance to parents and providers during the fieldwork. The data collection support team will endeavor to answer any questions participants may have about how to use CARS and MSRS. The team will be familiar with typical challenges that participants may encounter in using the systems (e.g., troubleshooting passwords, correctly submitting data, or changing accidentally entered data).


An online dashboard has been developed to monitor data collection during the study month. Staff members will be trained to use the dashboard and to employ the follow-up telephone scripts to contact providers and parents who have not completed their data submissions in a timely manner. The data collection support team will complete a one-day internal training on the systems and procedures, including sessions focused on non-response follow-up and refusal conversion. PowerPoint presentations for the training will cover broad concepts and the sequence of data collection operations. Role-playing will be used to train staff on refusal conversion techniques and to work through challenging non-response follow-up scenarios for CARS and MSRS.


The contractor will train project staff to support field queries from parents who may contact the study’s hotline. Project staff will receive training in respondent retention techniques to address respondent concerns, convert refusals, and help participants complete the data collection. The study team’s provision of immediate and directed assistance when requested will likely minimize any frustration with the data collection operations.


Although we will work closely with participants in the study to minimize non-response, there are a number of conceivable scenarios in which non-response or missing data might occur. We describe each scenario, along with the coping strategy with which we would respond to it, in Exhibit b-2.

Exhibit b-2. Strategies to Cope with Possible Missing Data



Data source | Cause of missing data | Time when missing data occurs | Coping strategy

Study group

Meal claims | Sponsor fails to provide claims in full | Pre-study months | Random replacement of the provider
Meal claims (MSRS) | Incomplete claims (provider stops claiming or stops operation) | Early study month | Remove provider; providers have been oversampled
Meal claims (MSRS) | Incomplete claims (provider stops claiming or stops operation) | Late study month | Impute based on earlier data of the given provider
Meal claims (MSRS) | Child left the selected home care | Early study month | Exclude child from study
Meal claims (MSRS) | Child left the selected home care | Late study month | Impute based on earlier data of the given child or the average of other children in the same home care
Parent report (CARS) | Parent fails to report consistently | Study month | Impute with available data for the same weekday or with attendance information from the administrative record
Parent report (CARS) | Parent drops out of study | Early study month | Exclude parent/child from study
Parent report (CARS) | Parent drops out of study | Late study month | Impute with available data for the same weekday or with attendance information from the administrative record
Administrative data | Provider record incomplete | Study month | Directly contact sponsors to complete the record
Administrative data | Child record incomplete or missing | Study month | Exclude child, who is likely to be a newcomer with parents not participating in the study

Control group

Meal claims | Sponsor fails to provide claims in full | Pre-study months | Random replacement of the sponsor
Meal claims | Incomplete claims (provider stops claiming or stops operation) | Pre-study months | Remove provider; providers have been oversampled
Meal claims | Incomplete claims (provider stops claiming or stops operation) | Study month | Remove provider; providers have been oversampled
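
To illustrate the same-weekday imputation listed for CARS data in Exhibit b-2, a minimal sketch follows. The data layout, function name, and fallback argument are illustrative assumptions rather than the study's actual implementation.

```python
# A minimal sketch of same-weekday imputation for a child's missing CARS
# attendance report, with the administrative record as the fallback source.
from datetime import date
from statistics import mean

def impute_attendance(daily_attendance, missing_day, admin_record_flag=None):
    """daily_attendance: {date: 1 or 0}; missing_day: a date with no CARS report."""
    same_weekday = [v for d, v in daily_attendance.items()
                    if d.weekday() == missing_day.weekday()]
    if same_weekday:
        # Use the child's own reports for the same weekday earlier in the study month.
        return round(mean(same_weekday))
    # Otherwise fall back to attendance information from the administrative record.
    return admin_record_flag

# Example: a report missing for Tuesday, September 22, imputed from prior Tuesdays.
reports = {date(2015, 9, 1): 1, date(2015, 9, 8): 1, date(2015, 9, 15): 0}
print(impute_attendance(reports, date(2015, 9, 22)))  # -> 1
```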



B4. Test of Procedures or Methods to be Undertaken


Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Appendix A-4 provides a summary of the 2015 pilot test of MSRS and CARS, which included cognitive interviews with two FDCH providers and nine parents in Maryland, a month-long pilot test with four FDCH providers and nine parents in Texas, and exit interviews with the providers and parents who participated in the Texas pilot test. The cognitive and exit interviews used separate protocols with distinct questions for providers (who used MSRS) and parents (who used CARS). During the pilot test, providers used only MSRS and parents used only CARS for data collection.


Recruitment

Recruitment for the pilot study began with obtaining permission to conduct the pilot and requesting State CACFP agency contact information from the FNS Regional Offices (FNSROs) for Maryland and Texas, the two States participating in the pilot study. The MSG team developed a request letter, which FNS reviewed and approved. The request letter provided a detailed overview of the study and the technology platforms to be piloted, and included a request for support for the study. FNS sent the request letter to the FNSROs on behalf of the study. Within two weeks, the study team received formal permission to proceed, as well as the State CACFP agency contact information.


With permissions from FNSROs, the study team prepared and sent a study invitation letter to the respective CACFP State agencies in Texas and Maryland. In the letter, we provided an overview of the study and the technology platforms to be tested. We also requested the State agency’s assistance in identifying up to three sponsoring organizations for participation in the cognitive testing or the pilot test in September. We made this request because there is no national database of CACFP participants that we can use to identify potential sponsors for the feasibility study. We asked State agencies to provide the sponsor name and contact information and the number of currently active FDCHs associated with each of these sponsors, by tiering status, to help us select diverse sponsoring organizations for the pilot study.


Both Maryland and Texas provided contact information for three sponsoring organizations for the pilot study. We developed a set of study invitation materials that gave sponsoring organizations a more detailed description of the rationale for the pilot study, the technology platforms, and the pilot test activities. The invitation letter also requested information on 10 FDCHs that were currently active, had claimed reimbursement in May 2015 (or the most recent month available), and had an enrollment of at least five children. Specifically, we asked for these FDCHs’ contact information, tiering status, and enrollment information. We requested that sponsoring organizations provide the necessary information electronically or in paper format within a three-week period.


The study team conducted follow-up calls with sponsors in Maryland and Texas to confirm their receipt of the letter, review the data request, answer questions about the study, and plan for receipt of the requested information. We sent letters to all six sponsoring organizations via email, inviting the sponsors to participate in the pilot study on a voluntary basis, and followed up with the sponsors by phone. One sponsor in Maryland required approval from its Institutional Review Board (IRB); MSG, in consultation with FNS, determined that the pilot study timeline would not permit a full human subjects review and decided not to pursue that sponsoring organization further. MSG is currently seeking IRB approval for the full study. Two sponsors in Texas declined the invitation, citing concerns over potential burden. In the end, two sponsors in Maryland and one in Texas agreed to participate and provided the requested information for FDCH providers.


We requested and received a list of 20 total FDCHs from the two Maryland sponsoring organizations. After reviewing the FDCHs, the study team prioritized those FDCHs with more than seven children enrolled,3 and with diverse tiering status, length of program participation, and methods providers used for claiming meals. We took these steps to ensure that those selected for the cognitive testing would be a diverse set of providers with robust enrollments to support the selection of parents for the cognitive testing. From among this pool of FDCHs, we reached out to 10 providers and successfully recruited two to participate in the cognitive testing.


Parents whose children were enrolled in the participating FDCHs were invited to take part in the cognitive testing. Study materials, including an invitation letter describing the study and the cognitive interviewing as well as a list of FAQs, were sent to parents for review. The study team followed up with parents to confirm their receipt of the letter and to answer their questions about the interviews. Twenty-nine parent invitations were sent, and nine parents were successfully recruited for the cognitive testing, although these efforts took longer than initially planned. Cognitive testing with providers was conducted during the first week of August 2015; cognitive testing with parents took place between August 11 and August 17, 2015.


The recruitment of providers and parents for the September pilot test in Texas began at the end of July. After the sponsor in Texas made contact with providers for the study, the study team sent the study materials, including an invitation letter and a list of FAQs to help providers understand the pilot test, to four providers. All four providers agreed to participate in the pilot test.


Study invitations were then sent to parents for the September pilot test. The parent recruitment for the pilot test took place between August 7 and August 24, 2015. A total of nine parents eventually agreed to participate.


Cognitive Test

MSG conducted a cognitive test to determine how readily users could understand the functions and design of the data collection instruments and to collect feedback from users to further improve them. During the cognitive test, we conducted phone interviews with nine parents and two providers to collect feedback on the data collection instruments, which included invitation letters for the feasibility study, user guides, and mock-ups of the CARS and MSRS interfaces. We asked participants how comprehensible the language was, how intuitive the features and functions of CARS and MSRS were, and whether they had any suggestions for changes.


Each interview lasted 30–60 minutes. During the interviews, we used a semi-structured interview protocol and took notes on participants’ responses. After the interviews, we synthesized feedback from interviewees and conducted a thematic analysis of the responses to identify areas for improvement or clarification.


Pilot Test

The study team conducted a pilot test to assess and test the functionality of MSRS and CARS prior to the feasibility study. During the pilot test, four providers and nine parents in Texas used MSRS and CARS, respectively, to report meals served and child attendance in September 2015.


Exit Interview

The contractor conducted exit interviews with the participants during the week following the pilot test to collect feedback on their experience using MSRS and CARS. We used a semi-structured interview protocol to gather feedback from providers and parents on challenges and barriers to their daily use of CARS and MSRS. We conducted the exit interviews with providers and parents by telephone; each lasted 30–60 minutes. Results of the cognitive interviews, pilot test, and exit interviews informed the final development of CARS and MSRS to ensure that the instruments are readily understandable and easily accessible. Providers and parents found MSRS and CARS easy to learn and intuitive to use. They were able to quickly integrate these technologies into their daily routines and to report meal serving times and child attendance each day. Parents and providers were able to access and use the technology platforms through different mobile phone systems and web browsers. No users experienced technical issues due to system failures during the pilot test month.


The outcomes of the pilot study provide strong evidence that MSRS and CARS are effective tools for capturing meal serving time and child attendance. Such information is critical for the feasibility study’s estimation of meal claim errors. The pilot test also validated our assumption that it is reasonable for providers and parents to implement and use MSRS and CARS. The pilot confirmed that providers and parents generally have easy access to mobile phones or the Internet and were willing to use them for the purposes of this study. A majority of providers used smartphones exclusively to access MSRS. The pilot test also confirmed the importance of having a web version of MSRS; indeed, one provider exclusively used the reporting website via a laptop computer. Almost all parents preferred to use CARS via mobile phone, primarily because of the easy access it affords to text messaging. This suggests that the primary purpose of a web application for parents would be as a backup in the unanticipated event of a missing or nonfunctional phone.


B5. Individuals Consulted on Statistical Aspects & Individuals Collecting and/or Analyzing Data


Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


FNS has contracted with the Manhattan Strategy Group (MSG) to conduct the feasibility study. MSG will be responsible for all data collection and analysis.

Daniel Geller, Ph.D., Director of Evaluation Services, Manhattan Strategy Group

Email: [email protected]

Office: 301-828-1348


Ying Zhang, Ph.D., Project Manager, Manhattan Strategy Group

Email: [email protected]

Office: 301-828-1346


Jared Pratt, Mathematical Statistician

Summary, Estimation, and Disclosure Methodology Branch

National Agricultural Statistics Service

United States Department of Agriculture

Telephone: 202-720-2839

1 Since the study requires reaching out to only four State agencies, it is not possible to attain the desired 80 percent response rate and a 100 percent response rate is not realistic.

2 The effect size in this analysis may be measured with the β coefficient (standardized b coefficient) associated with the study group status, D, in the regression analysis. It indicates the magnitude of the point-biserial correlation between the dichotomous study group status and the continuous measure of meal claims.

3 We prioritized our sampling list to first contact providers with more than seven children enrolled to ensure a robust parent sample for the next phase of recruitment for the cognitive testing. We believed the higher enrollment would offset cases in which a provider’s enrollment included multiple children of the same parents, which reduces the number of distinct parents in the home. This approach was also intended to minimize the burden on the FDCHs, as our recruitment strategy was designed to minimize the chance that we would need to go to additional homes for the cognitive testing because of a limited number of parents.
