Response to OMB Comments Regarding the Falls Information Collection Request 3-17-15
OMB:
With regard to the Post Program Survey, a key question is how ACL intends to use the information collected and what it means to “assess outcomes.” Because the grant program is composed of various falls prevention programs, it would be difficult to establish causality between the grant program and improved outcomes, since more than one intervention is being introduced. Additionally, if this is treated as a true program evaluation, ACL would need to identify ways to measure fidelity and address potential threats to internal validity.
If ACL’s goal is to conduct a true program evaluation or to assess whether program participants report a general change in direction for key variables (fall rates, self-efficacy, etc.), more detail is needed to describe the statistical methods that will be used, including more justification for the use of a census design instead of a sample.
ACL Response:
ACL does not intend this information collection to serve as an outcome evaluation. Rather, the data will provide a snapshot of participants’ perceptions about program benefits at the last class. ACL’s intention is to use these data to develop a more complete and accurate understanding about the perceived value and short-term results of the ACL Evidence-Based Falls Prevention grants. ACL seeks this information from the participants themselves as part of ACL’s performance monitoring of the grantees and to help inform technical assistance provided through the ACL National Falls Prevention Resource Center.
OMB:
Please provide more information about how ACL intends to achieve an acceptable response rate on the post-survey form. Presumably, response rates could be expected to be fairly high since participants will complete the form at the time of the last session (assuming that participant drop-out rates are not an issue).
ACL Response: ACL intends to use in-person administration of the information collection because research has shown that in-person survey distribution has the highest average response rates when compared to mail, e-mail, telephone, or web-based data collection. In-person data collection has also been shown to result in more complete data.
In addition, research shows that high response rates are strongly influenced by the following factors that ACL has incorporated into our approach:
The salience of the topic: Respondents are being asked about a program in which they have just participated voluntarily to address an issue, falls, that is personally relevant to them.
Personalized request and communication: Respondents receive the information collection tool from someone they know, and they are presented with a specific request to respond based on their experience in the course they just completed.
The information collection tool is concise and easy to complete: Based on pretesting, the post-program survey form, which is two pages long and consists of 8 questions, requires only about 6 minutes to complete.
The information collection tool is easy to return: Because the information collection is administered in person, respondents can return their forms immediately. They will not have to keep track of the form or remember to send it in later.
Showing positive regard: Group leaders who collect the information will thank respondents for their efforts. The group leaders’ script also explicitly describes the value of the data to ACL for making future program improvements.
Reducing non-receipt of the information collection: ACL’s in-person approach ensures that respondents receive the information collection form and thus reduces non-response due to non-receipt.
OMB: Based on our interpretation of the materials presented in the supporting statements, we understand that ACL is not currently planning to use sampling, having designed this survey for the universe of program participants. We are concerned that this is an overly burdensome approach for the respondents. We’d like ACL to explore developing a sampling approach for the Post Program Survey. Could ACL please prepare a sampling plan that describes the sampling selection process, including a justification of the sample size and description of the means used to select the sample?
ACL Response:
While ACL did not originally propose a sampling design, ACL does see the value in reducing data collection burden for respondents and has explored the feasibility of several designs.
Based on a review of the grant program design, ACL does not propose to conduct sampling at the individual level because it would place a high resource burden on both ACL and the local implementation sites.
Cost to ACL: There is a significant cost in terms of time and money for ACL to implement a sampling approach similar to those that it has used with recent information collections.
ACL has OMB clearance to sample respondents for both the National Survey of Older Americans Act Participants and the outcome evaluation of the Title III-C Elderly Nutrition Services program. ACL has also contracted with Westat for an evaluation of the Title III-E National Family Caregiver Support Program, which will involve a sampling approach similar to that used with the National Survey. These studies employ a two-stage sampling process that first samples Area Agencies on Aging or local service providers and then, from the selected organizations, draws a sample of individuals. These sampling designs are created and implemented by contractors (Westat for the National Survey and the Caregiver evaluation, and Mathematica for the Nutrition Evaluation). The costs are shown below:
Study | Contract cost | Labor hours
National Survey* | $147,682 | 1,304
Nutrition Evaluation | $393,697 | 1,343
Caregiver Survey | $431,389 | 3,308
* The cost for the National Survey sampling is low compared to the other projects because this contractor has conducted the work for the past 9 rounds of the survey. As a result, much of the developmental work has already been completed.
At this time, ACL does not have a contractor in place to conduct this work and also does not have sufficient human resources to design and implement an individual level sampling plan without a contractor. ACL does not have funds to award a contract in its FY 2015 budget nor does ACL anticipate such funds being available in the FY 2016 budget.
Burden for grantees and subgrantees: The individual-level sampling plans used for the National Survey and the Nutrition Evaluation are implemented by contractors. They require only that providers supply lists of program participants from which the contractors select the samples. The contractors cannot do this in real time, so they must then contact the selected respondents to conduct telephone or in-person interviews at a later date. Since there is no contract in place for sampling or data collection with the Evidence-Based Falls Prevention grantees, the providers themselves would have to do this work, and those sites have neither the training nor the capacity to do so. Another burden for the sites would be the need to collect contact information for program participants, data that are not included in the previously OMB-approved pre-test survey. Collecting this contact information is not something the sites currently do, nor do the grantees have the data security measures in place to protect such sensitive data.
It would be possible to construct a probability sample at the class level. ACL estimates that approximately 1,150 classes will be held under this grant program. Because the purpose of any sampling approach is to use a small number of objects, classes in this case, to represent the larger group from which they are drawn, ACL could construct a stratified random sample. ACL would stratify the sample by grantee type because previous ACL research shows that Aging grantees (State Units on Aging and Area Agencies on Aging) had higher completion rates than Public Health grantees, and that classes specifically targeted to a particular racial/ethnic group had higher completion rates than other classes. Completion rates are of particular importance to ACL because previous evaluations of these programs show that people who complete the programs have better outcomes than non-completers. There are currently four grantee types:
Tribes (4 grantees)
State Units on Aging (4 grantees) / Area Agency on Aging (1 grantee)
State Health Departments (3 grantees)
Foundations (2 grantees)
ACL estimates that we would need a sample of 725 classes (63%) for a 95% confidence level and a confidence interval of ±2.21, based on the following assumptions (a worked calculation follows the list):
There is no systematic difference across classes in terms of gender or age.
Approximately 10% of proposed classes will be cancelled, requiring 10% oversampling.
Classes will include an average of 12 participants, with little variation between classes.
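For transparency, the 725-class figure can be checked against the standard sample size formula with a finite population correction, the same calculation performed by the calculator cited in the sources. The sketch below (in Python) is illustrative only, not ACL's official tool; it assumes the most conservative response distribution (p = 0.5) and reproduces the roughly 63% sampling fraction of the 1,150 classes (rounding conventions may differ from the online calculator by a class or so).

    # Illustrative check of the class-level sample size (not ACL's official tool).
    # Assumptions: 95% confidence (z = 1.96), margin of error of +/- 2.21
    # percentage points, a universe of 1,150 classes, and p = 0.5.
    import math

    def sample_size(population, z=1.96, margin=0.0221, p=0.5):
        """Sample size with finite population correction."""
        n0 = (z ** 2) * p * (1 - p) / margin ** 2  # infinite-population estimate
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    n = sample_size(1150)
    print(n, f"({n / 1150:.0%} of 1,150 classes)")  # prints: 726 (63% of 1,150 classes)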
The primary disadvantage to sampling rather than surveying all program participants is that we may miss important subgroups of participants and site types. The process evaluation report for the Chronic Disease Self-Management Education program [1] shows that there are significant differences in program completion across the following categories:
Experienced leaders had higher completion rates than new leaders
Rural sites had higher completion rates than urban sites
Smaller classes had higher completion rates than larger classes
Classes held at faith-based organizations had the highest completion rates while classes held at residential facilities had the lowest
Stratification to account for all of these features would be overly complex and would likely result in cell sizes too small to analyze. As a result, ACL recognizes that the final sample may not be large enough to allow analysis by these sub-groups.
Unfortunately, ACL does not currently have sufficient information about the universe of classes to construct a workable sampling plan at the class level. The Falls grant program was designed following the approach used successfully for the “Empowering Older Adults and Adults with Disabilities through Chronic Disease Self-Management Education Programs” grants. Under both grant programs, grantees receive funds that they distribute to local providers, who deliver the actual classes to individuals. Providers are not currently required to report class details before holding the classes. In some cases, providers do not report any class details, including how many classes they will hold, until they submit invoices to the ACL grantees. As currently structured, ACL would have to rely on providers to sample from their own classes with little to no oversight, which introduces an enormous potential for bias.
As a result, ACL proposes a two-step process for implementing a sampling plan (a sketch of the selection rule follows the list):
ACL will implement a limited convenience sampling procedure immediately. Specifically, ACL will:
work with the grantees to confirm which sites are able to submit class lists prior to holding classes;
select 50% of the classes to participate in information collection using the ‘Post Survey.’ Because of the small number of participants served through the classes held by Tribal grantees, and the significance of this population to the “Empowering Older Adults and Adults with Disabilities through Chronic Disease Self-Management Education Programs” grants, the Title VI Tribal Grants Program, and ACL as a whole, classes offered through the Tribal grantees will be selected with certainty.
review the process and make revisions to ensure that this is a sound approach before it is rolled out with all sites.
ACL will incorporate language in the Year 2 continuation Notice of Awards and future Falls grantee award notices requiring sites to submit class lists to ACL prior to holding classes so that ACL can select a random sample of 63% of the classes that will be asked to collect participant data using the ‘Post Survey.’
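To make the step-1 selection rule concrete, a minimal sketch follows. It assumes a hypothetical class list in which each class record carries a grantee_type field; the data shape, field names, and fixed seed are illustrative assumptions, not a specification of how grantees actually report classes. Tribal classes are taken with certainty, and 50% of the remaining classes are drawn at random.

    # Minimal sketch of the step-1 selection rule (hypothetical data shape).
    # Tribal classes are selected with certainty; 50% of the remaining
    # classes are drawn at random.
    import random

    def select_classes(classes, fraction=0.5, seed=2015):
        """classes: list of dicts with 'class_id' and 'grantee_type' keys."""
        tribal = [c for c in classes if c["grantee_type"] == "Tribe"]
        others = [c for c in classes if c["grantee_type"] != "Tribe"]
        rng = random.Random(seed)  # fixed seed so the draw can be documented
        return tribal + rng.sample(others, round(len(others) * fraction))

    # Hypothetical class list for illustration only.
    classes = [
        {"class_id": 1, "grantee_type": "Tribe"},
        {"class_id": 2, "grantee_type": "State Unit on Aging"},
        {"class_id": 3, "grantee_type": "State Health Department"},
        {"class_id": 4, "grantee_type": "Foundation"},
    ]
    print([c["class_id"] for c in select_classes(classes)])

The same function with fraction=0.63 would mirror the 63% random sample described in the Year 2 step above.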
Sources used:
University of Wisconsin Extension Program. (2009). Response rate in surveys. Downloaded 2/26/2015 from http://www.uwex.edu/ces/4h/evaluation/documents/ResponseRateInSurveys.ppt
Baruch, Y., & Holtom, B. C. (2008). Survey response rate levels and trends in organizational research. Human Relations, 61(8), 1139-1160. Downloaded 2/26/2015 from http://hum.sagepub.com/cgi/content/abstract/61/8/1139
Rolnick, S. J., Gross, C. R., Garrard, J., & Gibson, R. W. (1989). A comparison of response rate, data quality, and cost in the collection of data on sexual history and personal behaviors: Mail survey approaches and in-person interview. American Journal of Epidemiology, 129(5), 1052-1061.
Sample size calculator: http://www.surveysystem.com/sscalc.htm
[1] Report available at http://www.aoa.acl.gov/Program_Results/docs/CDSMPProcessEvaluationReportFINAL062713.pdf