
Evaluating Coverage to Care


Supporting Statement Part B for OMB Approval


(CMS-10632: 0938-New)



B. Collection of Information Employing Statistical Methods


B.1. Respondent Universe and Sampling Methods


This section provides an overview of the respondent universe and study population for each data collection component and describes the procedures for identifying the study population and collecting the data. There are no unusual problems requiring specialized sampling procedures.


B.1.1. Overview of Respondent Universe, Study Population, and Expected Response Rates


This study will collect information through four components in four modes of data collection: (1) an online cross-sectional survey of the universe of organizations that have placed Coverage to Care (C2C) product orders; (2) a cross-sectional survey of consumers selected from the Knowledge Networks panel; (3) semi-structured interviews with staff from a limited set of community organizations as part of a case study; and (4) focus groups of consumers as part of a case study. The case study will be conducted in a community where English is not the preferred language and where C2C materials were ordered in another language (e.g., Spanish, Arabic, Chinese, Haitian Creole, Korean, Russian, or Vietnamese) for consumers.


Administration of the structured online survey to the universe of organizations that have placed C2C product orders is necessitated by the expected high degree of variability in their types (e.g., health providers, navigator programs, faith-based organizations, community-based non-profits), sizes (e.g., number of locations, number of consumers served), and uses of C2C materials (see Section A.1). To ensure that respondent burden is reasonable, the survey will be limited to 20 minutes on average, which includes an estimated 5 minutes to review the purpose of the study and to identify the most appropriate person within the organization to complete the survey. Additional information and justification for surveying the universe of organizations is provided in Section B.1.2.


The consumer survey will be administered through Knowledge Networks. Knowledge Networks offers a nationally representative, probability-weighted panel of individuals aged 18 or older who participate in regular surveys over the Internet. Panel members are provided with technology to respond to surveys to ensure representativeness of the whole U.S. population and not just those with Internet access.1 Individuals who are between the ages of 18 and 64 (inclusive), are not eligible for Medicare, and have incomes below a specified threshold will be eligible to be surveyed. Knowledge Networks collects demographic and other data from its panel members annually; we will leverage these data for this study to minimize respondent burden (see Appendix B). We anticipate that the survey will last 20 minutes.


C2C materials have been translated into eight languages. CMS is interested in learning how these materials are being used and the impact they are having in communities where English is not the primary language. The two remaining data collection efforts include semi-structured interviews with key staff from ordering organizations and consumer focus groups. These two efforts will contribute to a case study that addresses the study’s priority research questions in a community where English is not the primary language. We anticipate the semi-structured interviews to last approximately 45 minutes, and the focus group to last 60 minutes.


In sum, the structured surveys of organizations and consumers, the semi-structured interviews, and the focus group will maximize the usefulness of the information collected. These four data collection efforts will best inform CMS OMH in its efforts to understand how C2C is being disseminated and implemented across the U.S. and the impact it is having for community organizations and consumers. The approaches we describe will help CMS identify strategies for capacity building and programmatic improvement of C2C moving forward. Exhibit B.1.1 summarizes this information for each data collection component.


Exhibit B.1.1 Respondent Universe, Study Population, and Expected Response Rate by Data Collection Component

| Component | Respondent Universe | Study Population | Expected Response Rate |
| Online Survey of Organizations | Organizations placing C2C product orders between June 2014 and the date of study launch | Respondent universe; approximately 3,100 as of June 2016 | 15-20% |
| Online Survey of Consumers | Knowledge Networks panel members who meet study criteria | Stratified sample; Knowledge Networks will invite roughly 4,700 panel members until the target sample of 2,800 is reached | 60% |
| Case Study: Semi-Structured Interviews of Staff | Organizations within the case study community that have placed product orders in a language other than English | Sample of 5-10 key staff within community organizations, per case study* | 70-85% |
| Case Study: Focus Group of Consumers | Consumers residing in the case study community who received C2C materials | Sample of 15-25 consumers per case study | 50-70% |

*We currently anticipate conducting one case study. While the organization sample size falls within the OMB allowance of nine or fewer interviews, we include this data collection element for approval in the event that more than one case study is pursued.


Online Survey of Organizations. The universe of organizations will be identified through the most recent product ordering data collected by CMS. This database provides contact information for the person placing the product order, as well as the name and location of the organization and the types of products ordered. According to our most recent estimates from June 2016, 3,108 distinct organizations have placed product orders, with 300-400 orders placed every quarter (some of these are repeat orders, but each organization will be asked to complete only one survey). The person who placed the product order will be contacted by email, with follow-up contact as needed. As part of the invitation, we will provide a description of the roles and responsibilities of the individual best suited to complete the survey. If the person who placed the order is not best suited to answer the questions, they may respond to the invitation with the name and email address of a suggested alternate contact. With this information, the recommended point of contact will receive an email with an invitation to participate in the study (recruitment scripts can be found in Appendix H). The representative who responds on behalf of the organization will be asked to report information about the ordering organization, how they learned about C2C, the perceived value of the materials, how they disseminate materials to consumers, any lateral interactions with other organizations in their community relevant to C2C or its topics, what additional resources would be useful, and what modifications they would like to see to existing C2C materials. Based on similar cross-sectional online surveys, the response rate for this survey is expected to be in the range of 15 to 20 percent.


Online Survey of Consumers. Respondents will be drawn from the Knowledge Networks KnowledgePanel®, a probability-based panel whose members are randomly selected through random-digit dialing (RDD) or address-based sampling (ABS).2 The sample frame of residential addresses covers approximately 97% of U.S. households. Samples are drawn from among active members using a probability proportional to size (PPS) weighted sampling approach. Individuals may join the panel only after being randomly selected; no one is allowed to “opt in” to the panel. Panel members may complete a maximum of one survey per week, and most complete about two surveys per month. To maximize the efficiency of the survey sample, we will use a stratified random sampling plan based on RAND estimates of “C2C saturation” in ZIP codes across the U.S. Additional information on this stratification and sampling plan is provided in Section B.1.2. Invitations to participate, as well as compensation for participation in the study, will come directly from Knowledge Networks. We require a sample size of 2,800 to achieve adequate power.


Case Study: Semi-structured interviews of key staff from organizations and focus group of consumers. Semi-structured interviews will be conducted with 5-10 key staff of 3-5 organizations that ordered C2C materials and are located within a case study community. Focus groups will be conducted with 8-12 consumers located in the same community. Working in partnership with ordering organization(s) in the selected community, we plan to invite 15-25 consumers and recruit 8-12. We may conduct two smaller focus groups (2 groups of 5 or 6 individuals), if needed, to accommodate schedules. Interviews and focus groups will be conducted by local professional facilitators fluent in the language of the community. Quality assurance will be provided by RAND, which has staff fluent in each of the potential languages for the case study. Once the location and language of the case study have been identified, interview and focus group materials will be translated and back-translated using standard procedures. Materials are provided in English for OMB review.


B.1.2. Statistical Methodology for Stratification and Sample Selection and Degree of Accuracy Needed


Online Survey of Organizations


Statistical methodology for stratification and sample selection

The administration of the structured online survey to the universe of organizations that have placed C2C product orders is necessitated by the expectation of a high degree of variability across these organizations in the type of organization they are (e.g., health provider, navigator program, faith-based organization, non-profit), their size (e.g., materials used in a single location or shipped to multiple locations across a state), and the ways in which they use and disseminate C2C materials to consumers and other organizations within their community (see Section A.1). Given how product orders are placed with CMS, it is not possible to know anything about the organization placing the order aside from its name and location. Thus, there is not sufficient information to establish a sampling frame that would capture important subgroups of organizations (e.g., type of organization, population served). For example, our assessment of organization type based on CMS product ordering data classified 77.2% of the organizations as “Other,” meaning that the name and location alone told us nothing about the organization. Any sampling frame created with the limited information that currently exists is therefore presumed to be insufficient for producing a representative sample of important subgroups. Collecting key characteristics of the organizations themselves will also help us answer a key research question for this study: “How successful was C2C spread and uptake?” Because CMS currently has little information about the types of organizations placing product orders, this information will be a valuable asset for CMS, as it can help inform a sampling frame for future outreach and data collection efforts targeting specific types of organizations or sectors. For these reasons, a sample-based approach would severely limit the usefulness of the data collection effort for CMS OMH. Therefore, the universe of organizations that have placed C2C orders will be asked to participate in the evaluation of C2C.


Degree of accuracy needed

This component of the study is descriptive and will include descriptions of outcomes within specific types of organizations to gain a better understanding of the factors driving the spread and uptake of C2C (research question 1) and what organizations did with the C2C materials and messages (research question 2). As such, we will not be conducting statistical tests of group differences and do not have a required sample size to ensure adequate power. However, we do anticipate conducting subgroup analyses, where possible, to address these same research questions. Our anticipated response rate of 15 to 20% among organizations that have ordered C2C materials and are invited to participate translates into roughly 600 to 800 expected completed surveys, if we assume that the total number of ordering organizations grows by another 50% by the time of the survey’s launch (so that roughly 4,600 organizations comprise the full population, or universe). This number of expected responses should enable us to estimate average outcomes and differences across organizational subgroups (including organization type, urban/rural location, and populations served) with a high level of precision.
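To make the precision claim concrete, the minimal sketch below (Python) computes an approximate 95% margin of error for a proportion estimated from the organization survey, applying a finite population correction. The respondent count, subgroup size, and 50% proportion are illustrative assumptions, not survey results.

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion, with finite population correction."""
    se = math.sqrt(p * (1 - p) / n)        # simple random sampling standard error
    fpc = math.sqrt((N - n) / (N - 1))     # finite population correction
    return z * se * fpc

# Illustrative values only: ~600 completes from a universe of ~4,600 organizations.
print(f"Overall estimate: +/- {100 * margin_of_error(600, 4600):.1f} pp")
# A hypothetical subgroup of 150 respondents, ignoring the correction.
print(f"Subgroup of 150:  +/- {100 * 1.96 * math.sqrt(0.25 / 150):.1f} pp")
```

Under these assumptions, overall estimates would carry a margin of error of roughly plus or minus 4 percentage points, and a moderately sized subgroup roughly plus or minus 8 percentage points. This reflects sampling precision only; potential nonresponse bias at a 15-20% response rate is addressed separately in Section B.2.2.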


Online Survey of Consumers

Statistical methodology for stratification and sample selection

As noted briefly above, KnowledgePanel draws a sample of consumers that is representative of the United States using a PPS weighted approach. A customized stratified random sample, with strata based on a RAND-developed C2C saturation variable at the ZIP code level (using C2C product orders), will be drawn to ensure representation of individuals across regions with a wide range of potential exposures. We may oversample individuals in the high and low saturation strata to ensure that we survey both individuals who are likely to have been exposed to C2C materials and individuals who are not.
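For illustration only, the sketch below (Python) shows one way such a stratified draw with oversampling of the high and low saturation strata could be implemented. The panel data, tercile cut points, and stratum allocations are hypothetical placeholders, not the final sampling specification.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical eligibility file: one row per eligible panel member, with the C2C
# saturation (e.g., product orders per capita) of their residential ZIP code.
panel = pd.DataFrame({
    "member_id": np.arange(10_000),
    "zip_saturation": rng.gamma(shape=1.5, scale=2.0, size=10_000),
})

# Hypothetical strata: terciles of ZIP code saturation (low / medium / high).
panel["stratum"] = pd.qcut(panel["zip_saturation"], q=3, labels=["low", "med", "high"])

# Hypothetical allocation of the 2,800 target completes, oversampling low and high strata.
allocation = {"low": 1_000, "med": 800, "high": 1_000}

sample = pd.concat(
    panel[panel["stratum"] == s].sample(n=n, random_state=1)
    for s, n in allocation.items()
)
print(sample["stratum"].value_counts())
```

In the analysis, design weights proportional to the inverse of each stratum's sampling fraction would be applied so that estimates remain representative of the eligible population despite the oversampling.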


Degree of Accuracy Needed

Our cross-sectional survey of consumers aims to explore how potential exposure to C2C materials and information (as measured by the C2C saturation of the ZIP code of residence) relates to consumer health insurance literacy, knowledge about accessing care, and care-seeking behaviors. To explore these issues, our two primary consumer outcomes of interest will be measures of health insurance literacy and use of preventive care services. To determine the necessary sample size for this survey, we consider a comparison of outcomes between respondents who live in high-C2C saturation areas and respondents who live in low-C2C saturation areas, for example by splitting high and low areas at the median level of ZIP code saturation.


With this framework, we estimated minimum detectable effect sizes for our two key outcomes. Our measure of health insurance literacy is drawn from Paez et al.’s (2014) Health Insurance Literacy Measure (HILM), a 21-item self-administered scale with 4 subscales measuring consumer confidence in choosing and using insurance.3 The HILM subscales have scores that range from 0 to 100, with higher scores corresponding to higher health insurance literacy. In their sample of 828 nationally representative Americans aged 18-64, Paez et al. (2014) reported mean values across all subscales of approximately 50, with a standard deviation of roughly 25. For the outcome of preventive care utilization, there are a variety of possible indicators, including attendance at a preventive visit, receiving a flu shot, a blood pressure reading, various screening tests (e.g., mammogram, Pap test, breast exam, PSA test), and having a usual source of care.4 Anticipated baseline utilization rates range from a low of 28% for having a usual source of care (USC; MEPS 2012)5 to as high as 76% for having a recent mammogram (NHIS).6 Since indicator variables with 0/1 outcomes have maximum variance at 50%, we conservatively estimate the minimum between-group differences detectable with 80% power when the base rate is 50%.


Exhibit B.2.1 below shows our minimum detectable effect sizes for both outcomes if we survey a total of 2,800 persons, or 1,400 each in the high and low C2C exposure groups. Our calculations demonstrate that a planned sample size of 2,800 respondents will provide sufficient power to detect meaningful differences between the high- and low-C2C saturation groups for both primary outcomes of interest. For example, with 1,400 per group, we would have 80% power to detect a 5.35 percentage point difference in utilization, with uptake of 50% among respondents residing in a high-exposure C2C area, based on a two-sided t-test with a 0.05 alpha level, or Type I error rate (see rows 4 and 5 of Exhibit B.2.1). Rows 1 and 2 of Exhibit B.2.1 show that we should be able to detect a mean difference of between 0.10 and 0.11 standard deviations in health insurance literacy. These are generally considered to be very small effect sizes but should still be of practical significance. Sample size calculations based on a dichotomous measure of C2C exposure provide a simple demonstration of detectable differences. In practice, we will develop a continuous measure of C2C saturation that will enable more sophisticated, and potentially more powerful, analyses of the effect of C2C saturation on outcomes, such as dose-response models. Knowledge Networks panel members typically have response rates above 60% for cross-sectional surveys, which implies that we should plan to invite roughly 4,667 panel members, continuing until our desired sample of 2,800 completed responses is reached.


Exhibit B.2.1 Outcomes of Interest and Sample Size Calculations for Online Survey

| Row | Outcome of Interest | Mean | SD | Effect Size | N per arm |
| 1 | HILM | 50 | 25 | 0.10 SD | 1570 |
| 2 | HILM | 50 | 25 | 0.11 SD | 1298 |
| 3 | Utilization (0/1) | 0.50 | n/a | 5 pp | 1605 |
| 4 | Utilization (0/1) | 0.50 | n/a | 5.35 pp | 1404 |
| 5 | Utilization (0/1) | 0.50 | n/a | 6 pp | 1119 |

Notes: All results are calculated for achieving 80% power at the 5% significance level. The N per arm column at right should be multiplied by 2 to obtain the total sample size. For health insurance literacy outcomes we perform simple comparisons of means; for utilization outcomes we presume that all respondents are eligible for preventive care.

SD = standard deviation; pp = percentage points
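The per-arm sample sizes in Exhibit B.2.1 can be reproduced with standard closed-form approximations, as shown in the minimal sketch below (Python). The specific formulas used here (a normal-approximation formula for the comparison of means and a continuity-corrected two-sample proportion formula) are our assumptions for illustration, chosen because they match the tabulated values; they are not part of the approved study materials.

```python
import math
from scipy.stats import norm

Z_A = norm.ppf(0.975)   # two-sided alpha = 0.05
Z_B = norm.ppf(0.80)    # 80% power

def n_continuous(effect_sd):
    """Per-arm n for a two-sample comparison of means, effect expressed in SD units."""
    return math.ceil(2 * ((Z_A + Z_B) / effect_sd) ** 2)

def n_proportions(p1, p2):
    """Per-arm n for a two-sample comparison of proportions,
    normal approximation with a continuity correction."""
    d = abs(p2 - p1)
    p_bar = (p1 + p2) / 2
    n = ((Z_A * math.sqrt(2 * p_bar * (1 - p_bar))
          + Z_B * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) / d) ** 2
    n_cc = n / 4 * (1 + math.sqrt(1 + 4 / (n * d))) ** 2
    return math.ceil(n_cc)

# Rows 1-2: HILM, detectable differences of 0.10 and 0.11 SD.
print(n_continuous(0.10), n_continuous(0.11))          # 1570, 1298

# Rows 3-5: utilization with a 50% base rate, differences of 5, 5.35, and 6 pp.
print(n_proportions(0.50, 0.55),                        # 1605
      n_proportions(0.50, 0.5535),                      # 1404
      n_proportions(0.50, 0.56))                        # 1119

# Invitations implied by a 60% panel response rate for 2,800 completes.
print(math.ceil(2800 / 0.60))                           # 4667
```

The final line shows the arithmetic behind the planned number of invitations: 2,800 completed surveys at a 60% response rate implies roughly 4,667 invitees.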





Case Study: Semi-structured interviews of key staff from organizations


Statistical methodology for stratification and sample selection

The semi-structured interviews are designed to explore, in greater depth, the issues raised in the web-based survey in order to provide more insight into specific topics for communities using C2C materials in languages other than English. To ensure that a range of perspectives is represented, we will undertake purposive sampling of organizations within the identified community if there are more than five. If five or fewer organizations have placed C2C product orders, we will recruit all of them.


If warranted, the general approach to sampling for the semi-structured interviews will be to sample within organization subgroups, where we expect to define the subgroups based on the following characteristics listed in order of priority:

  • Type of organization (e.g., clinic, navigator program, faith-based organization)

  • Size of organization


Since little is known about the organizations placing product orders beyond their name and address, the team will conduct a scan of the organizations that placed non-English product orders in the case study community to identify the characteristics needed for sampling. This may include a review of the organization’s website and other publicly available information. For those organizations selected for an interview, the interviewers will contact the person who completed the online survey by email to schedule the interview (see Appendix H for recruitment materials). Because individuals may have different roles with respect to C2C within their organization (e.g., training of staff, use with consumers), we will interview up to one additional staff member as well; the primary study participant will identify this individual.


Degree of accuracy needed

The qualitative interviews will allow for clarification of ideas and add dimension and context for the quantitative data collected in the online surveys, particularly as they relate to communities where English is not the primary language. Given that these individuals will all be drawn from the same community, we believe that the number of proposed interviews (between five and ten) will be sufficient for theme saturation and convergence.


Case Study: Focus group of consumers


Statistical methodology for stratification and sample selection

We plan to allow for multiple recruitment strategies to ensure we locate sufficient numbers of consumers fluent in the identified language. This may include working with organizations that maintain contact information for the consumers they serve and can obtain permission for us to contact them about the study. We may also use sign-up sheets, in the identified language, at events or during open enrollment, on which consumers can indicate a willingness to be contacted. We plan to compensate community organizations that assist us with focus group recruitment or other logistics, such as providing space or staff to support this effort. More general advertising (e.g., listservs, flyers, social media within relevant organizations) may also be used if the previous methods do not generate enough willing participants within the timeframe of the study.


Degree of accuracy needed

This focus group is designed to be descriptive and exploratory in nature. As such, the anticipated sample size of eight to twelve consumers is sufficient.



B.2. Procedures for Collection of Information


In this section, we describe the data collection procedures for the online surveys and the case study site visit. We also discuss relevant estimation procedures. Since this is a one-time data collection, the use of periodic data collection cycles is not applicable.


B.2.1. Data Collection Procedure


Online Survey of Organizations. For the survey of organizations, the primary mode of data collection will be a web-based survey administered using DatStat, an online data collection platform (the survey instruments are provided in Appendix A). As noted in Section B.1, all individuals who placed product orders will receive an email with an invitation to participate in the survey (see Appendix H for the recruitment materials). The email will include a clear rationale for the study and explain how the individual (or another representative from the organization better suited to respond to questions about C2C) can contribute to the survey effort. The email will also include a letter of support from CMS to further encourage participation (see Appendix I for the proposed letter to be signed by the Director of the CMS Office of Minority Health). Contact information will be provided for those who have questions prior to agreeing to participate. If, within one week of receipt of the invitation, an individual has not responded with an alternate contact, has not initiated the survey, or has initiated but not completed the survey, DatStat’s online system will be programmed to send a follow-up email. A second email will be sent two weeks after receipt of the initial invitation. Follow-up by phone may also be used for organizations that do not respond to the email prompts and will begin three weeks after receipt of the initial invitation, with priority given to programs serving high-priority populations (see Appendix H for the phone recruitment script). A final email will be sent between six and eight weeks after the initial invitation. The survey will remain open for three months.


Respondents using the DatStat interface will be able to begin the survey, save responses, and return to the instrument later if they are not able to complete the survey in one session. Those who are not able to complete the survey using one of the available electronic methods (e.g., Internet browser, personal digital assistant, smartphone) will be offered the opportunity to complete the survey over the telephone with a trained interviewer.


Online Survey of Consumers. Data collection for the online survey of consumers will be conducted by Knowledge Networks using its standard procedures for fielding surveys. Once panel members are recruited, they are invited to complete surveys (see Appendix H for sample KnowledgePanel recruitment materials). Participants are invited randomly within the strata relevant to the study. When a participant is assigned to a study, they are notified by email that a survey is available to them. Each notification contains a password-protected link to the survey that can be used only one time. All the respondent needs is the link; the panel does not require respondents to complete demographic or other background information each time they log in, reducing participant burden. Participants complete their demographic profile once, when they join the panel. Participants are able to complete the survey at their convenience and may refuse any survey they wish. After three days, automatic email reminders are sent to nonresponding panel members in the sample. If email reminders do not produce a sufficient response, an automated telephone call can be initiated; telephone calls only follow reminder emails and occur three to four days after the reminder email. As soon as the survey strata reach quota, the survey is marked as complete. The survey stays open as long as needed to reach the desired sample size; most surveys reach their intended sample size within several days.


Case Studies: Semi-structured interviews of key staff from organizations. The case study will be conducted in person over a three-day period. As with the online survey, organizations selected for the semi-structured interviews will receive an email invitation to participate in the study (see the recruitment email in Appendix H), accompanied by a letter of support for the study from the Director of the CMS Office of Minority Health (Appendix I). The email will explain that a member of the study team will call them to discuss their involvement in the study (participation in a semi-structured interview and support for recruiting the consumer focus group) and to answer any questions the organization may have. All correspondence will be conducted in the preferred language of the organization; RAND has staff fluent in all six of the potential languages for the case study. Once the organization has agreed to participate, RAND staff will work with selected organizations to schedule the interviews with key staff and to discuss recruitment and planning for the consumer focus group. While most of the interviews will be conducted in person, we may conduct a limited number by phone if individuals are not available at the time of the case study. The interviews will be conducted by a local professional facilitator fluent in the preferred language of the community. RAND staff, also fluent in the preferred language, will be present for quality assurance and to ensure smooth execution of the case study data collection efforts. The interview protocol and focus group discussion guide are provided in Appendices C and D. All interviews will be audio-recorded, with the permission of the participants, to ensure that the interviewers adhere to the protocol and standards for qualitative interviewing.


Case Studies: Focus group of consumers. As noted above, we plan to allow for multiple recruitment strategies to ensure we locate sufficient numbers of consumers fluent in the identified language. This may include working with organizations that maintain contact information for the consumers they serve and can obtain permission for us to contact them about the study. We may also use sign-up sheets, in the identified language, at events or during open enrollment, on which consumers can indicate a willingness to be contacted. More general advertising (e.g., listservs, flyers, social media within relevant organizations) may also be used if the previous methods do not generate enough willing participants within the timeframe of the study. Identified consumers will receive a letter or an email invitation to participate in the focus group, depending on their preference and the contact information provided when they expressed their initial interest. The invitation will explain the purpose of the study, include logistical information about the focus group (e.g., time, location), and note that participants will be compensated for their time. We will work with the case study organizations to select an ideal location for the focus group that is easily accessible and familiar to the community (e.g., a community organization, a library). Rather than one larger focus group, we may conduct two smaller groups of 5 or 6 individuals each, if needed, to accommodate schedules and allow for an assessment of similarities across group responses.


Individuals will be asked to complete a brief, anonymous, one-page form as they arrive, which will allow us to characterize the focus group sample (see Appendix D). The focus group will be conducted by a local professional facilitator fluent in the preferred language of the community. RAND staff, also fluent in the preferred language, will be present for quality assurance and to ensure smooth execution of the case study data collection efforts. A consent form will be read to individuals prior to the start of the focus group, and the focus group will be audio-recorded to ensure all relevant points are captured (the consent form can be found in Appendix D). The focus group will last approximately 60 minutes, and individuals will be compensated for their time at an amount determined after site selection, in order to account for regional variation.


B.2.2. Estimation Procedure


As discussed in Section A.16, consumer survey data will be analyzed using standard quantitative analysis methods. Because some nonresponse is expected, we will employ appropriate statistical procedures in our analysis to correct for potential nonresponse bias. These procedures involve reweighting the observed cases to account for nonresponse; the nonresponse weights will account for known characteristics of the missing cases based on information available from Knowledge Networks. For the organization survey, we will compare the C2C product ordering histories of responding and non-responding organizations to explore potential nonresponse bias. We will address our research questions using standard survey data methods, including methods to adjust for nonresponse bias as needed.
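As one concrete illustration of the reweighting step described above, the sketch below (Python) computes weighting-class nonresponse adjustments from characteristics known for all invited panel members. The variable names, weighting classes, and values are hypothetical; the actual adjustment will rely on the information available from Knowledge Networks.

```python
import pandas as pd

# Hypothetical data: one row per invited panel member, with a design (base) weight,
# a characteristic known for respondents and nonrespondents alike, and a response flag.
invited = pd.DataFrame({
    "base_weight": [1.2, 0.8, 1.0, 1.5, 0.9, 1.1],
    "age_group":   ["18-34", "18-34", "35-49", "35-49", "50-64", "50-64"],
    "responded":   [1, 0, 1, 1, 0, 1],
})

# Weighted response rate within each weighting class (here, age group).
totals = (invited.assign(w_resp=invited["base_weight"] * invited["responded"])
                 .groupby("age_group")[["w_resp", "base_weight"]].sum())
resp_rate = totals["w_resp"] / totals["base_weight"]

# Inflate respondents' weights by the inverse of their class response rate.
invited["nr_weight"] = invited["base_weight"] / invited["age_group"].map(resp_rate)
respondents = invited[invited["responded"] == 1]
print(respondents[["age_group", "base_weight", "nr_weight"]])
```

Within each class, respondents' weights are inflated by the inverse of the weighted response rate, so that respondents also represent the nonrespondents who share their observed characteristics.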


Section A.16 also describes the use of qualitative methods to analyze semi-structured interviews and focus group data. As such, the construction of analytic weights is not relevant for that portion of the analysis.


B.3. Methods to Maximize Response Rates and Deal with Nonresponse


We will employ a number of strategies to maximize response for each of the data collection elements. First, we will use clear and easy-to-read materials to explain the study to survey, semi-structured interview, and focus group respondents. The invitation to organizations will begin with an email that provides the motivation for the study and highlights the importance of participation (see Appendix K for recruitment materials). A letter from the CMS Office of Minority Health will accompany the invitation to further encourage participation. These plans are reflected in the recruitment materials provided in Appendix H.


Second, both DatStat (for organization survey) and Knowledge Networks (for consumer survey) enable users to complete the survey using a variety of devices, which will facilitate higher response rates.


Third, case study semi-structured interviews will be conducted in the preferred language of the respondent and offered in person or by phone, strategies that will support higher response rates. Similarly, the focus group will be conducted in the preferred language of the community, and we may break the focus group into two smaller groups, allowing participants to select the date and time that best fits their schedule (e.g., one during the day and one in the evening).


Fourth, participants will be compensated for their time. Individuals participating in the online survey of organizations will receive a $25 gift card. Organizations that participate in the case study will receive compensation to offset space-related, staffing, or other resource costs. Key staff and consumers who participate will receive gift cards for their time. The specific amount of the gift card will be determined after we have selected the location of the case study; this will ensure that the incentive is reasonable relative to the cost of living in that region but not so high as to be considered coercive. Consumers who participate in the online survey through Knowledge Networks will be compensated through that panel. KnowledgePanel incentivizes participants through a point system, in which participants receive points for each survey they complete. Points can be redeemed for cash, merchandise, gift cards, or game entries. Additionally, members may be entered into special sweepstakes with both cash rewards and other prizes.


Despite encouraging participation through these approaches, we do anticipate some nonresponse. Both DatStat and Knowledge Networks assign a unique ID to participants, which can be used to track, in real time, who has responded to the survey. For those who do not initially respond, we will send follow-up emails, followed by a phone call, at regular time points. In reporting our results, we will calculate response rates according to the standards promulgated by the American Association for Public Opinion Research; under these standards, the response rate will be calculated as the ratio of the number of completed cases to the number of eligible cases.
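For concreteness, this calculation reduces to a simple ratio; the brief sketch below uses hypothetical counts only.

```python
def response_rate(completed, eligible):
    """Response rate as completed cases divided by eligible cases, per the definition above."""
    return completed / eligible

# Hypothetical counts for the organization survey: 620 completes out of 3,500 eligible cases.
print(f"{response_rate(620, 3500):.1%}")   # 17.7%
```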




B.4. Test of Procedures or Methods to be Undertaken


As discussed in Section A.12, pre-tests were conducted with three potential respondents to the organization survey and three potential respondents to the semi-structured interviews that will occur as part of the case studies. All respondents were enthusiastic about the planned survey and thought the resulting information would be useful for the CMS Office of Minority Health. Throughout this process, we refined the survey questionnaires and the semi-structured interview protocols. The respondent burden was also estimated from these tests.


The consumer survey is composed largely of existing, validated questions drawn from the literature. As such, no pre-tests were conducted on this survey. Respondent burden was calculated from publicly available information on the time it takes to complete these items.


In addition, DatStat has been collecting online data for 15 years and its online data collection platform has been successfully employed in numerous other studies conducted by RAND and other research organizations. The use of DatStat for internet interviewing reduces respondent burden relative to an in-person interview, while the use of the online software allows for efficient survey programming and administration.


Knowledge Networks is a leading research firm supporting projects for public policy, non-profit, and academic organizations. Knowledge Networks was acquired in 2012 by the GfK Group, which has offices in more than 100 countries around the world. Numerous human subjects review committees have reviewed and approved the panel’s survey procedures (e.g., Cornell University, Harvard University, Princeton University, Rutgers University, and the University of Maryland, among others). Previous studies using the panel have also been reviewed and approved by the U.S. Office of Management and Budget. The panel offers respondents a means of completing the survey with minimal burden, provides participants with a web-enabled computer and free Internet service when necessary to increase accessibility, relies on the principles of voluntariness and informed consent, and does not remove participants from the panel if they fail to complete a particular research project.


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The data for this study is being collected by the RAND Corporation on behalf of the Centers for Medicare & Medicaid Services, Office of Minority Health. With CMS oversight, RAND is responsible for the study design, data collection, analysis, and report preparation. Key input to the statistical aspects of the design was received from the following individuals:


Tom Concannon, Project Director;

Laurie Martin, Co-Project Director;

Kathryn Bouskill, Behavioral Scientist;

Jill Luoto, Economist;

Elizabeth Petrun Sayers, Behavioral Scientist;

Carolyn Rutter, Statistician; and

Vivian Towe, Policy Researcher.

CMS OMH Staff, including Director Cara James, Michelle Oswald, Ashley Peddicord-Austin, and COR Scott Yeager, have overseen the design process.


1 Those participants who do not have access to the Internet are provided a web-enabled computer and free Internet service so that they can also participate as panel members.

2 GfK. (2013) Knowledge Panel® Design Summary. http://www.knowledgenetworks.com/knpanel/docs/knowledgepanel(R)-design-summary-description.pdf

3 Paez, Kathryn, Coretta J. Mallery, HarmoniJoie Noel, Christopher Pugliese, Veronica E. McSorley, Jennifer L. Lucado, and Deepa Ganachari, “Development of the Health Insurance Literacy Measure (HILM): Conceptualizing and Measuring Consumer Ability to Choose and Use Private Health Insurance,” Journal of Health Communication, Vol. 19, Supp. 2, 2014, pp. 225–239.

4 Agency for Healthcare Research and Quality, Medical Expenditure Panel Survey: Preventative Care, database, 2003–2011.

5 Davis, Karen E., “Access to Health Care of Adult Men and Women, Ages 18-64, 2012,” Statistical Brief #461, November 2014.

6 Shi, Leiyu, Lydie A. Lebrun, Jinsheng Zhu, and Jenna Tsai, “Cancer Screening Among Racial/Ethnic and Insurance Groups in the United States: A Comparison of Disparities in 2000 and 2008,” Journal of Health Care for the Poor and Underserved, Vol. 22, No. 3, August 2011, pp. 945–961.

