TITLE OF INFORMATION COLLECTION:
Omnibus Survey of NEA Knowledge and Attitudes
PURPOSE:
As part of an initiative to develop messages for media campaigns that different target audiences will find clear, attractive, interesting, and useful, the purpose of this information collection is to solicit opinions from individuals in the U.S. population regarding their familiarity with, and support for, the NEA and its mission.
The omnibus 4-item survey is formative research in advance of a qualitative and quantitative research project for the NEA. The project goal is to understand what messages will most appeal to a variety of demographic groups, defined by gender, age, education, race/ethnicity, and Census region (see Table 1 below). The omnibus research will identify what the American public knows and understands about the NEA.
This survey is designed to understand perceptions of the NEA among our target audience, with the goal of shaping Agency messaging to better resonate with this audience. The ultimate goal of these messaging changes is to increase the relevance of the Agency and its work to the American people, in order to effectively reach the goals and objectives defined in the NEA mission.
DESCRIPTION OF RESPONDENTS:
The target group for the omnibus questions is a nationally representative sample of American adults aged 18 and older, balanced on gender, age, education, race/ethnicity, and Census region (see Table 1 below). Respondents will be drawn from the Ipsos iSay panel and from individuals intercepted online and asked to participate in the survey. Ipsos is the research firm that the NEA has contracted to conduct this research.
TYPE OF COLLECTION: (Check one)
[ ] Customer Comment Card/Complaint Form [ ] Customer Satisfaction Survey
[ ] Usability Testing (e.g., Website or Software) [ ] Small Discussion Group
[ ] Focus Group [X] Other: NEA awareness survey
CERTIFICATION:
I certify the following to be true:
The collection is voluntary.
The collection is low-burden for respondents and low-cost for the Federal Government.
The collection is non-controversial and does not raise issues of concern to other federal agencies.
The results are not intended to be disseminated to the public.
Information gathered will not be used for the purpose of substantially informing influential policy decisions.
The collection is targeted to the solicitation of opinions from respondents who have experience with the program or may have experience with the program in the future.
Name: Ellen Grantham
To assist review, please provide answers to the following questions:
Personally Identifiable Information:
Is personally identifiable information (PII) collected? [X] Yes [ ] No
In order to conduct demographic analysis, Ipsos routinely collects the following types of personally identifiable information from both iSay Panelists and respondents intercepted on the Internet:
Year and month of birth
Gender
Country of Residence
ZIP Code
Industry of Employment for self and household members
Educational level
Race/ethnicity
Household Income
This information is used only to group responses for demographic analysis. It is not used to identify individuals for later contact or sales targeting.
If Yes, is the information that will be collected included in records that are subject to the Privacy Act of 1974? [ ] Yes [X] No
If Applicable, has a System or Records Notice been published? [ ] Yes [X] No
Gifts or Payments:
Is an incentive (e.g., money or reimbursement of expenses, token of appreciation) provided to participants? [X] Yes [ ] No
Most participants in the study will be members of the Ipsos iSay internet panel. Members of the panel receive points for participating in surveys. Once a member has accumulated 1,000 points, the points can be redeemed for a $10 gift card; participants in the proposed study would receive 15 points, the equivalent of $0.15. Those intercepted online are also awarded points, either toward a game they are playing or as entries in a prize drawing. Using their points, all participants are also invited to play a short game at the end of the survey to win additional prizes, such as getaways, electronics, and housewares (grills and espresso machines; see web link for more details: http://www.i-say.com/Footerlinks/PollPredictorRules/tabid/338/Default.aspx). The award of “points” is a standard and effective procedure for encouraging people to participate in online surveys. This point system has been used in studies for many government agencies, including the FDA, IRS, and Peace Corps.
BURDEN HOURS
Category of Respondent | No. of Respondents | Participation Time | Burden
US resident | 1,000 | 2 minutes | 33.33 hours
Totals | 1,000 | --- | 33.33 hours
FEDERAL COST: The estimated annual cost to the Federal government is $3,325.50. This is the final cost to the government and includes all contractor costs related to the administration and support of the Omnibus survey.
If you are conducting a focus group, survey, or plan to employ statistical methods, please provide answers to the following questions:
The selection of your targeted respondents
Do you have a customer list or something similar that defines the universe of potential respondents and do you have a sampling plan for selecting from this universe? [X] Yes [ ] No
If the answer is yes, please provide a description of both below (or attach the sampling plan). If the answer is no, please describe how you plan to identify your potential group of respondents and how you will select them.
The NEA currently has a contract with Ipsos to conduct market research for the Agency. The contract was awarded on September 30, 2014, and will end January 31, 2015. We will be selecting respondents from the Ipsos iSay panel and through river sampling. The iSay panel is made up of 800,000 individuals across the U.S. aged 18 and older who have volunteered to be members of the panel. They agree to participate in research studies conducted by Ipsos in exchange for points toward gift cards. River sampling is conducted by intercepting respondents while online and asking them to participate in a survey. Video game sites are a common source of such respondents. This form of sampling provides additional respondents, particularly respondents who are less likely to respond to surveys and serve on panels, such as young adult males. River sampling “rounds out” the sample, providing a sample that is more representative of the U.S. than a panel sample alone, which has higher percentages of women and older adults. For all river-sampled respondents, we conduct a full security check and fraud detection, and we record their computer IP address in case we need to passively re-contact them for any reason, such as inappropriate use.
The outgoing sample was structured so that, based upon the anticipated click-through rates [1] of different demographic groups of panelists, the responding population will look (before screening) like the U.S. adult population as a whole. The sample balancing factors include age, gender, education, income, race/ethnicity, census region, and population density.
Ipsos maintains detailed information on the probability of response for each member of its panel, and the sample selections were made so that the responding population would correspond with the U.S. population as a whole. This approach, known as sample balancing, or controlled selection, draws a sample that reflects the demographic distribution of the population being studied, rather than taking a simple random sample of the Ipsos panel (since the panel itself is not constructed to mirror the demographic distribution of the United States, but rather to allow for individual surveys to reflect the U.S. demographic distribution). Variables used in selecting a balanced sample include age, gender, race, ethnicity, education, Census division, and city population density. The sampling algorithm also incorporates the expected click-through rates for the different groups being sampled, so that it is actually the responding sample, and not the sample to be contacted, that should match the target demographic population. In the case of sampling for this project, that target was the population of U.S. adults age 18 or older (so that the screened population would match the National Health Interview Survey [NHIS] targets).
An example of this balancing approach can be found in Table 1 below, as the anticipated click-through rates for respondents are evaluated for five different variables, and the outgoing sample proportions are then calculated so that the anticipated responding sample matches that of the general U.S. population age 18+. Panelists’ probabilities of selection are then adjusted based on which combination of the balancing variables they correspond to, so that a random selection of panelists can be made while ensuring that the sample of panelists contacted corresponds to the balanced outgoing sample targets.
Table 1. Example of Sample Balancing to U.S. General Population Targets
Variable | Category | Anticipated Click-through Rate (%) | Planned Outgoing Sample (%) | Anticipated Responding Sample (%) | 2012 U.S. General Population Age 18+ (%) [2]
Gender | Male | 7 | 54.6 | 48.3 | 48.3
Gender | Female | 9 | 45.4 | 51.7 | 51.7
Age | 18-34 | 5.9 | 43.1 | 30.4 | 30.4
Age | 35-54 | 9 | 33.2 | 35.7 | 35.7
Age | 55+ | 12 | 23.7 | 33.9 | 33.9
Education | HS grad or less | 7.9 | 46.8 | 43.2 | 43.2
Education | Some college or associate's degree | 7.2 | 34 | 28.6 | 28.6
Education | College grad or more | 12.6 | 19.2 | 28.2 | 28.2
Race/Ethnicity | Hispanic | 8.5 | 14.9 | 14.8 | 14.8
Race/Ethnicity | Non-Hispanic White only | 9.1 | 62.5 | 66.5 | 66.5
Race/Ethnicity | Non-Hispanic Black only | 7.2 | 14.1 | 11.5 | 11.5
Race/Ethnicity | Other | 7 | 8.4 | 7.3a | 7.3a
Census Region | Northeast | 9.1 | 16.5 | 18.3 | 18.3
Census Region | Midwest | 9.2 | 19.4 | 21.4 | 21.4
Census Region | South | 7.5 | 40.5 | 37.1 | 37.1
Census Region | West | 8 | 23.6 | 23.2 | 23.2
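The arithmetic behind Table 1 can be sketched as follows: the outgoing share for each group is proportional to its population target divided by its anticipated click-through rate, so that the responding sample (rather than the contacted sample) matches the target. The Python sketch below reproduces the gender panel of Table 1; the function and variable names are illustrative assumptions, not Ipsos's actual selection algorithm.

    def outgoing_shares(target_pct, click_through_pct):
        """Derive outgoing sample shares so that the *responding* sample,
        not the contacted sample, matches the population targets."""
        raw = {g: target_pct[g] / click_through_pct[g] for g in target_pct}
        total = sum(raw.values())
        return {g: round(100 * raw[g] / total, 1) for g in raw}

    # Gender panel of Table 1: 2012 U.S. population targets and anticipated
    # click-through rates, both in percent.
    target = {"Male": 48.3, "Female": 51.7}
    ctr = {"Male": 7.0, "Female": 9.0}
    print(outgoing_shares(target, ctr))  # {'Male': 54.6, 'Female': 45.4}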
In spite of the sample balancing described above, the final sample may not match the target population (U.S. adults aged 18 and older), due to random variation in panelist behavior, differences between the responding sample and the general population, differing propensities to quit or refuse consent across demographic groups, and imperfect panel data on propensity to respond. In that case, non-response weights are needed to adjust the sample data to that of the U.S. adult population. Ipsos has developed weighting targets based upon the weighted NHIS, filtered for respondents who meet the screening characteristics.
Having defined the target population, we are able to generate frequencies of key variables for these data to determine the demographic distribution of the U.S. adult population. These targets are used to develop the sample weights using a rim weighting (raking) method. Sample weights will be calculated to match the weighting targets defined using the NHIS.
Though the mechanics of rim weighting are complicated, the basic idea is straightforward. The process begins with one of the five variables (say, gender) and adjusts the respondent-level weights to bring the dataset into alignment with known targets; in this case, the known targets are the NHIS-based population proportions. The algorithm then moves to a second variable (say, age) and adjusts weights to match the U.S. population proportions in the various age categories. The process proceeds iteratively until a stable set of weights is obtained that fits all five of the demographic variables to within some tolerance.
The approach described here uses five variables – gender, age, education, race/ethnicity, and census region. The results of the weighting will be checked by comparing the weighted sample proportions to the 2012 NHIS targets. The weighting scheme may also be evaluated based upon the variance of the weights, since greater variance in weights (to account for more disproportionate unweighted samples) may reduce the effective sample size too greatly.
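As a minimal sketch of the rim weighting (raking) idea described above, the following Python code iteratively adjusts respondent-level weights toward marginal targets (iterative proportional fitting). The data structures, example categories, and tolerance are illustrative assumptions; this is not Ipsos's actual weighting implementation.

    def rake(records, targets, max_iter=100, tol=1e-6):
        """records: list of dicts, each with demographic fields and a 'weight' key.
        targets: {variable: {category: target_proportion}}, with proportions summing
        to 1 within each variable. Assumes every target category appears in the data."""
        for _ in range(max_iter):
            max_adj = 0.0
            for var, props in targets.items():
                wsum = sum(r["weight"] for r in records)
                current = {c: 0.0 for c in props}
                for r in records:
                    current[r[var]] += r["weight"]
                for r in records:
                    factor = props[r[var]] * wsum / current[r[var]]
                    r["weight"] *= factor
                    max_adj = max(max_adj, abs(factor - 1.0))
            if max_adj < tol:
                break
        return records

    # Illustrative: rake base weights of 1.0 toward hypothetical gender and age
    # targets (in practice, NHIS-based targets for all five variables would be used).
    respondents = [
        {"gender": "Male", "age": "18-34", "weight": 1.0},
        {"gender": "Female", "age": "35-54", "weight": 1.0},
        {"gender": "Female", "age": "55+", "weight": 1.0},
        {"gender": "Male", "age": "55+", "weight": 1.0},
    ]
    targets = {
        "gender": {"Male": 0.483, "Female": 0.517},
        "age": {"18-34": 0.304, "35-54": 0.357, "55+": 0.339},
    }
    rake(respondents, targets)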
Once data are weighted, we will conduct frequencies and crosstabs to explore knowledge of and attitudes toward the NEA. We will conduct t-tests, chi-square tests, and ANOVAs as appropriate to identify differences between demographic groups on each question. For example, we may examine the extent to which men and women differ in their familiarity with the NEA, or the extent to which those of differing ethnicities and races vary in their perceptions of the importance of the NEA's mission.
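A minimal sketch of the kind of weighted crosstab described above is shown below, using hypothetical column names ("gender", "familiar_with_nea") and the rim weights from the previous step; this is illustrative only and not the actual analysis code or dataset.

    import pandas as pd

    # Hypothetical respondent-level data; column names and values are assumptions.
    df = pd.DataFrame({
        "gender": ["Male", "Male", "Female", "Female", "Female"],
        "familiar_with_nea": ["Yes", "No", "Yes", "Yes", "No"],
        "weight": [1.1, 0.9, 1.0, 0.8, 1.2],
    })

    # Weighted cell totals (sums of rim weights), then row percentages by gender.
    tab = pd.crosstab(df["gender"], df["familiar_with_nea"],
                      values=df["weight"], aggfunc="sum").fillna(0)
    row_pct = tab.div(tab.sum(axis=1), axis=0)
    print(row_pct.round(3))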
Administration of the Instrument
How will you collect the information? (Check all that apply)
[X] Web-based or other forms of Social Media
[ ] Telephone
[ ] In-person
[ ] Other, Explain
Will interviewers or facilitators be used? [ ] Yes [X] No
Please make sure that all instruments, instructions, and scripts are submitted with the request.
TITLE OF INFORMATION COLLECTION: Provide the name of the collection that is the subject of the request. (e.g. Comment card for soliciting feedback on xxxx)
PURPOSE: Provide a brief description of the purpose of this collection and how it will be used. If this is part of a larger study or effort, please include this in your explanation.
DESCRIPTION OF RESPONDENTS: Provide a brief description of the targeted group or groups for this collection of information. These groups must have experience with the program.
TYPE OF COLLECTION: Check one box. If you are requesting approval of other instruments under the generic, you must complete a form for each instrument.
CERTIFICATION: Please read the certification carefully. If you incorrectly certify, the collection will be returned as improperly submitted or it will be disapproved.
Personally Identifiable Information: Provide answers to the questions.
Gifts or Payments: If you answer yes to the question, please describe the incentive and provide a justification for the amount.
BURDEN HOURS:
Category of Respondents: Identify who you expect the respondents to be in terms of the following categories: (1) Individuals or Households; (2) Private Sector; (3) State, local, or tribal governments; or (4) Federal Government. Only one type of respondent can be selected.
No. of Respondents: Provide an estimate of the number of respondents.
Participation Time: Provide an estimate of the amount of time required for a respondent to participate (e.g., fill out a survey or participate in a focus group).
Burden: Provide the annual burden hours: multiply the number of responses by the participation time (in minutes) and divide by 60.
FEDERAL COST: Provide an estimate of the annual cost to the Federal government.
If you are conducting a focus group, survey, or plan to employ statistical methods, please provide answers to the following questions:
The selection of your targeted respondents. Please provide a description of how you plan to identify your potential group of respondents and how you will select them. If the answer to the first question is yes, you may provide the sampling plan in an attachment.
Administration of the Instrument: Identify how the information will be collected. More than one box may be checked. Indicate whether there will be interviewers (e.g. for surveys) or facilitators (e.g., for focus groups) used.
Please make sure that all instruments, instructions, and scripts are submitted with the request.
[1] The click-through rate refers to the proportion of invited participants who follow the survey link provided in the invitation e-mail.
[2] U.S. General Population demographic statistics taken from the U.S. Current Population Survey, March 2012 supplement, accessed through the U.S. Census Bureau's DataFerrett utility.