Memo

SIS Logo Tagline Testing Submission Statement 9.30.15.docx

Generic Clearance for Data User and Customer Evaluation Surveys

OMB: 0607-0760


The Census Bureau plans to conduct additional survey work under the generic clearance for Data User Evaluation Surveys (OMB number 0607-0760).


Statistics in Schools (SIS) is a program of the U.S. Census Bureau designed to provide classroom activities and resources to enhance learning and teaching about data and statistics. The SIS website helps teachers engage students in math and social studies by utilizing real-life Census Bureau data.


The Census Bureau is preparing to launch an updated version of the Statistics in Schools program based on previously conducted online survey research among teachers. That research, conducted in late 2014, solicited feedback on SIS classroom activities. The contractor is currently working with a team of subject matter experts to revise the existing SIS activities and add new activities for several of the subjects, including two new subjects: high school geography and sociology. These efforts are scheduled to be completed by March 2016.


The Census Bureau’s SIS staff is also planning an online interactive survey to test potential logos and taglines among teachers. As part of the content revision process, the Census Bureau will test potential logo and tagline designs to inform branding decisions about the updated SIS website and program. These designs are intended to make the SIS program more appealing, relevant, and trustworthy to classroom teachers. While the SIS program will primarily reach teachers and their students through the website, there may also be printed materials such as handouts and pamphlets that use the revised SIS logo and branding.


In the survey, each teacher will review four potential logos as well as three potential taglines. The survey will use both open- and closed-ended questions, allowing teachers to provide qualitative feedback in addition to concrete, measurable quantitative answers.


Materials for testing:

The online survey will examine four potential logos and three potential taglines. The logos and taglines are specifically designed to be interchangeable; any of the given logos can potentially be paired with any of the taglines. As a result, we will test each set of branding items separately.


Logos for Testing (4)

[Four logo images not reproduced in this version]

Taglines for Testing (3)

  1. Bring Learning to Life with Census Data
  2. Census Data. Classroom Learning. Teacher Resources.
  3. Classroom Learning Powered by Census Data

Sample design:

In November 2015, we will conduct online interviews with n=260 teachers nationwide who teach at varying grade levels (elementary, middle, and high school) and in varying school settings (public, private, parochial). By definition, all will teach either mathematics or social studies (including history, government/civics, geography, sociology, etc.).


All teachers will see all four potential logos during the survey. The analysis will place a particular focus on the initial logo that each participant reviews, since each teacher’s first impression will not be influenced by their opinions of the other logos. So that adequate numbers of teachers see each logo first, we will use a quota system to ensure that enough teachers from each of the disciplines and levels of interest are included in the final sample.


Grade Level     Subject             # of 1st observations   Total (n-size)
Elementary      General classroom   13 for each logo        52
Middle School   Social Studies      13 for each logo        52
Middle School   Math                13 for each logo        52
High School     Social Studies      13 for each logo        52
High School     Math                13 for each logo        52

Total interviews = 260
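The quota design above can be sketched as a simple tally. This is a hypothetical illustration only; the cell and logo names are placeholders, since the memo does not name the actual logo designs:

```python
# Hypothetical sketch of the quota design described in the table above.
# Cell and logo names are placeholders, not the actual designs.
cells = [
    ("Elementary", "General classroom"),
    ("Middle School", "Social Studies"),
    ("Middle School", "Math"),
    ("High School", "Social Studies"),
    ("High School", "Math"),
]
logos = ["Logo 1", "Logo 2", "Logo 3", "Logo 4"]
FIRST_OBS_PER_LOGO = 13  # first-position viewings needed per logo, per cell

# Quota of first observations for every (cell, logo) combination
quota = {(cell, logo): FIRST_OBS_PER_LOGO for cell in cells for logo in logos}

# Each cell totals 4 logos x 13 first observations = 52 interviews
cell_totals = {cell: sum(n for (c, _), n in quota.items() if c == cell)
               for cell in cells}

# Five cells x 52 interviews = 260 total interviews
total_interviews = sum(quota.values())
```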


For each logo, the number of teachers who will review that logo first is broadly similar to the number of participants we would have if we conducted one or two focus groups. However, this online methodology is more effective than in-person focus groups, since 1) the sample of teachers is not limited to a particular geographic area, 2) they will be able to go through the questions at their own pace at a convenient time, and 3) the online survey mitigates other potential issues that emerge from focus group dynamics. In addition, as the survey environment is anonymous, teachers may provide more candid feedback than in a focus group setting.


The findings will be indicative of a broad range of teachers, but will not produce statistically significant, externally valid measures representative of all teachers. Data analysis will not use inferential statistical testing procedures.


Teachers will be recruited for the online survey using Internet survey sample panels that are designed to target specific types of professionals (including teachers) in the United States. The panel vendor currently has over 2.1 million active panelists, including approximately 60,000 identified as teachers. Panelists are recruited using an invitation-only method by partnering with a range of brands and websites to offer invitations to users and participants in corporate loyalty programs. This convenience sample is appropriate for the analysis purposes and number of teachers involved. The contractor’s previous experience conducting quantitative research projects with teachers—including for the SIS program in 2014—indicates that sample panels are the most efficient way to recruit significant numbers of educators from across the country. Internet panel participants receive points for completing a study, which they can then exchange for vouchers and gift cards from a partner network.


Survey Experience for Teachers:

Teachers will initially be contacted by the sample vendor via email. The message will be in a format similar to other Internet-based surveys in which the teachers have participated:


Subject Line: Get Rewarded for Your Time - Study about Education

Email Body: Dear <First Name>,

Based on your e-Rewards(R) profile, you are invited to earn e-Rewards Currency for participating in a research survey. If you qualify and complete the survey:

Full reward amount: <XXX> in e-Rewards Currency

Full survey length: approximately 30 minutes

To complete the survey and earn e-Rewards Currency, simply click the link below, or copy the URL into your browser: < Survey URL >

We encourage you to respond quickly -- this e-Rewards invitation will be available only until a predetermined number of responses have been received. Please Note: you will only receive e-Rewards credit for taking the survey once.

Continue to check your inbox and your Member home page for future opportunities to earn e-Rewards Currency.

Sincerely,

The e-Rewards Team


The survey instrument is divided into five sections as follows:

  1. Screening Questions

  2. Introduction / Warm-ups

  3. Logo Testing (x4, randomized order)

  4. Tagline Testing (x3, randomized order)

  5. Closing Questions / Demographics


We will use a series of screening questions to ensure that we interview the appropriate types of classroom teachers. This will also help ensure that we have a broad and diverse group of teachers that captures a variety of student grade levels and subjects. Teachers will be asked several questions about their experiences as a classroom teacher, largely for warm-up purposes.


For each of the four logos, teachers will provide open- and closed-ended feedback, including what they like best, what they dislike, and whether the material is subject- and age-appropriate. The order in which teachers see the individual logos will be randomized to mitigate order bias. To reduce respondent fatigue, teachers will see a somewhat abbreviated set of questions for the second, third, and fourth logos they review. After assessing all four individually, teachers will then assess the four logos together as a set.
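Per-respondent randomization that also balances which logo appears first (as the quota design requires) could be handled along these lines. This is a minimal sketch under stated assumptions; the memo does not specify how the survey software implements it, and the logo names are placeholders:

```python
import random

# Hypothetical sketch: randomize the logo order for each respondent while
# keeping first-position exposures balanced across the four logos.
logos = ["Logo 1", "Logo 2", "Logo 3", "Logo 4"]  # placeholder names
first_counts = {logo: 0 for logo in logos}        # first-position tally

def assign_order(rng: random.Random) -> list:
    # Choose the logo with the fewest first-position showings so far,
    # then shuffle the remaining three behind it.
    first = min(logos, key=lambda l: first_counts[l])
    first_counts[first] += 1
    rest = [l for l in logos if l != first]
    rng.shuffle(rest)
    return [first] + rest

rng = random.Random(0)
orders = [assign_order(rng) for _ in range(52)]
# Across 52 respondents in a cell, each logo appears first 13 times.
```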


The three taglines will be tested using a similar process, with teachers reviewing each of the taglines separately, and then providing an overall comparative ranking of the three taglines.


Finally, we will ask demographic questions that will enable us to compare teachers by various means, including classroom-specific demographics.


The Census Bureau and contractor team will conduct pre-testing to address potential issues with the questionnaire/online survey before it is launched. The pre-testing process has three objectives: 1) ensure that the question logic is functioning as intended for the various levels and types of teachers that will take the survey, 2) confirm that the question wording is clear and that teachers will be able to understand and properly respond to the questions, and 3) verify that the survey is compatible with a variety of Internet browsers and devices. This process will identify potential “red flags” that might affect the validity of the survey instrument and allow the research team the opportunity to improve the survey before fielding.

There are four phases to the pre-testing process:


Questionnaire drafting: The draft survey instrument has been reviewed by the contractor team (including a former teacher) as well as the Census Bureau’s SIS team.


Scripting and programming: In November, while the OMB package is being reviewed, the contractor’s in-house survey developers will program the online survey. Team Reingold member Penn Schoen Berland’s Internet Survey Group (ISG) was an early pioneer in online survey research and has conducted over six million interviews online. Team Reingold will also test the survey components using a variety of devices and browsers to identify and address any compatibility issues.


Pre-testing: After the survey is programmed, the survey instrument will be reviewed by a three-person contractor team, which will include a former teacher and one person who will not have been previously involved with the project. In addition, the Census Bureau staff will be provided with a link to review the survey tool and provide input. Any recommended changes will be documented and OMB will be notified prior to fielding.


Soft-launch and monitoring: The survey will initially be fielded with a small number of interviews (less than 10% of the total). The contractor will review these cases to check for any issues with the initial results before releasing the study to the full sample. During fielding, the contractor will systematically monitor data collection to identify ways to reduce burden, streamline processing, and assure data quality.


We estimate respondent burden at 25 minutes per survey; with 260 interviews, this yields approximately 109 hours of respondent burden. In addition, we estimate that 1,500 individuals will answer the screening questions, at approximately 1 minute per screener, for an additional 25 hours of respondent burden. The total burden for the study will be 134 burden-hours. Possible reasons participants may not qualify for the survey include: not being a classroom teacher (e.g., school principals, college professors, or education consultants); not teaching the subjects of interest (i.e., not a math or social studies teacher); or the desired quota for interviews in a particular subject and/or grade level having already been reached.
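The burden arithmetic above can be verified directly, rounding the survey hours up as the estimate does:

```python
import math

# Survey burden: 260 interviews at 25 minutes each
survey_hours = math.ceil(260 * 25 / 60)       # 6,500 min -> 109 hours (rounded up)

# Screener burden: 1,500 screeners at ~1 minute each
screener_hours = 1_500 * 1 // 60              # 1,500 min -> 25 hours

total_burden = survey_hours + screener_hours  # 134 burden-hours
```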


Project Timeline:

Nov. 2    Census Bureau submits OMB submission; contractor programs survey
Nov. 10   Team Reingold shares pre-testing survey with SIS team
Nov. 16   OMB approval of research
Nov. 17   Team Reingold begins interviews
Nov. 27   Team Reingold completes interviews
Dec. 18   Team Reingold submits draft research findings
Jan. 4    Census provides input on draft research findings
Jan. 11   Team Reingold provides final research report
Jan. 18   Team Reingold presents research findings

For further information, contact Monica Vines at 301.763.8813 or [email protected].


Attachment A: Statistics in Schools Logo and Tagline Questionnaire
