
The Study of Free Access to Computers and the Internet in Public Libraries

Cognitive Interview Results

OMB: 3137-0078


MEMORANDUM


Date: February 8, 2009

To: Mike Crandall

Karen Fisher

From: Samantha Becker

Re: Telephone survey pretest #1

The following is a brief summary of the status of telephone survey pretesting activities. The first section reviews the procedures used to conduct the first pretest; the second section contains observations about general trouble areas within the survey and offers possible solutions. Notes in italics indicate the action ultimately taken. Once further analysis of the pretesting results is complete, individual questions will be reviewed, the survey will be revised, and it will then be retested in field conditions using behavior coding.

Overall, the testing revealed no major problems with the survey instrument. Respondents were able to successfully differentiate between PAC (public access computers) and the OPAC (the library's online catalog), and the domain screening questions were successful in funneling respondents to the relevant survey segments (i.e., no one was funneled to a domain for which they did not use PAC, and no one skipped a domain for which they did use PAC). Remaining issues, if any, will be better uncovered in field conditions during the second pretest.
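For illustration, the funneling design can be pictured as simple branching logic: each domain's screening question gates entry to that domain's detailed questions, so respondents answer only the segments that apply to them. The minimal Python sketch below shows that intended branching under assumed domain names and wording; only eBUSINESS, EDUCATION, and EMPLOYMENT are named in this memo, and the actual instrument is an interviewer-read script, not software.

    # Minimal sketch of the domain-screening funnel. Domain names and
    # question wording are placeholders, not the instrument's actual text.
    DOMAINS = ["EDUCATION", "EMPLOYMENT", "eBUSINESS"]  # 3 of the 8 domains

    def screens_in(prompt: str) -> bool:
        """Yes/no screener; 'y' means the respondent used PAC for this domain."""
        return input(prompt + " (y/n): ").strip().lower().startswith("y")

    def run_survey() -> None:
        for domain in DOMAINS:
            if screens_in("In the past 12 months, have you used library "
                          "computers or Internet connections for " + domain + "?"):
                # Detailed questions for this domain would be read here;
                # domains the respondent screened out of are skipped entirely.
                print("[detailed " + domain + " questions]")

    if __name__ == "__main__":
        run_survey()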

Procedures

Pretesting for the IMLS IMPACT project telephone survey was conducted at The Seattle Public Library on February 5, 2009. Four interviewers (2 male and 2 female) recruited nine subjects as they entered the Central Branch library between 10:00 a.m. and 4:00 p.m. Subjects ranged in age from 25 to 57 years old; 7 were male. Interviewers were provided with a telephone survey script and a cognitive debriefing script and were asked to practice before the testing session.

Subjects were screened for public access computer use prior to recruitment and were offered $20 for participation once their user status was ascertained. Interviews were conducted in meeting rooms inside the library and ranged in length from 21 minutes to 88 minutes, during which interviewers read the telephone survey and then followed up with debriefing questions.

All 8 domains were tested, with respondents answering questions related to an average of 5.3 domains. Each domain was tested an average of 5.875 times, with eBUSINESS receiving the fewest tests (1) and EDUCATION and EMPLOYMENT receiving the greatest number (8).

Interviews were recorded, and the tapes were used to complete a matrix of responses to the actual survey questions, as well as answers to the debriefing questions. The results of this analysis will be used to refine the wording and flow of specific questions in the survey instrument. General observations from the respondents and interviewers, and the subsequent actions taken, are discussed below. In addition to these actions, the household poverty tables were updated with 2008 figures, and the "last birthday" randomization procedure (selecting the eligible household member whose birthday occurred most recently) was added to the qualifying questions.
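For context, the "last birthday" procedure approximates random within-household respondent selection without enumerating the household. In practice it is a single scripted question, but the selection rule itself can be sketched as follows; the names and dates below are made up for illustration.

    from datetime import date

    # Hypothetical sketch of the "last birthday" selection rule: choose the
    # household member whose most recent birthday is closest to today.
    # (Leap-day birthdays are ignored for simplicity.)

    def days_since_last_birthday(birthday: date, today: date) -> int:
        this_year = birthday.replace(year=today.year)
        last = this_year if this_year <= today else birthday.replace(year=today.year - 1)
        return (today - last).days

    def select_respondent(birthdays: dict, today: date) -> str:
        return min(birthdays, key=lambda name: days_since_last_birthday(birthdays[name], today))

    household = {"Member A": date(1970, 3, 14),
                 "Member B": date(1985, 11, 2),
                 "Member C": date(1992, 7, 30)}
    print(select_respondent(household, today=date(2009, 2, 8)))  # -> "Member B"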

General observations

Time frame. The survey is meant to capture types of use during the previous 12-month period. All four interviewers used multiple terms to describe the time frame, including "in the past year" and "in the past 12 months." All interviewers expanded the time frame by inserting the word "ever" at multiple points during the interview, as in "have you ever used the public library computers or Internet connections for…" On occasion, a question was phrased "In the past year, have you ever…" This seemed to be an unintentional natural-language habit. Some respondents also reported referencing different time frames and, in one case, the past year may have been interpreted as starting on January 1st. Also, the distinction between "every day" and "most days, but not every day" was lost on the interviewers, with some interpreting Monday-through-Friday activities as "every day" while others considered this "most days."

Possible solution: Rework all questions to favor "the previous 12 months" and remind interviewers to avoid using the term "ever." This should be carefully observed during the second pretest. Merge "every day" and "most days" into "5-7 times a week."

Reminders of the time frame were inserted throughout the survey. The placement and exact wording of the statement vary from question to question in order to reduce participant fatigue. For this reason, "in the past year" is used in some cases; however, it is defined as the past 12 months in the introductory statements, and most cognitive interview respondents thought of "past year" as the 12 months preceding the interview. The every day/most days categories were collapsed. TCI interviewers will be told to monitor their habitual use of the word "ever."


Structural orientation. Two of the interviewers added explanatory statements to the script to orient the respondent to the structure of the survey. Respondents in this group may have had an easier time following the survey.

Possible solution: Add orienting statements at transitions between sections and at the transition to detailed questions after domain screening questions.

Orienting statements were added at the beginning of the domain specific and demographic sections.

Loss of library focus. Most respondents at some point seemed to get confused about the subject of the survey and may have referenced activities done using other computers.

Possible solution: Add additional orienting statements. Explain at the beginning of the domain questions that we only want things they had done on the library computers or Internet connections. Need to be careful about too much repetition.

A statement was added at the beginning of the domain questions reminding respondents that all answers should reference activities done using library computers or Internet connections.

Length of survey. The time to actually administer the survey ranged from a low of 13 minutes to a high of 48 minutes and averaged 22 minutes. Cell phone users may not be willing to spend so much time on the phone.

Possible solution: Reduce introductory script; move contact information (including website address) to end and offer to read it; reduce words in answer options in demographic section.

Reworded introductory script. Dropped incentive. Compressed demographic categories.

Household/family. Some respondents included unmarried partners in the household count. Also, U5 (children under 18 in household) came before the housing type question (Z3); one homeless respondent was confused by this question.


Possible solution: Review all questions that use the words household or family to ensure consistency of meaning; screen for homelessness earlier.


All references to family and household were reviewed for internal consistency. The meaning of household was introduced in the demographic section. There was no reasonable skip-logic solution that would eliminate all household questions for homeless respondents; however, we do not expect to reach significant numbers of homeless respondents through the telephone survey.


Open-ended questions. Interviewers prompted respondents at open-ended questions with examples from the script.


Possible solution: Provide prompting scripts and codes for answers. Instruct TCI on how open-ended questions should be handled. Add open-ended question about missing use-types at end of every domain.


Added some selections/prompts. More may be added following the second pretest.


Importance. Most of the respondents reported they felt the computers were very important; however, on probing it was clear that most framed this question as a combination of personal importance and societal importance, with any difference accounted for by the presence or absence of alternative access points.


Possible solution: Split question to ask about importance to self and others.


Split these questions. Added a question about others' use, asked of all respondents (users and nonusers).


Education. Respondents were not consistent in how they regarded education. At times, education was considered only institutional learning or learning associated with work; at other times, independent learning for enjoyment or edification was included. The examples given at the domain screening level emphasize formal learning.


Possible solution: Clarify which type of education we are interested in at each section. Include informal learning at domain screening level. Add a question specific to informal learning.


Added "learn about hobbies" to social inclusion. Left education as institutional.


