Memorandum
Date: July 22, 2016
To: Ronald Hill, Calvin Johnson (HUD)
From: Rachel Gaddes, Brittany McGill (Insight Policy Research)
Subject: Pretest Results for ConnectHome Internet Subscribers Telephone Survey
On July 14 and 15, 2016, Insight conducted telephone interviews to pretest the ConnectHome Internet Subscribers Telephone Survey (Phone Survey). This memorandum summarizes the results of the pretest and our recommended changes to the instrument. The accompanying survey instrument includes tracked changes and comments explaining our recommended changes.
Pretest Methods
Four individuals participated in the pretest of the Phone Survey. The pretest participants recruited through the Tampa PHA last winter could not be reached or were not interested in participating; however, we were able to recruit additional participants through internal networking. Two pretest participants were Insight employees who were unfamiliar with the survey design, content, and layout and were therefore able to provide objective, third-party feedback. One participant lived in public housing and was not familiar with ConnectHome; the other had previously lived in public housing, currently receives means-tested public assistance benefits, and was also not familiar with ConnectHome. The participants ranged in age from their early 20s to early 30s; all four were women, and two had children. All four pretests were conducted by phone. In addition, we consulted an in-house survey design expert, Insight’s new Director of Data Collection Services, for further recommendations and guidance on implementing improvements based on pretest participant feedback.
Average Survey Completion Time
The average length of the Phone Survey was 19 minutes. The average time to complete each section of the survey is shown in the table below:

Section | Average time to complete
A: Introductory phone script | 35 seconds
B: Screener | 38 seconds
C: Internet and technology access and use | 2 minutes 11 seconds
D: Education | 3 minutes 26 seconds
E: Employment | 4 minutes 26 seconds
F: Health | 2 minutes 10 seconds
G: Other uses | 2 minutes 24 seconds
H: Technological and internet literacy | 1 minute 11 seconds
I: Demographics | 1 minute 41 seconds
J: Wrap up | 20 seconds
Total | 19 minutes
Results and Recommendations
Based on our analysis of the pretest results, we propose the following recommendations to improve the survey instrument:
Reduce reading level. Throughout the survey, we have suggested wording changes to reduce the reading level and ensure participants can understand the questions.
Clarify language to improve comprehension. For all questions about online activities, we recommend adding language to make clear that the question asks specifically about activities conducted online rather than activities conducted in general. In the question about applying for health insurance online, we recommend removing the phrase “through the federal or state health insurance exchange”; pretest participants did not know what insurance exchanges were, and the shorter wording is simpler and easier to understand.
Add skip patterns. The demographics section included a question about the ages of all children living in the household. We suggest moving this question to the beginning of the survey to allow skip patterns on children’s education questions that are linked to their ages. For example, two of the pretest participants had children who were all under age 5, and they did not feel that the questions about looking for information about college or financial aid applied to their children.
Move adult education and training questions out of the education section and into a separate section following employment. Pretest participants felt that the adult education and training questions related more closely to employment than to education. We therefore recommend moving these questions into their own section immediately following the employment questions.
Balance frequency categories. For frequency response categories throughout the survey, we recommend rebalancing the options so they are less skewed toward the high end. We propose the following new categories: every day, a few times a week, a few times in the past 30 days, and not at all in the past 30 days.
Reinforce household representation. To ensure that participants answer the questions on behalf of every adult in their household, not just themselves, we recommend adding language to emphasize that the questions pertain to all adult members of the household.
Ensure emphasis on children when needed. Some participants answered the college and financial aid questions in the children’s education section on their own behalf. We therefore recommend adding “for the children in your household” to reinforce that these questions ask specifically about activities on behalf of the children in the household, not the respondent or other adults in the household.
Provide opportunity for “other” responses. For the battery about reasons for not using the internet at home, we recommend adding an “other” response option.
Revamp internet skills questions. To increase the variance in responses, we recommend replacing the questions about confidence in general internet and computer skills with a battery of questions about confidence in specific activities related to internet use.
Simplify age question. To prevent confusion, we recommend that the interviewer not read answer choices for the question about participants’ ages and instead simply ask “How old are you?” and enter the response into the appropriate category.
These recommended changes have been incorporated into the accompanying version of the survey.