Revised January 26, 2010
New collection entitled: Schools and Libraries Universal Service Support Program (“E-rate”) Broadband Survey
B. Collections of Information Employing Statistical Methods:
The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When item 17 on the Form OMB 83-I is checked, “Yes,” the following documentation should be included in the Supporting Statement to the extent that it applies to the methods proposed:
1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
There are approximately 25,000 applicants under the E-rate program. This survey will sample 5,000 of those E-rate applicants. Specifically, the survey will focus solely on E-rate applicants, not all schools and libraries across the United States. The information from this survey will help determine how best to address the educational and technological needs of schools and libraries under the E-rate program as part of the Broadband Plan, and will help the Commission make future policy decisions for the E-rate program.
We anticipate an initial response rate of 50 percent to the email survey, for a total of approximately 2,500 initial responses from the sample of 5,000 E-rate applicants. This response rate will be the result of one follow-up reminder email. We anticipate that subsequent follow-up, including a second and third email reminder and two telephone reminder calls, will boost the overall response rate to over 70 percent (for a total of 3,500 responses). This is the first time we have conducted this survey; however, we believe that, due to the upfront efforts that will be made to alert E-rate applicants to the upcoming survey, we will receive at least an initial 50 percent response rate. Additionally, E-rate applicants will have just completed their funding year 2010 applications and, thus, will generally either have the necessary information to complete the survey close at hand or will know it without needing to refer to their files. We estimate that the overall response rate will be over 70 percent based on GAO’s recent survey concerning the E-rate program, which received a response rate of 78 percent. See http://www.gao.gov/products/GAO-09-254SP. GAO’s sample was drawn from about 31,000 applications, and based on this sample, GAO sent questionnaires to a total of 697 individuals. In this instance, we are planning to send the survey to 5,000 applicants and anticipate that, with the follow-up reminders over a six-week period, as detailed below, the response rate will be boosted to over 70 percent, similar to the response rate for GAO’s survey.
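The expected response counts follow directly from the rate assumptions above. A minimal sketch of that arithmetic (the sample size and rates are taken from the text; nothing here is part of the actual survey tooling):

```python
# Response-count arithmetic from the rate assumptions stated above.
sample_size = 5_000

initial_rate = 0.50  # expected after the first reminder email
final_rate = 0.70    # expected after all email and phone follow-ups

initial_responses = int(sample_size * initial_rate)  # 2,500
final_responses = int(sample_size * final_rate)      # 3,500

print(f"Initial responses: {initial_responses:,}")
print(f"Responses after full follow-up: {final_responses:,}")
```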
The sample of 5,000 applicants will be selected from the contact listing provided by the Universal Service Administrative Company (USAC), the Administrator of the E-rate program, which contains about 23,000 E-rate applicants and includes both schools and libraries. From USAC’s contact listing, Harris Interactive, Inc. (Harris), an independent national marketing research firm contracted by the FCC to prepare and conduct the survey, will compile a representative sample of the E-rate population, stratified by rural and urban location and by applicant type (school, library, and consortium). The survey will be sent to those individuals, such as the school superintendent or library director, IT director or coordinator, school principal, etc., who are responsible for the E-rate program for their school, school district, or library and are designated as the authorized person to complete the required FCC Forms for the E-rate program. The contact listing provided to us by USAC includes the name, title, address, phone number, fax number, and email address of each individual respondent. The contact listing is very accurate because applicants use these contacts to obtain benefits from the E-rate program and therefore have an incentive to ensure that the contacts listed are current and correct. The survey is being hosted by Harris, which will provide a link to access the survey via email to each individual respondent.
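To illustrate how such a representative sample might be drawn from the USAC contact listing, the following is a hypothetical sketch using pandas. The file name and the "applicant_type" and "locale" columns are assumptions for illustration, not the actual USAC schema or Harris’s sampling code:

```python
import pandas as pd

# Hypothetical extract of the USAC contact listing (~23,000 applicants).
contacts = pd.read_csv("usac_contact_listing.csv")

TARGET = 5_000
frac = TARGET / len(contacts)

# Sample proportionally within each applicant-type / locale stratum so the
# 5,000-applicant sample mirrors the overall E-rate population.
sample = (
    contacts
    .groupby(["applicant_type", "locale"], group_keys=False)
    .apply(lambda stratum: stratum.sample(frac=frac, random_state=42))
)

print(f"{len(sample):,} applicants selected")
```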
The estimated timeline for fielding is as follows:
Day 1 – Send out survey invitations to randomly selected E-rate applicants (potential survey participants).
Day 8 – Send out first reminder email to all potential survey participants who have not completed the survey to date.
Days 9-12 – Telephone reminder calls to all potential survey participants who have not completed the survey to date.
Day 19 – Send out second reminder email to all potential survey participants who have not completed the survey to date.
Days 22-25 – Second telephone reminder call to all potential survey participants who have not completed the survey to date.
Day 32 – Send out third and final reminder email to all potential survey participants who have not completed the survey to date.
Day 39 – Close field on E-rate survey.
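For planning purposes, the day offsets above can be converted to calendar dates. A small sketch, assuming a hypothetical Day 1 launch date (the offsets mirror the timeline above; the date itself is illustrative):

```python
from datetime import date, timedelta

launch = date(2010, 3, 1)  # hypothetical launch date

# Day offsets taken from the fielding timeline above.
milestones = {
    1: "Survey invitations sent",
    8: "First reminder email",
    9: "Telephone reminder calls begin (Days 9-12)",
    19: "Second reminder email",
    22: "Second telephone reminder calls begin (Days 22-25)",
    32: "Third and final reminder email",
    39: "Field closes",
}

for day, event in milestones.items():
    print(f"Day {day:2d} ({launch + timedelta(days=day - 1)}): {event}")
```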
Harris will note in the survey invitation that responses will not be released beyond the study team and that findings will be reported only in aggregate. Specifically, Harris will note that the E-rate applicant’s name and their school’s, school district’s, or library’s information will not appear in any written reports and will not be associated with any response or comment they choose to make about the E-rate program. Harris will also let applicants who receive the survey know that we obtained their email addresses from USAC for the sole purpose of this study and that they can unsubscribe and receive no further mailings. Harris will also provide information on how to comment on or express any concerns about the survey.

To help ensure an initial 50 percent response rate, E-rate applicants will be given notice of the upcoming survey, prior to distribution, via a weekly newsletter sent out by USAC, and Harris will send an initial invitation to participate in the survey and a minimum of two follow-up emails during the course of survey fielding to individuals who have not completed the survey. In addition to the follow-up emails, Harris will place two short follow-up phone calls to applicants who have received but not yet completed the survey, urging them to complete it. We believe these follow-up calls will be successful in improving the response rate to over 70 percent because we have phone contact information for over 97 percent of the E-rate applicants on the USAC list.
To address non-response bias should we not achieve the 80 percent response rate required by OMB’s standards, Harris will, as a first method of analysis, compare the survey respondents to the overall population (that is, individuals in key demographic groups, such as urban/rural locations, schools/libraries, etc.). Specifically, Harris will review the demographic data in the USAC list and other known reported data, such as NCES data and other data used by GAO, identify respondent categories that are over- and under-represented, and statistically weight the data accordingly.
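A minimal sketch of this kind of post-stratification weighting: each demographic cell’s weight is the ratio of its share in the known population data (USAC/NCES) to its share among respondents. The cell labels and proportions below are illustrative assumptions, not actual population figures:

```python
# Illustrative population and respondent shares by applicant type and locale.
population_share = {
    ("school", "urban"): 0.45,
    ("school", "rural"): 0.40,
    ("library", "urban"): 0.08,
    ("library", "rural"): 0.07,
}
respondent_share = {
    ("school", "urban"): 0.50,
    ("school", "rural"): 0.35,
    ("library", "urban"): 0.09,
    ("library", "rural"): 0.06,
}

# Weight = population share / respondent share; values above 1 up-weight
# under-represented cells, values below 1 down-weight over-represented ones.
weights = {
    cell: population_share[cell] / respondent_share[cell]
    for cell in population_share
}

for cell, w in weights.items():
    print(cell, round(w, 3))
```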
In addition to comparing the survey respondents to the known reported data, Harris will also undertake a telephone survey of non-respondents, defined as applicants who did not respond to the online survey. Harris will conduct this survey by phone in order to use a different methodology from the original survey, and will use a shorter survey instrument designed to collect answers to key questions and demographic information about the non-respondents. The results of this survey will be compared to the online survey results, respondent categories that are over- and under-represented will be identified, and the data will be statistically weighted accordingly.
The non-respondents will be selected randomly from the initial sample list. A total of 500 non-respondents will be selected in order to complete 100 telephone interviews. We will field this telephone survey after completion of the online survey and expect the interviewing to take approximately two weeks (10 business/school days). Key questions (drawn from the originally fielded survey) are provided in Attachment A. Answers to these survey questions will be compared to the answers gathered through the original data collection. Differences between the two samples will be reviewed and tested for statistical significance at a 95 percent confidence level. Where differences exist on usage- or needs-oriented questions, the demographic data will be reviewed to determine whether the demographics of the non-respondents may have contributed to the difference. These differences will be accounted for in the overall weighting scheme applied to the final data.
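As one hedged illustration of the significance testing described above, the share answering a key question one way among online respondents could be compared to the same share among the telephoned non-respondents with a two-proportion z-test at the 95 percent confidence level. The counts below are illustrative, not survey results:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts of "yes" answers to one key question in each group:
# online respondents versus telephoned non-respondents.
yes_counts = [2_100, 55]
sample_sizes = [3_500, 100]

stat, p_value = proportions_ztest(yes_counts, sample_sizes)
if p_value < 0.05:
    print(f"Significant difference (p = {p_value:.3f}); review demographics")
else:
    print(f"No significant difference (p = {p_value:.3f})")
```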
2. Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The survey will be conducted via the Internet. The average expected response time is approximately 20 minutes. To arrive at this figure, Harris applied its “Calculating Total Survey Duration on the Internet” guidelines and timed the longest possible path through the survey, in which a respondent answers every question; the total response time was 20 minutes. Moreover, to reduce the burden of this information collection, we plan to issue this survey only once.
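The 20-minute figure can be reproduced with a simple longest-path timing exercise: sum an assumed per-question time over every question a respondent could possibly be asked. The per-question times and question counts below are assumptions for illustration, not Harris’s actual guidelines:

```python
# Assumed average answering times (seconds); illustrative only.
SECONDS_PER_CLOSED_QUESTION = 15  # multiple choice / yes-no
SECONDS_PER_OPEN_QUESTION = 60    # free-text comment

# Illustrative question counts for the longest possible path.
n_closed, n_open = 60, 5

total_seconds = (n_closed * SECONDS_PER_CLOSED_QUESTION
                 + n_open * SECONDS_PER_OPEN_QUESTION)
print(f"Estimated duration: {total_seconds / 60:.0f} minutes")  # 20 minutes
```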
3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
The survey will be conducted in accordance with best practices of the survey industry. Harris has been contracted to draft and conduct the survey and to follow up with applicants as needed to maximize the response rate. Harris’s follow-up efforts will include sending a total of three email reminders and placing two follow-up phone calls to non-respondents during the fielding period. Harris will also conduct a telephone survey of applicants who did not respond to the online survey, as detailed above.
4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The survey was internally tested by Harris, by approximately ten FCC staff, and by three USAC staff to ensure that the program operates properly. Specifically, this was done to check for any deficiencies within the survey and to review the questions and answers as they will appear to the respondents prior to final distribution to the full target population. Harris, FCC staff, and USAC staff pre-tested the survey in a variety of combinations by taking on the role of a school, library, or consortium and answering the questions in different ways, to ensure that the survey will minimize the burden on respondents and improve utility. In addition, the survey was pre-tested with approximately nine respondents. Based upon the reviews by Harris, FCC staff, and USAC staff, several program changes and minor changes to the questions and answers were made to prompt more specific, accurate, and complete responses.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Regina Brown
Attorney-Advisor
Telecommunications Access Policy Division
Wireline Competition Bureau
202-418-0792

Gina Spade
Assistant Division Chief
Telecommunications Access Policy Division
Wireline Competition Bureau
202-207-5025