Supporting Statement B_BIE Distance Learning HH Survey_CLEAN

BIE Distance Learning Household Survey

OMB: 1076-0198


Supporting Statement B

for Paperwork Reduction Act Submission


BIE Distance Learning Household Survey

OMB Control Number 1076-NEW



Collections of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," the following documentation should be included in Supporting Statement B to the extent that it applies to the methods proposed:


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


An address-based frame will be developed of families with children enrolled in Bureau of Indian Education (BIE)-funded schools. Household directory information will be collected with the cooperation of the schools. Because the universe of schools is small (N = 183 BIE-funded schools, both BIE-operated and tribally operated), a census of schools will be undertaken. We anticipate gaining cooperation from approximately 100 schools. A stratified sample of families across the nine BIE regions will be selected from the cooperating schools, and the families will be contacted using multiple survey modes for fielding the Distance Learning Household Survey. We estimate that 9,426 families will be mailed the survey, the total number of families we expect to be reachable in the approximately 100 schools that we anticipate will agree to participate. If more than 9,426 families have children in those schools, we will subsample families using probability-based sampling methods to hold the mailing at 9,426 families.


A mailout to 9,426 households will be conducted to obtain an estimated 943 completed surveys. We will request information from all potential respondents in the frame and expect an approximately 10 percent response rate. This is a new data collection, so there is no previous response rate for comparison. Aggregate estimates will be based on the actual response rate achieved.


Conducting a mail survey offers substantial coverage benefits over telephone and web-based approaches. A 2019 study found that considerable gaps remain in internet access in Native American communities: 18% of reservation residents have no internet access at home, and 33% rely on cell phone service for at-home internet [Ref 1]. Following national trends, nearly all U.S. households now rely on wireless cell phones for telephone access, but respondents reached by cell phone are less likely to respond to telephone invitations to complete surveys [Refs 2, 3]. Participants will be offered the choice of completing a paper survey with a prepaid return envelope or completing it on the internet, as evidence suggests younger and working adults may be more likely to select web surveys when offered a choice of survey mode [Ref 4]. Participants will also receive a reminder postcard with instructions on how to complete the survey online.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.



Respondent universe and expected sample size

As mentioned, we will reach out to all BIE-funded schools in the sample universe, attempting to gain the cooperation of every school but assuming that approximately 100 of the 183 schools will participate in the study. For the participating schools, we will select all families with children attending those schools, up to 9,426 families. If more than 9,426 families are associated with the participating schools, we will select a probability sample by stratifying the address-based sample by the nine BIE regions, by school operation status (BIE Operated, Tribally Operated, Other), and by school type (Primary, Secondary, Other), seeking a proportionate sample of families across up to 81 sampling strata. Table 1 presents summary information about the sample universe of schools and families within cooperating schools, as well as the assumed response rates and estimated number of completed surveys.
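Should subsampling be required, the proportionate allocation across strata described above can be sketched as follows. This is an illustrative sketch only: the stratum labels and family counts are hypothetical, and largest-remainder rounding is just one reasonable way to resolve fractional allocations.

```python
import math

def proportionate_allocation(frame_counts, total_sample):
    """Allocate a fixed total sample across strata in proportion to each
    stratum's share of the family frame, using largest-remainder rounding."""
    total_frame = sum(frame_counts.values())
    raw = {s: total_sample * n / total_frame for s, n in frame_counts.items()}
    alloc = {s: math.floor(r) for s, r in raw.items()}
    # Hand out the units lost to flooring, largest fractional remainder first.
    shortfall = total_sample - sum(alloc.values())
    for s in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:shortfall]:
        alloc[s] += 1
    return alloc

# Hypothetical strata (region / operation status / school type) whose family
# counts exceed the 9,426-household mailing cap.
frame = {
    "Western/BIE Operated/Primary": 6000,
    "Great Plains/Tribally Operated/Primary": 4000,
    "Southwest/Tribally Operated/Secondary": 2000,
}
alloc = proportionate_allocation(frame, 9426)
print(alloc)  # stratum allocations summing to exactly 9,426
```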


Sample weights will be constructed to account for the sample design, should subsampling be required, as well as for potential bias due to noncoverage and/or nonresponse and other nonsampling error, using post-stratification adjustments. Weighted estimates produced from the study are expected to be representative of all BIE-funded schools nationally, by operation status, school type, and region (collapsing smaller regions with larger regions). Assuming a design effect of 2.0, the margin of error at the 95% confidence level for a national estimated percentage of 50% from the survey is estimated to be 4.4 percentage points.
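The margin-of-error figure follows the standard formula for a proportion inflated by a design effect. The minimal sketch below yields roughly 4.5 percentage points without any finite-population correction; the slightly smaller 4.4-point figure quoted above presumably reflects an additional finite-population or effective-sample-size adjustment not shown here.

```python
import math

def margin_of_error(p, n_completes, deff, z=1.96):
    """Half-width of a 95% confidence interval for an estimated proportion,
    inflated by the design effect to reflect weighting and clustering."""
    return z * math.sqrt(deff * p * (1 - p) / n_completes)

# 943 expected completes, design effect of 2.0, estimated percentage of 50%.
moe = margin_of_error(p=0.50, n_completes=943, deff=2.0)
print(f"{100 * moe:.1f} percentage points")
```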



Table 1. Sample Universe, Estimated Sampling and Response Rates

  Universe of Schools:                                                    183
  Responding Schools:                                                     100
  Estimated Sample Universe of Families in Responding Schools:          9,426
  Number of Households Sampled:                                         9,426
  Response Rate of Parent for Randomly Selected Child in the Household:   10%
  Estimated Number of Completed Surveys by Parents:                       943



Survey overview


The 15-minute survey contains questions about families’ experience with distance learning in the last year. BIE will collect information to understand the remote learning experience, satisfaction with technical aspects of online schooling, and perceived strengths and challenges of online learning. We will also measure other potentially important characteristics such as demographics, internet accessibility, and whether the child received special services from the school.


Procedure


The survey will use a mixed-mode approach consisting of paper and web administration, chosen for several reasons, including its coverage and cost-efficiency. Respondents will receive a packet containing the paper survey, a prepaid return envelope, and a cover letter introducing the study. The cover letter will also provide the information needed to complete the survey on the web, namely the URL and a unique access code. A set of frequently asked questions will be included with the paper survey. A tracking number containing no identifiable data will be included on each mailed questionnaire for processing purposes.


To encourage more respondents to complete the survey, a reminder/thank-you postcard will be sent to households five days later. The postcard will contain information similar to the initial letter, including the URL and the respondent's unique access code. Schools will also be provided with sample text message and email content so they can promote the survey to families while it is in the field.


Though telephone administration is not part of the core data collection strategy, an 800-number will be provided in all mailed materials for respondents who are unable to complete the survey on the website or on paper, or need assistance to do so. Contact materials will also include an email address where respondents can ask questions or obtain technical support.


Analysis Plan


We will test for differences between modes (online versus mail) and will account for any potential mode effects in our analyses. We will use weighted data to account for differential probabilities of selection, noncoverage, nonresponse, and other nonsampling error. We will examine frequencies for survey items and the relationships between survey items and demographic characteristics. Conventional techniques for survey data, such as descriptive statistics, t-tests for comparing means, and chi-square tests for comparing distributions of survey outcomes by key demographic characteristics and/or subgroups, will be used to analyze the data.
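A weighted point estimate with a design-based standard error, as described above, can be sketched with a simple Taylor-linearized variance for a weighted proportion. The responses and weights below are hypothetical, and this is one common linearization, not necessarily the estimator the study will use.

```python
import math

def weighted_proportion(values, weights):
    """Weighted estimate of a proportion with a Taylor-linearized standard
    error, treating the estimate as a ratio of weighted totals."""
    wsum = sum(weights)
    p = sum(w * y for y, w in zip(values, weights)) / wsum
    # Linearized residuals of the ratio estimator; these sum to zero.
    z = [w * (y - p) for y, w in zip(values, weights)]
    n = len(values)
    var = n / (n - 1) * sum(zi * zi for zi in z) / wsum ** 2
    return p, math.sqrt(var)

# Hypothetical responses (1 = household reports reliable internet access)
# and post-stratified weights.
y = [1, 0, 1, 1, 0, 1, 0, 1]
w = [1.2, 0.8, 1.0, 1.5, 0.9, 1.1, 1.0, 0.5]
p, se = weighted_proportion(y, w)
```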


Weighting


Weighting helps to compensate for differential probabilities of selection, reduce biases due to differential nonresponse, and make the estimates consistent with external population totals. We will use a classical design-based approach to weighting, with base weights constructed as the inverse of the probabilities of selection. Under perfect data collection, this scheme produces unbiased estimates and requires no model assumptions. In practice, however, the weights must be adjusted for imperfections such as undercoverage and nonresponse; if these are not addressed, the analysis may be biased.


The starting point in the development of the weights is the school base weight, which is 1.0 since a census of schools is being fielded. The school base weight is then adjusted for nonresponse using variables in the school sampling frame such as BIE region, school type, and operation status; these variables will be included in the file when the sample is selected. The nonresponse-adjusted school base weight is assigned to all sampled addresses, creating the address-level base weight. We then adjust the address-level base weight for household nonresponse using data available from the sample frame (e.g., the nine BIE regions, school type, and operation status). The final step is raking the weights to demographic control totals for the U.S. Native American population with children in the household, using current totals reported in the American Community Survey (ACS). The variables planned for the raking are:


  • Number of school-age children in the household (1, 2, 3, 4+)

  • Gender of school-age children in the household (Male, Female, Other)

  • Household Internet Status (Yes, No)

  • Primary language spoken at home (American Indian, English, Both American Indian and English, Other)

  • Nine BIE Regions (Northwest, Rocky Mountain, Great Plains, Pacific, Western, Southwest, Midwest, Southern Plains, and Eastern)
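The raking step described above can be sketched as iterative proportional fitting: cycle through the raking dimensions, scaling each respondent's weight so that the weighted category totals match the external controls. The dimensions, categories, and control totals below are hypothetical stand-ins for the ACS-based totals, using two of the planned variables for brevity.

```python
def rake(weights, categories, targets, iterations=50):
    """Iterative proportional fitting: repeatedly scale weights so that,
    for each raking dimension in turn, weighted category totals match
    the control totals; cycling drives all margins toward agreement."""
    w = list(weights)
    for _ in range(iterations):
        for dim, control in targets.items():
            # Current weighted total in each category of this dimension.
            totals = {}
            for wi, cats in zip(w, categories):
                totals[cats[dim]] = totals.get(cats[dim], 0.0) + wi
            # Multiply each weight by target / current for its category.
            w = [wi * control[cats[dim]] / totals[cats[dim]]
                 for wi, cats in zip(w, categories)]
    return w

# Hypothetical respondents and control totals for two raking dimensions.
cats = [
    {"internet": "Yes", "children": "1"},
    {"internet": "No",  "children": "2"},
    {"internet": "Yes", "children": "2"},
    {"internet": "No",  "children": "1"},
]
controls = {"internet": {"Yes": 60.0, "No": 40.0},
            "children": {"1": 55.0, "2": 45.0}}
raked = rake([25.0] * 4, cats, controls)
```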


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


To help ensure that the participation rate is as high as possible, BIE will:

  • Send multiple contacts to participants in different formats (e.g., mailout, postcard);

  • Administer the survey in two modes (web and mail), allowing respondents to answer questions in the mode, time, and location of their choosing;

  • Encourage schools to promote the survey while it is in the field;

  • Keep the length of the survey to an average of 15 minutes; and

  • Design a survey that minimizes burden (short in length, clearly written, and with appealing graphics).


To investigate potential bias due to nonresponse at the different stages of sampling/data collection, we will compare the socio-demographics (i.e., student gender, primary language spoken at home, household internet status, number of school-age children in the household, and attendance by region) of the completed survey data to socio-demographics available in the sample frame of BIE-Funded schools as well as to those from the ACS as appropriate. These comparisons will be used to identify additional variables that may help to further reduce any potential bias due to nonresponse that can be included in the development of final sample weights.
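One simple form of the respondent-versus-frame comparison described above is a chi-square goodness-of-fit test of the respondents' category distribution against the distribution in the sampling frame. The counts and frame proportions below are hypothetical illustrations, not study data.

```python
def chi_square_stat(observed, frame_props):
    """Pearson goodness-of-fit statistic comparing respondent counts with
    the category distribution observed in the sampling frame."""
    n = sum(observed.values())
    return sum((observed[c] - n * p) ** 2 / (n * p)
               for c, p in frame_props.items())

# Hypothetical respondent counts by region versus frame proportions.
respondents = {"Western": 300, "Great Plains": 350, "Southwest": 293}
frame_props = {"Western": 0.35, "Great Plains": 0.35, "Southwest": 0.30}
stat = chi_square_stat(respondents, frame_props)
df = len(frame_props) - 1
# Compare stat against the chi-square critical value for df degrees of
# freedom at the chosen alpha (for df = 2 and alpha = 0.05, about 5.99).
```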


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


No prior tests will be conducted for the survey. The survey was developed in consultation with a Technical Working Group of experts in Indian Education assembled specifically for this project. The online instrument was tested for mobile and desktop administration.


5. Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The contractor NORC will collect the data on behalf of the Bureau of Indian Education under Contract 140A1620P0112. Jennifer Hamilton, PhD, 312-201-6836, is NORC’s Project Director for this project, and Jennifer Berktold, Ph.D.,301-634-5495, is leading the survey task. The study will be overseen by Anthony Scheler, 505-967-7859, at the Bureau of Indian Education and Gregory Mehojah, 505-563-3101, at the Bureau of Indian Education.


References


  1. Howard, B., & Morris, T. (2019, July 27). Tribal Technology Assessment: The State of Internet Service on Tribal Lands. https://aipi.asu.edu/sites/default/files/tribal_tech_assessment_compressed.pdf [accessed 5/14/21]

  2. Blumberg, S. J., & Luke, J. V. (2021, February). Wireless substitution: Early release of estimates from the National Health Interview Survey, January-June 2020. National Center for Health Statistics. https://doi.org/10.15620/cdc:100855

  3. Qayad, M. G., Pierannunzi, C., Chowdhury, P. P., Hu, S., Town, D. M., & Balluz, L. S. (2013). Landline and cell phone response measures in the Behavioral Risk Factor Surveillance System. Survey Practice, 6(3).

  4. Haan, M., Ongena, Y. P., & Aarts, K. (2014). Reaching hard-to-survey populations: Mode choice and mode preference. Journal of Official Statistics, 30(2), 355-379. https://doi.org/10.2478/jos-2014-0021


