
Supporting Statement for Paperwork Reduction Act Submissions

REL-NW Regional Needs Assessment Survey of Superintendents, Principals, and Teachers in a Five-State Area

OMB Control No: 1850-New, EDICS# 3237




Request for OMB Review – Part B


OMB Form 83-I and Supporting Statement for Data Collection














Submitted by:


Regional Educational Laboratory-Northwest

Portland, Oregon


May 2007








B. Collections of Information Employing Statistical Methods


B1. Respondent Universe, Sampling and Response Rate


The respondents participating in the surveys will consist of teachers of core subjects; elementary, middle, and high school principals; and district superintendents in the five states REL-Northwest serves.



| State | Teachers: Total Population | Teachers: Sample | Teachers: Est. Response (80%) | Principals: Total Population | Principals: Sample | Principals: Est. Response (80%) | Superintendents: Total Population | Superintendents: Sample | Superintendents: Est. Response (80%) |
|---|---|---|---|---|---|---|---|---|---|
| Alaska | 4,424 | 500 | 400 | 426 | 300 | 240 | 53 | 53 | 42 |
| Idaho | 8,477 | 500 | 400 | 601 | 300 | 240 | 114 | 114 | 91 |
| Montana | 6,124 | 500 | 400 | 487 | 300 | 240 | 202 | 202 | 162 |
| Oregon | 15,800 | 500 | 400 | 1,150 | 300 | 240 | 197 | 197 | 158 |
| Washington | 30,363 | 500 | 400 | 2,024 | 300 | 240 | 296 | 296 | 237 |
| TOTAL | 65,189¹ | 2,500 | 2,000 | 4,688² | 1,500 | 1,200 | 862³ | 825⁴ | 660 |


1 The number of core teachers was estimated from Oregon and Idaho data showing that core teachers make up approximately 60% of all teachers (N = 108,648; 0.60 × 108,648 ≈ 65,189). Source for teacher population data: NCES, Common Core of Data, SY 2004-05.


2 The number of principals was estimated based on number of schools in the population by state. The estimated total without double-counting shared principals and excluding schools without principals is 4,688. Source for school data was NCES, Common Core of Data SY 2004-05.


3 Superintendent counts are based on the number of districts with administrative superintendents. Sources for superintendent data: NCES Common Core of Data and SEA websites.


4 Some districts share the same superintendent which reduces the size of the total pool from 862 to 825.



B2. Data Collection Procedures


We propose to use a stratified random sampling method for the teachers and principals selected for the study. For core teachers, REL-Northwest proposes using the staff lists that are available in Oregon, Washington, and Idaho; prior experience has shown these lists to be reliable sample sources. In Montana and Alaska, where reliable contact information for teachers is more difficult to obtain, schools selected for participation in the study will be contacted individually to obtain lists of core teachers from which to choose study participants. We will contact 325 schools in Montana and Alaska to hedge against the possibility that some schools may be unwilling to provide staff lists.


Core teachers will be selected using a two-stage stratification method. First, approximately 300 schools will be selected at random for each state. Teachers of core subjects in each of these schools will then be pooled and 500 teachers will be randomly selected to participate in the survey. To select principals for the study, a second random selection of 300 schools from each state will be made. Schools without principals will be removed from the list before the random selection occurs. Thus, while some schools may be selected for both the teacher and principal surveys, this occurrence is a function of random selection rather than by deliberate design.
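For illustration, the Python sketch below shows one way the two-stage selection just described might be implemented. The frame structure, identifiers, and counts are hypothetical; the actual selection will be performed against the state staff lists and school-supplied rosters described above.

```python
import random

def select_teacher_sample(frames, n_schools=300, n_teachers=500, seed=1):
    """Two-stage selection: draw schools at random within each state,
    pool the core teachers in those schools, then draw teachers."""
    rng = random.Random(seed)
    sample = {}
    for state, schools in frames.items():  # schools: {school_id: [teacher_ids]}
        stage1 = rng.sample(sorted(schools), min(n_schools, len(schools)))
        pool = [t for s in stage1 for t in schools[s]]
        sample[state] = rng.sample(pool, min(n_teachers, len(pool)))
    return sample

# Hypothetical frame: 400 schools with five core teachers each.
frames = {"Alaska": {f"AK-{i}": [f"AK-{i}-t{j}" for j in range(5)]
                     for i in range(400)}}
print(len(select_teacher_sample(frames)["Alaska"]))  # 500
```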

The survey instrument is designed to yield a systematic, quantitative set of perceived needs for evidence, prioritized on the basis of the point values teachers, principals, and superintendents assign to survey items. Forced-choice items use an "extent of objective evidence needed" scale, coupled with open-ended items that give respondents the opportunity to articulate more fully the decisions educators face. Results are analyzed to identify statistically significant differences among subgroups (school and respondent characteristics), as well as the relative magnitude of the overall item ratings themselves. Surveys are cross-referenced against Common Core of Data school characteristics, as well as state databases, as a rich source of contextual variables for further analyses.


Precision


With the expected response rates shown in the table above, the level of precision (maximum margin of error at the 95% confidence level) for the three sample populations is as follows:



| | Region | State |
|---|---|---|
| Teachers | ± 1.92% | ± 4.38% |
| Principals | ± 2.09% | ± 4.18% |
| Superintendents | ± 1.71% | ± 2.85% to ± 6.95% |


Thus, given our sampling design, we anticipate being able to generalize to teachers, principals, and superintendents within the region and within each state. In addition to these regional and state-level generalizations, REL-Northwest plans to examine several subgroups at the regional level, including minority enrollment, free/reduced-price lunch eligibility, and district size. The table below shows the level of precision expected for the smallest cell in each of these subgroups at the regional level.



| Subgroup | Teachers | Principals | Superintendents |
|---|---|---|---|
| Minority Enrollment | ± 3.65% | ± 3.84% | ± 2.44% |
| Free/Reduced Lunch Eligibility | ± 3.23% | ± 3.84% | --- |
| School Level | ± 10.17% | ± 7.88% | --- |
| Urban/Rural | ± 3.66% | ± 3.57% | --- |
| District Poverty | --- | --- | ± 2.46% |
| District Size | --- | --- | ± 2.68% |
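The margins above are of the kind produced by the standard maximum margin-of-error computation with a finite population correction. The Python sketch below is a minimal illustration, assuming p = 0.5 (maximum variance) and simple random sampling within each pool; figures for the stratified teacher and principal samples may additionally reflect design effects.

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Maximum margin of error at the 95% confidence level for n
    responses drawn from a finite population of N, with the finite
    population correction (FPC) applied."""
    se = math.sqrt(p * (1 - p) / n)        # worst-case standard error
    fpc = math.sqrt((N - n) / (N - 1))     # finite population correction
    return z * se * fpc

# Superintendents at the regional level: 660 expected responses
# from a de-duplicated pool of 825 (see the table in B1).
print(round(margin_of_error(660, 825) * 100, 2))  # ~1.71
```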



B3. Maximizing Response Rates


This data collection is designed to meet OMB Standard 1.3, which states:


"Agencies must design the survey to achieve the highest practical rates of response commensurate with the importance of survey uses, respondent burden, and data collection costs, to ensure that survey results are representative of the target population so that they can be used with confidence to inform decisions. Non-response bias analyses must be conducted when unit or item response rates or other factors suggest the potential for bias to occur."1


As noted above, the expected response rate for all survey populations is 80%. This expectation takes several elements into consideration. First, the survey addresses a topic of keen interest to educators. It gives educators the opportunity to tell REL-Northwest directly what types of information will be most useful to them as they work to increase student achievement. It also gives them an opportunity to showcase promising practices in their own schools or districts that they believe should be evaluated for both effectiveness and transportability (i.e., the ability to be successfully replicated in other schools or districts).


A case study by Don A. Dillman found that a national survey of college graduates using four first-class mail contacts achieved a response rate of 69%.2 Our study has been designed carefully and is modeled as closely as possible on the optimal design Dillman recommends (excluding the recommended $5 incentive). We will use four first-class mail contacts and a final contact via Priority Mail, with immediate telephone follow-up to as many non-respondents as possible. Additionally, respondents will have the option of responding to the survey online rather than on paper. We believe this could increase response rates by an additional 1% to 5%.


Information from the Office of Management and Budget indicates that response rates to government surveys are typically in excess of 70%, with a mean response rate of 82.2%.3 We believe that by employing Dillman's Tailored Design Method, coupled with the fact that the survey is a government-sponsored research effort of interest to respondents, we will reach the 80% unit response rate and the 70% item response rate set forth in Guidelines 1.3.4 and 1.3.5 of the September 2006 OMB "Standards and Guidelines for Statistical Surveys."4


It should be noted that the open-ended items in Section 3 of the survey instruments solicit programs and practices to be evaluated. Our assumption is that respondents who do not complete these items do not have programs or practices to recommend. Thus, these questions would be exempt from the 70% item response requirement. Response rates will be calculated for each respondent group using the formulas defined by the American Association for Public Opinion Research (AAPOR).5
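As a minimal illustration of the AAPOR calculation, the Python sketch below computes Response Rate 1 (completed interviews over all eligible and potentially eligible cases); the disposition counts are hypothetical.

```python
def aapor_rr1(complete, partial, refusal, non_contact, other, unknown_eligibility):
    """AAPOR Response Rate 1: completes divided by all eligible and
    potentially eligible sampled cases (Standard Definitions)."""
    return complete / (complete + partial + refusal + non_contact
                       + other + unknown_eligibility)

# Hypothetical final dispositions for one respondent group of 500:
print(round(aapor_rr1(420, 10, 30, 25, 5, 10), 3))  # 0.84
```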


The Tailored Design Method


To maximize response rates, Dillman recommends that mail surveys include the following components:6


1) A respondent-friendly questionnaire—questions that are clear and easy to comprehend, a question order that suggests high salience to the respondent and a questionnaire layout that is in accordance with visual principles of design for comprehension and easy response.


The surveys attached to this submission have been cognitively tested with representatives of the respondent groups for comprehension and ease of response. Questions on the survey instrument are low-risk and will not discourage respondents from participating. Surveys include a limited number of questions using the same response format to reduce the amount of time respondents spend reading instructions and responding to individual survey items. A graphic designer was employed to lay out the questionnaire in accordance with visual design principles. The survey instruments were designed to appear friendly, short, and easy to complete.


2) Five contacts (four by first-class mail & one special contact): a pre-notice letter, a questionnaire mailing, a thank you postcard, a replacement questionnaire, and a final contact with replacement questionnaire.


This data collection includes a personalized pre-notification letter to be sent prior to the survey mailing. The letter informs respondents they were selected for the survey, tells them of the importance of the survey and asks for their cooperation. The letter will be signed by the Principal Investigator of REL-Northwest. Following the pre-notification letter, respondents will be sent a questionnaire packet that includes a personalized cover letter, a paper survey, and instructions for completing the survey online if desired.


The cover letter explains the purpose of the project, how the data will be used (i.e., an opportunity to provide input to education policy-makers as well as to having promising practices evaluated) and how important it is for each person in the sample pool to respond. It informs respondents that their answers will be used only for statistical purposes and will not be disclosed, or used, in identifiable form for any other purpose unless otherwise required by law. The cover letter will be signed by the Principal Investigator of REL-Northwest, the entity collecting the data.


After respondents have had an opportunity to complete and return the surveys, a reminder/thank-you postcard will be sent to everyone in the sample pool. The postcard serves as a thank you to those who have already completed the survey and a reminder to those who have not. A replacement questionnaire will be sent two weeks after the postcard to non-respondents in each sample group.


After respondents have had an opportunity to complete and return the replacement questionnaire, a final packet including a third survey will be sent to non-respondents via USPS Priority Mail. Non-respondents for whom we have telephone numbers (primarily principals and superintendents) will receive a call the day after the Priority Mail packet is delivered, with a personal request to complete and return the survey.


3) Return Envelopes with Real First-Class Stamps


Return envelopes will be pre-addressed and will have a first-class stamp affixed to them. Use of a first-class stamp creates a stronger social exchange because respondents receive something of tangible value (making it more difficult to simply recycle the survey and envelope). It also facilitates return of the completed survey.


4) Rewarding respondents in several ways: by showing positive regard for the individual (e.g., "we value your experience as an educator"), by saying thank you, by asking for advice, by giving tangible rewards, through social validation (e.g., "others like yourself are participating in this important study"), and by informing respondents that opportunities to provide input are scarce.


All communications with potential respondents include the intangible elements listed above.

In addition to Dillman's recommendations, we have built flexibility into the data collection to accommodate respondents' busy schedules and response preferences. First, because we expect most respondents to complete the survey on paper, they will have up to eight weeks to complete and return the questionnaire, allowing them to do so on their own schedule. Second, respondents will have the option of completing and submitting survey responses electronically via a secure website.


Non-response issues


In the unlikely event that survey response rates fall below 80%, REL-Northwest will conduct a non-response bias analysis comparing respondents and non-respondents on key characteristics available in the sample frame. For this data collection, key known characteristics for each group include state, district size (large/small), minority enrollment and poverty (high/low).


Non-response analysis will follow the formula set forth in OMB Guideline 3.2.9:7


"Given a survey with an overall unit response rate of less than 80 percent, conduct an analysis of non-response bias using unit response rates as defined above, with an assessment of whether the data are missing completely at random. As noted above, the degree of non-response bias is a function of not only the response rate, but also how much the respondents and non-respondents differ on the survey variables of interest. For a sample mean, an estimate of the bias of the sample respondent mean is given by:


$$B(\bar{y}_r) = \bar{y}_r - \bar{y}_t = \frac{n_{nr}}{n}\left(\bar{y}_r - \bar{y}_{nr}\right)$$


Where:

$\bar{y}_t$ = the mean based on all sample cases;

$\bar{y}_r$ = the mean based only on respondent cases;

$\bar{y}_{nr}$ = the mean based only on non-respondent cases;

$n$ = the number of cases in the sample; and

$n_{nr}$ = the number of non-respondent cases."
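As a worked illustration of the guideline's formula, the Python sketch below evaluates the bias estimate for hypothetical values:

```python
def nonresponse_bias(y_r, y_nr, n, n_nr):
    """Estimated bias of the respondent mean, per the formula above:
    B(y_r) = (n_nr / n) * (y_r - y_nr)."""
    return (n_nr / n) * (y_r - y_nr)

# Hypothetical follow-up study: respondents average 3.4 on an item,
# non-respondents 3.1, with 125 non-respondents among 625 sampled.
print(nonresponse_bias(y_r=3.4, y_nr=3.1, n=625, n_nr=125))  # ~0.06
```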


Accuracy and Reliability


Stratified random sampling of teachers and principals and universal sampling of superintendents at the state level will allow data to be projected to each state. For regional analysis, responses will be weighted to proportionately reflect the distribution of teachers, principals, and superintendents in the five-state area.
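A minimal Python sketch of such proportional weighting follows. It uses the teacher figures from the table in B1: because each state contributes an equal 400-response sample from very unequal populations, Washington's respondents receive weights above 1 and Alaska's below 1.

```python
def regional_weights(pop, resp):
    """Weight each state's respondents by (population share) /
    (respondent share) so regional estimates reflect the region."""
    tp, tr = sum(pop.values()), sum(resp.values())
    return {s: (pop[s] / tp) / (resp[s] / tr) for s in pop}

pop  = {"AK": 4424, "ID": 8477, "MT": 6124, "OR": 15800, "WA": 30363}
resp = {s: 400 for s in pop}  # expected teacher responses per state
print({s: round(w, 2) for s, w in regional_weights(pop, resp).items()})
# {'AK': 0.34, 'ID': 0.65, 'MT': 0.47, 'OR': 1.21, 'WA': 2.33}
```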


Paper surveys will be edited for completeness and accuracy in following instructions. Data will be entered manually into VOXCO, a computer-assisted interviewing software program. Use of this program increases accuracy by rejecting out-of-range responses. Data entry will be 100% verified. Data from online surveys are entered directly into a VOXCO database, and range checks programmed into the online survey reject out-of-range responses, reducing respondent error.


Our experience in conducting previous surveys is that very few are returned incomplete. In the event incomplete surveys are returned, cases will be excluded item-wise rather than list-wise. The open-ended questions in Section 3 of the survey instruments ask educators to suggest practices or programs for evaluation. The lack of a suggestion will not be considered missing data, but will be treated as a "don't know" response.
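The difference between the two exclusion rules can be seen in a small Python/pandas sketch with hypothetical item responses:

```python
import numpy as np
import pandas as pd

# Three hypothetical returns; respondent 102 skipped item q2.
df = pd.DataFrame({"q1": [4, 3, 5], "q2": [2, np.nan, 4], "q3": [5, 4, 3]},
                  index=[101, 102, 103])

print(len(df.dropna()))      # list-wise: only 2 cases would survive
print(df.mean(skipna=True))  # item-wise: q1, q3 use 3 cases; q2 uses 2
```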


When all data are entered, the online and paper survey databases are combined and "cleaned," meaning inconsistent or incomplete responses are excluded from the final data set used for analysis.


As noted in B2 above, the level of precision (maximum margin of error) for the three sample populations is as follows:



| | Region | State |
|---|---|---|
| Teachers | ± 1.92% | ± 4.38% |
| Principals | ± 2.09% | ± 4.18% |
| Superintendents | ± 1.71% | ± 2.85% to ± 6.95% |


Thus, given our sampling design, we anticipate being able to generalize to teachers, principals, and superintendents within the region and within each state.


B4. Test of Survey Procedures


Serial cognitive testing with three principals and four teachers was conducted to evaluate the format of the questions, determine the readability and ease of understanding of the instructions, and estimate the time needed to complete the entire process and, ultimately, the respondent burden for completing the survey. During the serial testing, the survey was administered to one individual, and changes based on that person's feedback were made before the survey was administered to the next person. Testing continued until no further changes were indicated. The information gathered from this test was used to refine the survey and the process before a professional graphic designer laid out the survey in an attractive, easy-to-use format.


B5. Statistical Consultant


Vendor name: Gilmore Research Group

Vendor contact: Dr. Jeanne Wintz

Vendor title: Executive Vice President, Custom Research

Role: Statistical consultant for analysis, including statistical testing for significant differences between subgroups, testing for non-response bias, and other tests as required.


Vendor name: Gilmore Research Group

Vendor contact: Carol Ambruso

Vendor title: Vice President—Design and Analysis

Role: Primary analyst, responsible for overseeing data collection, processing, cleaning, analysis, and reporting.

List of Attachments

Attachment A: NWREL Institutional Review Board – Letter of Exemption

Attachment B: Survey Pre-Notification Letter

Attachment C: Cover Letter to Accompany Survey Instruments

Attachment D: Superintendent Survey Instrument

Attachment E: Teacher Survey Instrument

Attachment F: Principal Survey Instrument

Attachment G: Reminder/Thank You Postcard



1 Office of Management and Budget. “Standards and Guidelines for Statistical Surveys.” September 2006. Section 1.3. p. 8.

2 Dillman, Don A. Mail and Internet Surveys: The Tailored Design Method, 2007 Update with New Internet, Visual, and Mixed-Mode Guide. 2nd Edition. John Wiley & Sons, Inc. Hoboken, NJ, 2007. pp. 314-322.

3 Questions and Answers when Designing Surveys for Information Collection. Office of Information and Regulatory Affairs. Office of Management and Budget. January 2006. p. 59.

4 Office of Management and Budget. “Standards and Guidelines for Statistical Surveys.” September 2006. Section 1.3. p. 8.

5 The American Association for Public Opinion Research. “Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys”. 2006. pp. 32-33.

6 Dillman, Don A. Mail and Internet Surveys: The Tailored Design Method, 2007 Update with New Internet, Visual, and Mixed-Mode Guide. 2nd Edition. John Wiley & Sons, Inc. Hoboken, NJ, 2007. pp. 15-17, 150-153.

7 Office of Management and Budget. "Standards and Guidelines for Statistical Surveys." September 2006. Section 3.2. p. 16.



