OMB Submission Part B


Northwest Regional Educational Needs Assessment

OMB: 1850-0827










Supporting Statement for Paperwork Reduction Act Submissions

REL-NW Regional Needs Assessment Survey of Superintendents, Principals, and Teachers in a Five-State Area

OMB Control No: 1850-New, EDICS# 3237




Request for OMB Review – Part B


OMB Form 83-I and Supporting Statement for Data Collection














Submitted by:


Regional Educational Laboratory-Northwest

Portland, Oregon


January 2007








B. Collections of Information Employing Statistical Methods


B1. Respondent Universe, Sampling and Response Rate


The survey respondents will consist of teachers of core subjects; elementary, middle, and high school principals; and district superintendents in the five states REL-Northwest serves.



              |           Teachers            |           Principals          |        Superintendents
State         | Total Pop.  Sample  Est.(80%) | Total Pop.  Sample  Est.(80%) | Total Pop.  Sample  Est.(80%)
--------------|-------------------------------|-------------------------------|------------------------------
Alaska        |     3,878    1,200       960  |       526      526       421  |        53       53        42
Idaho         |     7,135    1,200       960  |       736      700       560  |       114      114        91
Montana       |     5,112    1,200       960  |       868      700       560  |       196      196       157
Oregon        |    13,716    1,200       960  |     1,290      700       560  |       198      198       158
Washington    |    26,563    1,200       960  |     2,340      700       560  |       296      296       237
--------------|-------------------------------|-------------------------------|------------------------------
TOTAL         |   56,404¹    6,000     4,800  |    5,000²    3,326     2,661  |      825³     825³       660


¹ The number of core teachers was estimated from Oregon and Idaho data showing that core teachers make up between 50% and 53% of all teachers (N = 112,805). Source for teacher population data: NCES, Common Core of Data, preliminary release (SY 2004-05).


² The number of principals was estimated from the number of schools in each state. The estimated total, without double-counting shared principals and excluding schools without principals, is approximately 5,000. Source for school data: NCES, Common Core of Data, preliminary release.


³ Superintendent figures are based on the number of districts with administrative superintendents. The estimated total, without double-counting shared superintendents, is 825; 80% of 825 is 660. Source: SEA websites.



B2. Data Collection Procedures


We propose to use a stratified random sampling method for the teachers and principals selected for the study. For each state, a list of core teachers will be generated, randomized, and then a sample of 1,200 will be selected. Principals will be selected in a similar manner. For Montana, Oregon, and Washington, a list of principals in each state will be generated (after eliminating shared principals), randomized and a sample of 700 principals per state will be selected. Due to the small number of principals in Alaska and Idaho, all principals in those states will be included in the study. All superintendents in all five states will be included in the study.
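The within-stratum selection described above can be sketched as follows; the function name, toy roster, and seed are illustrative, not part of the actual plan:

```python
import random

# Sample sizes from the plan: 1,200 teachers per state; 700 principals per
# state in Montana, Oregon, and Washington; a census of principals in
# Alaska and Idaho and of superintendents everywhere.
def select_sample(roster, sample_size, seed=None):
    """Randomize a stratum's roster and draw a simple random sample.

    If the roster is smaller than the requested sample, the whole roster
    is taken (a census), as with Alaska and Idaho principals.
    """
    rng = random.Random(seed)
    shuffled = roster[:]  # copy so the original roster is untouched
    rng.shuffle(shuffled)
    return shuffled[:min(sample_size, len(shuffled))]

# Toy example: a 5-teacher roster and a sample of 3
roster = ["T-001", "T-002", "T-003", "T-004", "T-005"]
sample = select_sample(roster, 3, seed=42)
print(len(sample))  # 3
```

Because the draw is made independently within each state's roster, each state is a stratum and every member of a stratum has an equal chance of selection.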


The survey instrument is designed to yield a systematic, quantitative set of perceived needs for evidence that can be prioritized on the basis of needs derived from the point values teachers, principals and superintendents assign to survey items. Forced-choice items are used on an “extent of objective evidence needed” scale, coupled with open-ended items to provide opportunity for greater articulation of the decisions educators face. Results are analyzed to identify statistically significant differences among subgroups (school and respondent characteristics), as well as the relative magnitude of the overall item ratings themselves. Surveys are cross-referenced against Common Core of Data school characteristics, as well as state databases, as a rich source of contextual variables for further analyses.


Precision:


With the expected response rates shown in the table above, the level of precision (maximum margin of error at the 95% confidence level) for the three sample populations is as follows:


Teachers: +/- 1.41%

Principals: +/- 1.90%

Superintendents: +/- 3.81%
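These margins follow from the standard large-sample formula MOE = z·sqrt(p(1−p)/n), taking p = 0.5 (the most conservative assumption), z = 1.96 for 95% confidence, and no finite population correction:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Maximum margin of error for a proportion at the given sample size."""
    return z * math.sqrt(p * (1 - p) / n)

# n values are the expected responses from the table in B1
for group, n in [("Teachers", 4800), ("Principals", 2661), ("Superintendents", 660)]:
    print(f"{group}: +/- {margin_of_error(n):.2%}")
# Teachers: +/- 1.41%
# Principals: +/- 1.90%
# Superintendents: +/- 3.81%
```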


Thus, given our sampling design, we anticipate being able to generalize to teachers, principals, and superintendents within the region and within each state.



B3. Maximizing Response Rates


This data collection is designed to meet OMB Standard 1.3 which states:


“Agencies must design the survey to achieve the highest practical rates of response commensurate with the importance of survey uses, respondent burden, and data collection costs, to ensure that survey results are representative of the target population so that they can be used with confidence to inform decisions. Nonresponse bias analyses must be conducted when unit or item response rates or other factors suggest the potential for bias to occur.”1


Response Rates


As noted above, the expected response rate for all survey populations is 80%. Every effort will be made to reach the 80% unit response rate and the 70% item response rate set forth in Guidelines 1.3.4 and 1.3.5 of the September 2006 OMB “Standards and Guidelines for Statistical Surveys.”2 Response rates will be calculated for each unit using the calculations defined by the American Association for Public Opinion Research (AAPOR).3
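AAPOR publishes several outcome rates; assuming Response Rate 1 (RR1), the most conservative definition (complete interviews divided by all sampled units of known or unknown eligibility), the calculation might look like this sketch with hypothetical disposition counts:

```python
def aapor_rr1(complete, partial, refusal, noncontact, other, unknown):
    """AAPOR RR1: completes over all eligible and unknown-eligibility units."""
    denominator = complete + partial + refusal + noncontact + other + unknown
    return complete / denominator

# Hypothetical dispositions for a 1,200-unit stratum with 960 completes
rate = aapor_rr1(complete=960, partial=40, refusal=80,
                 noncontact=100, other=10, unknown=10)
print(f"{rate:.1%}")  # 80.0%
```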


The data collection was designed to maximize response rates using proven techniques recommended by Don A. Dillman, a nationally recognized expert in the field of social research using mail and internet survey methods. To maximize response rates, Dillman recommends that mail surveys include the following components:4


1) A respondent-friendly questionnaire—questions that are clear and easy to comprehend, a question order that suggests high salience to the respondent and a questionnaire layout that is in accordance with visual principles of design for comprehension and easy response.


The surveys attached to this submission have been cognitively tested with representatives of the respondent groups for comprehension and ease of response. Questions on the survey instrument are low-risk and will not discourage respondents from participating. Surveys include a limited number of questions using the same response format to reduce the amount of time respondents spend reading instructions and responding to individual survey items. A graphic designer was employed to lay out the questionnaire in accordance with visual design principles. The survey instruments were designed to appear friendly, short, and easy to complete.


2) Five contacts by first-class mail: a pre-notice letter, a questionnaire mailing, a thank-you postcard, a replacement questionnaire, and a final contact.


This data collection includes a pre-notification letter to be sent prior to the survey mailing. The letter informs respondents they were selected for the survey, tells them of the importance of the survey and asks for their cooperation. The letter will be signed by the Principal Investigator of REL-Northwest. Following the pre-notification letter, respondents will be sent a questionnaire packet that includes a cover letter, a paper survey, and a modest incentive to complete the survey.


The cover letter explains the purpose of the project, how the data will be used (i.e. an opportunity to provide input to education policy-makers as well as to having promising practices evaluated) and how important it is for each person in the sample pool to respond. It informs respondents that their answers will be used only for statistical purposes and that reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. The cover letter will be signed by the Principal Investigator of REL-Northwest, the entity collecting the data.


After respondents have an opportunity to complete and return the surveys, a reminder/thank-you postcard will be sent. The postcard serves as a thank you to those who have already completed the survey and a reminder to those who have not. A replacement questionnaire will be sent two weeks after the postcard to units with low response rates. A final thank-you postcard will be sent two weeks after the replacement questionnaire.


3) Return Envelopes with Real First-Class Stamps


Although Dillman recommends using first-class stamps to create a stronger social exchange, the time involved in affixing stamps to more than 20,000 envelopes was deemed too costly for the differential in response rates between using real stamps and business reply. All respondents will receive pre-paid business reply envelopes to encourage prompt survey returns.


4) Rewarding respondents in several ways: by showing positive regard for the individual (e.g., “we value your experience as an educator”), by saying thank you, by asking for advice, by giving tangible rewards, through social validation (e.g., “others like yourself are participating in this important study”), and by informing respondents that opportunities to provide input are scarce.


All communications with potential respondents include the intangible elements listed above. In addition, each of the initial questionnaire packets will include a $5 bill to provide a tangible reward for participation and to establish a social contract, which is likely to result in a higher overall response rate.


In addition to Dillman’s recommendations, we have built flexibility into the data collection to accommodate respondents’ busy schedules and response preferences. First, because we expect most respondents to complete the survey on paper, they will have up to eight weeks to complete and return the questionnaire, allowing them to do it on their own schedule. Second, respondents will have the option of completing and submitting survey responses electronically via secure website.


Non-response issues:


In the unlikely event that survey response rates fall below 80%, REL-Northwest will conduct a non-response bias analysis comparing respondents and non-respondents on key characteristics available in the sample frame. For this data collection, key known characteristics for each group include state, district size (large/small), minority enrollment and poverty (high/low).


Non-response analysis will follow the formula set forth in OMB Guideline 3.2.9:5


“Given a survey with an overall unit response rate of less than 80 percent, conduct an analysis of non-response bias using unit response rates as defined above, with an assessment of whether the data are missing completely at random. As noted above, the degree of non-response bias is a function of not only the response rate, but also how much the respondents and non-respondents differ on the survey variables of interest. For a sample mean, an estimate of the bias of the sample respondent mean is given by:


B(ȳr) = ȳr − ȳt = (nnr / n)(ȳr − ȳnr)


Where:

ȳt = the mean based on all sample cases;

ȳr = the mean based only on respondent cases;

ȳnr = the mean based only on non-respondent cases;

n = the number of cases in the sample; and

nnr = the number of non-respondent cases.”
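The bias estimate can be checked numerically; a minimal sketch with hypothetical counts and means (all values illustrative):

```python
def nonresponse_bias(n, n_nr, mean_r, mean_nr):
    """B(ybar_r) = (n_nr / n) * (ybar_r - ybar_nr)."""
    return (n_nr / n) * (mean_r - mean_nr)

# Hypothetical: sample of 1,000 with 200 non-respondents; respondents
# average 3.5 on some item, non-respondents 3.0
n, n_nr = 1000, 200
mean_r, mean_nr = 3.5, 3.0
bias = nonresponse_bias(n, n_nr, mean_r, mean_nr)
print(bias)  # 0.1

# Consistency check against the equivalent form ybar_r - ybar_t,
# where ybar_t is the mean over all sample cases
mean_t = ((n - n_nr) * mean_r + n_nr * mean_nr) / n  # 3.4
assert abs((mean_r - mean_t) - bias) < 1e-12
```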



Accuracy and reliability:


Stratified random sampling of teachers and principals and universal sampling of superintendents at the state level will allow data to be projected to each state. For regional analysis, responses will be weighted to proportionately reflect the distribution of teachers, principals and superintendents in the five state area.
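The proportional weighting for regional analysis can be sketched as follows, using the teacher counts from the table in B1; the function and the equal per-state response counts are illustrative assumptions:

```python
def stratum_weight(pop_stratum, pop_total, resp_stratum, resp_total):
    """Weight = (stratum's population share) / (stratum's respondent share)."""
    return (pop_stratum / pop_total) * (resp_total / resp_stratum)

# With equal expected responses (960 per state out of 4,800), Washington
# teachers are weighted up and Alaska teachers down, so each state
# contributes to regional estimates in proportion to its population.
w_wa = stratum_weight(26563, 56404, 960, 4800)
w_ak = stratum_weight(3878, 56404, 960, 4800)
print(round(w_wa, 2), round(w_ak, 2))  # 2.35 0.34
```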


Paper surveys will be edited for completeness and accuracy in following instructions. Data will be entered manually into VOXCO, a computer-assisted interviewing software program. Use of this program increases accuracy by rejecting out-of-range responses. Data entry will be 100% verified. Data from online surveys are entered directly into a VOXCO database, and out-of-range checks are programmed into the online survey, thus reducing respondent error. When all data are entered, the online and paper survey databases are combined and “cleaned,” meaning inconsistent or incomplete responses are excluded from the final data set used for analysis.
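The out-of-range rejection and cleaning steps can be illustrated with a small sketch (generic Python, not the VOXCO software itself; the item names and 1-5 scale are hypothetical):

```python
# Valid ranges per survey item (hypothetical items on a 1-5 scale)
VALID_RANGE = {"q1": (1, 5), "q2": (1, 5)}

def in_range(item, value):
    lo, hi = VALID_RANGE[item]
    return lo <= value <= hi

def clean(records):
    """Keep only records where every item is answered and in range."""
    return [r for r in records
            if all(q in r and in_range(q, r[q]) for q in VALID_RANGE)]

paper = [{"q1": 3, "q2": 5}, {"q1": 9, "q2": 2}]  # second record out of range
online = [{"q1": 1, "q2": 1}, {"q1": 4}]          # second record incomplete
combined = clean(paper + online)                  # databases combined, then cleaned
print(len(combined))  # 2
```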


As noted in B2 above, the level of precision (maximum margin of error) for the three sample populations is as follows:


Teachers: +/- 1.41%

Principals: +/- 1.90%

Superintendents: +/- 3.81%


Thus, given our sampling design, we anticipate being able to generalize to teachers, principals, and superintendents within the region and within each state.



B4. Test of Survey Procedures


Serial cognitive testing with 3 principals and 4 teachers was conducted to evaluate the format of the questions, determine the readability and ease of understanding of the instructions, and obtain estimates of the time needed to complete the entire process and, ultimately, the respondent burden for completing the survey. In serial testing, the survey is administered to one individual and, based on that person's feedback, changes are made before the survey is administered to the next person. Testing continues until no further changes are indicated. The information gathered from this test was used to refine the survey and the process before sending it to a professional graphic designer to lay out the survey in an attractive and easy-to-use format.



B5. Statistical Consultant


Vendor name: Gilmore Research Group

Vendor contact: Dr. Jeanne Wintz

Vendor title: Executive Vice President, Custom Research

Role: Statistical consultant for analysis including statistical testing for significant differences between subgroups, testing for non-response bias, other tests as required.


Vendor name: Gilmore Research Group

Vendor contact: Carol Ambruso

Vendor title: Vice President—Design and Analysis

Role: Primary analyst, responsible for overseeing data collection, processing, cleaning, analysis, and reporting.



List of Attachments

Attachment A: NWREL Institutional Review Board – Letter of Exemption

Attachment B: Survey Pre-Notification Letter

Attachment C: Cover Letter to Accompany Survey Instruments

Attachment D: Superintendent Survey Instrument

Attachment E: Teacher Survey Instrument

Attachment F: Principal Survey Instrument

Attachment G: Reminder Postcard



1 Office of Management and Budget. “Standards and Guidelines for Statistical Surveys.” September 2006. Section 1.3. p. 8.

2 Ibid.

3 The American Association for Public Opinion Research. “Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys”. 2006. pp. 32-33.

4 Dillman, Don A. Mail and Internet Surveys: The Tailored Design Method, 2007 Update with New Internet, Visual, and Mixed-Mode Guide. 2nd Edition. John Wiley & Sons, Inc. Hoboken, NJ, 2007. pp.15-17, 150-153.

5 Office of Management and Budget. “Standards and Guidelines for Statistical Surveys.” September 2006. Section 1.3. p. 16.



