Att_LRC Supporting Statement Part B11.18.09


IEPS LRC Customer Surveys

OMB: 1840-0809



B. Description of Statistical Methods

1. Universe and Respondent Selection

All three surveys proposed in this study target the entire universe of respondents (i.e., a census). Because the number of cases in each population is small, sampling would not be appropriate.

The frame for LRC Project Directors is derived from the U.S. Department of Education’s International Resource Information System (IRIS) and includes all 16 grantees who have received LRC funding since program inception. The frame for the NADSFL survey will be obtained from the Executive Director of that organization and will contain the full list of 2009 members (approximately 300 members). The frame for Summer Workshop participants will be obtained from the LRC Project Directors, based on their records of attendees at the 2009 Summer Workshops who were employed by or enrolled in an institution of higher education (approximately 200 participants across all 16 LRCs).

These frames are as exhaustive as possible in that they represent the entire populations of interest. Because membership in NADSFL and enrollment in the Summer Workshops are voluntary, the universe of participants may not resemble the larger population of LRC product users across the United States. However, because no frame of LRC product users exists and that population is undefinable, this limitation will be noted in the final report.



2. Procedures for Collecting Information

The information for this study will be gathered through three surveys (see Appendix A). Two of these surveys (NADSFL and Summer Workshop participants) will be administered via the World Wide Web using Survey Gizmo, a commercial survey dissemination and management website. Because of the small number of respondents (N=16), the LRC Project Director survey will be administered as an electronic Word document disseminated via email. All respondents will be able to request a pencil-and-paper version of the survey.

Respondents to all three surveys will be contacted about the study via email. AIR has already worked with the LRC Project Directors and the NADSFL Executive Director to publicize the study and emphasize its importance to respondents. All respondents will receive a prenotification email about the study emphasizing its importance and requesting their participation. Three to five follow-up emails will then be sent over a period of two months (the number depending on response rates, which will be monitored weekly) to encourage participation (see Appendix B for contact and follow-up emails).



3. Methods to Maximize Response Rates

As discussed above, methods to maximize response rates have already been implemented by publicizing the study early, via email, to Project Directors and members of NADSFL. In addition, the follow-up procedures discussed above will be used to contact respondents who have not completed the surveys.

Because the study has been well publicized already and the respondents are professionals with a vested interest in the topic of the study and in the future of the LRC program as disseminators of foreign language resources, we expect a high response rate.

It is expected that the response rate for the LRC Project Directors will be 100 percent because they are all currently grantees of the U.S. Department of Education with a vested interest in the study.

The expected response rate for both the NADSFL members’ survey and the Summer Workshop attendees’ survey is 85 percent. This rate will be achieved through a combination of contact letters, endorsed by influential scholars in the field of foreign language teaching and emphasizing the importance of the project, and the multiple follow-ups from the contractor mentioned above.

Non-response bias due to a less-than-100-percent response rate will be examined by comparing the background demographic information available on the frame (e.g., gender, institutional affiliation, region of the country) to that of the respondents. While no statistical adjustments are planned, given that the surveys are censuses, the final report will present an analysis comparing frame totals to respondent totals and will discuss appropriate caution in the interpretation of findings.
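The planned comparison of frame totals to respondent totals can be sketched as follows. This is a hypothetical illustration only: the records, categories, and the `distribution` helper are invented for the example and are not actual survey data or the contractor's analysis code.

```python
# Hypothetical sketch of the non-response bias check: compare the
# distribution of a demographic variable on the full frame to its
# distribution among respondents. All records below are illustrative.
from collections import Counter

# Each record: (gender, region) taken from the frame file.
frame = [("F", "Northeast"), ("M", "South"), ("F", "Midwest"),
         ("F", "South"), ("M", "West"), ("F", "Northeast")]
respondents = frame[:5]  # pretend the last person on the frame did not respond

def distribution(records, index):
    """Proportion of each category at position `index` among the records."""
    counts = Counter(rec[index] for rec in records)
    total = len(records)
    return {cat: n / total for cat, n in counts.items()}

for label, idx in [("gender", 0), ("region", 1)]:
    frame_dist = distribution(frame, idx)
    resp_dist = distribution(respondents, idx)
    print(label)
    for cat, frame_prop in sorted(frame_dist.items()):
        resp_prop = resp_dist.get(cat, 0.0)
        print(f"  {cat}: frame={frame_prop:.2f} "
              f"respondents={resp_prop:.2f} diff={resp_prop - frame_prop:+.2f}")
```

Large gaps between the frame and respondent proportions for any category would flag a subgroup that is under-represented among respondents, which is the caution the final report would note.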



4. Tests of Procedures

Because the questions in these surveys capture implementation and are process-oriented, they have not been cognitively tested. They have, however, been vetted by a committee of experts, and were developed, in part, based on a review of similar questionnaires used previously by the U.S. Department of Education.

The usability of the survey website will be examined by AIR’s subcontractor Firepig Partners, which specializes in online survey design and administration. Based on its recommendations, revisions to the web design will be implemented.



5. Contacts for Statistical Aspects and Data Collection

The following individuals were involved in the design and statistical aspects of the study and its data collection:

  • Stephane Baldi, Principal Research Scientist, American Institutes for Research

  • Tanya Taylor, Research Analyst, American Institutes for Research

  • Nina Garrett, Former Director of the Yale University Center for Language Study

  • Paul Sandrock, Assistant Director for the Content and Learning Team, Wisconsin Department of Public Instruction

  • Richard Tucker, Professor of Applied Linguistics, Department of Modern Languages, Carnegie Mellon University

  • Scott McGinnis, Academic Advisor and Professor, Defense Language Institute



