OMB: 1205-0479

Supporting Statement for
Paperwork Reduction Act of 1995 Submission:

Evaluation of Technology-Based Learning Grants

Part B. Collections of Information Employing Statistical Methods

    1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Universe: Exiting TBL grant participants from March 2011 through October 2011

Number: Approximately 1,500

Number Surveyed: Approximately 1,500 (all exiting TBL participants from March 2011 through October 2011 will be surveyed)

No statistical sampling methods will be used. All data collection will be based on a 100% sample of the inference population. In all reports and other publications and statements resulting from this work, no attempt will be made to draw inferences to any population other than the set of units that responded to the data collection effort.

Given the targeted nature of the project and the relatively small number of customers enrolled in TBL grant-funded training programs, it is feasible to administer the survey to the full population of customers who exit these programs from the time OMB clearance is obtained through most of the remaining evaluation period. Thus, sampling is unnecessary in this context, and statistical issues relating to sample methodology are not applicable.

Regarding the response rate for this data collection, a rate of 70% is expected. Such a response rate is suitable for this collection because the goal of the survey is to assess customer satisfaction and outcomes for the respondent universe, and the results will not be used to generalize to a larger population. While the highest possible response rate will be sought through a rigorous data collection plan (see section B3), recent literature on surveys of participants in similar technology-based education programs indicates that a 70% response rate is realistic and achievable. For example, Atreja et al. (2008) administered an online satisfaction survey to 17,891 healthcare professionals who had enrolled in a web-based training course and achieved a response rate of 76%. Though it used a much smaller sample, DeBourgh (2003) achieved a 100% response rate in a satisfaction survey of nurses in a distance-delivered course. This collection will similarly include a significant number of TBL participants in the healthcare field.

Other examples of similar studies that achieved response rates of 70% or higher include Drouin (2008) and Hermans et al. (2008), who achieved 92% and 93% response rates, respectively, for online satisfaction surveys administered to participants in online or web-enhanced post-secondary courses.

Although this collection's survey instrument also focuses on outcomes, it addresses measures of student experience and satisfaction with technology-based learning similar to those investigated in each of these previous studies. Additionally, like the previous research, this collection will employ an online survey and coordinate with education providers to help encourage student participation. This collection proposes to go further to increase the response rate by providing a formal pre-notification contact and using more rigorous follow-up methods with non-responders.

    2. Describe the procedures for the collection of information, including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

  1. Sample Selection. The customer survey will be administered to all participants who exit TBL grant-funded training programs from the time OMB clearance is obtained (tentatively early 2011) through the end of October 2011. Since the survey is being administered to all members of the universe, no statistical methodology will be employed for sample selection.

  2. Estimation Procedures. The analysis of the survey data will make use of frequency distributions, means, and cross-tabulations, which will provide basic information about responding customers’ experiences in and satisfaction with the TBL programs. The relationship between certain TBL program features and participant outcomes will also be examined using regression analysis.
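As an illustration of these descriptive tabulations, the following is a minimal sketch in Python using pandas; the input file and column names (satisfaction, delivery_mode, completed_training) are hypothetical placeholders, not the evaluation’s actual data layout.

    # Minimal sketch only; file and column names are hypothetical placeholders.
    import pandas as pd

    df = pd.read_csv("tbl_survey_responses.csv")  # hypothetical survey export

    # Frequency distribution of satisfaction ratings
    print(df["satisfaction"].value_counts(normalize=True))

    # Mean satisfaction score across respondents
    print(df["satisfaction"].mean())

    # Cross-tabulation of training completion by delivery mode
    print(pd.crosstab(df["delivery_mode"], df["completed_training"], normalize="index"))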

The size of the respondent universe will ensure there are a sufficient number of respondents to conduct subgroup analyses. Standard formulas for comparing differences in means and proportions will be used. In addition, standard regression techniques will be used to determine the presence of a relationship between specific program components and participant outcomes. Thus, for continuous dependent variables (e.g., wages), ordinary least squares regression analysis will be employed, as given by:

y = α + β1x1 + β2x2 + … + βnxn + e,

in which y is the outcome, α is the constant (representing the intercept), the β’s represent the regression coefficients for the corresponding x (independent) terms, and e represents the error term reflected in the residuals.
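A minimal sketch of how such a model might be fit, using Python’s statsmodels formula interface; the outcome and predictor names are hypothetical placeholders, not the evaluation’s actual variable list.

    # Minimal sketch only; variable names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("tbl_survey_responses.csv")  # hypothetical survey export

    # OLS regression of a continuous outcome (wages) on program features
    ols_model = smf.ols("wages ~ hours_of_training + age + prior_employment", data=df).fit()
    print(ols_model.summary())  # intercept (alpha), coefficients (betas), diagnostics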

For analysis of binomial dependent variables (e.g., completed training, obtained employment), logistic regression techniques will be used, as given by:

z = α + β1x1 + β2x2 + … + βkxk,

in which z is the logit (log odds) of the dependent variable, α is the constant, and the β’s represent the logistic regression coefficients for the corresponding x (independent) terms.
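The corresponding logistic fit, sketched under the same hypothetical variable names; the binary outcome is assumed to be coded 0/1.

    # Minimal sketch only; variable names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("tbl_survey_responses.csv")  # hypothetical survey export

    # Logistic regression of a binary outcome (coded 0/1) on program features
    logit_model = smf.logit("obtained_employment ~ hours_of_training + age + prior_employment", data=df).fit()
    print(logit_model.summary())  # constant (alpha) and log-odds coefficients (betas)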

  3. Degree of Accuracy. Because sampling will not be employed, results should be an accurate reflection of the relevant universe, subject to the constraints of reporting error and non-response bias. The contractor will gather data on some basic demographic characteristics of program participants from grantee administrative data systems, which will be used to assess whether non-respondents differ in any substantial way from respondents. Survey results are generalizable only to the characteristics and outcomes of the approximately 1,500 TBL program participants who are estimated to exit the program during the evaluation period.

  4. Unusual Problems. There are no unusual problems for this data collection requiring specialized sampling procedures.

  5. Periodic Data Collection. This survey will be administered only once to each respondent.

    3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

  1. Methods for Maximizing Response Rate

A mixed-mode strategy based on Dillman’s Tailored Design Method will be used to achieve a high response rate to the customer survey (Dillman, 2000). Recent literature suggests that when using web-based surveys, a mixed-mode approach is useful in reaching non-responders and obtaining higher response rates (Converse et al., 2008; de Leeuw, 2005; Shih & Fan, 2007). For this study, a web-based survey will be administered first because potential respondents, by virtue of participating in TBL training, will likely have access to the Internet and a basic level of comfort using web-based tools. In addition, administering the web-based survey first, followed later by mail and telephone contacts if necessary, will keep costs down, as it is the least expensive mode of contact.

Another method for increasing the response rate will be to provide advance notification about the survey. The contractor will send out an advance notification letter to potential respondents. This letter will describe the purpose and sponsorship of the survey, and will assure potential respondents that the forthcoming email, with the embedded link to the survey website, is not email spam and that the survey is for a legitimate research project. ETA’s contractor will work with grantees to obtain customers’ contact information and coordinate the distribution of these letters near the end of potential respondents’ training programs. In addition to the pre-notification letter, the contractor will encourage grantees to have TBL program instructors notify their students that a customer satisfaction survey will be sent to them shortly after they exit training.

Following participants’ estimated date of exit from the program, the contractor will send potential respondents an email with the embedded link to the survey. The email will reference the pre-notification letter, give a brief description of the survey and the incentive, give instructions for accessing the survey online, and include contact information for the contractor for questions regarding the survey. To access the survey from the email, respondents will need only to click on the web link.

Customers who have not responded within one week will receive a reminder email encouraging them to complete the survey, which will again include the embedded web link. Two weeks following this contact, non-responding customers will be mailed a paper copy of the questionnaire, along with a cover letter about the survey and the web link, so that they may choose to respond via the enclosed hard-copy questionnaire or via the Internet. Following this contact, non-responding customers will receive another reminder email and a reminder postcard. Finally, customers who do not respond to these last reminders will be telephoned directly in an effort to obtain their responses.

It is expected that these strategies, combined with the $15 gift card incentive, will yield a high response rate.

  2. Addressing Nonresponse

During the course of the evaluation, the contractor will coordinate with grantees to obtain administrative data on TBL customers, including basic demographic information. When the surveys of TBL customers are completed, and provided sufficient information on customers is available from administrative data records, the contractor will conduct a nonresponse analysis to assess whether survey non-responders differ significantly from respondents. If respondents differ significantly from non-respondents, sample weights will be developed.
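A minimal sketch of one common approach to developing such weights, a weighting-class adjustment based on demographics available for all exiters; the file and column names (gender, age_group, responded) are hypothetical placeholders, and the actual adjustment would depend on the administrative data obtained.

    # Minimal sketch only; file and column names are hypothetical placeholders.
    import pandas as pd

    frame = pd.read_csv("tbl_admin_records.csv")  # one row per program exiter

    # Form weighting classes from demographics known for both respondents and non-respondents
    frame["cell"] = frame["gender"].astype(str) + "_" + frame["age_group"].astype(str)

    # Response rate within each weighting class
    rates = frame.groupby("cell")["responded"].mean()

    # Nonresponse weight = inverse of the class response rate, attached to respondents
    respondents = frame[frame["responded"] == 1].copy()
    respondents["weight"] = respondents["cell"].map(1.0 / rates)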

    4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

ETA’s contractor has pretested the survey with no more than nine respondents. The pretests helped assess the clarity of the surveys’ content and wording, the organization and formatting of the questionnaire, the ease of online administration, the time burden on respondents, and potential sources of error. The information gathered from pretesting has been used to modify the questionnaire as appropriate.

    5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

ETA’s contractor for this study, Social Policy Research Associates, will conduct the analysis. Contact information for key staff and reviewers of this material is provided below.

Name: Kate Dunham, Social Scientist and Principal Investigator

Affiliation: Social Policy Research Associates

Telephone Number: 510.763.1499 x635

References

Atreja, A., N.B. Mehta, A.K. Jain, C.M. Harris, H. Ishwaran, M. Avital, and A.J. Fishleder. “Satisfaction with Web-based Training in an Integrated Healthcare Delivery Network: Do Age, Education, Computer Skills and Attitudes Matter?” BMC Medical Education, 2008, 8:48.

Cobanoglu, C. and N. Cobanoglu. “The Effects of Incentives in Web Surveys: Application and Ethical Considerations.” International Journal of Market Research, Vol. 45, quarter 4, 2003: pp. 475-488.

Converse, P.D., E.W. Wolfe, X. Huang, and F.L. Oswald. “Response Rates for Mixed-Mode Surveys Using Mail and E-mail/Web.” American Journal of Evaluation OnlineFirst, doi: 10.1177/1098214007313228, 2008.

DeBourgh, G.A., “Predictors of Student Satisfaction in Distance-Delivered Graduate Nursing Courses: What Matters Most?” Journal of Professional Nursing, Vol. 19, No. 3, 2003: pp. 149-163.

de Leeuw, E.D. “To Mix or Not to Mix Data Collection Modes in Surveys.” Journal of Official Statistics, Vol. 21, No. 2, 2005: pp. 233-255.

Dillman, D.A. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. New York, NY: John Wiley and Sons, Inc., 2000.

Drouin, M.A. “The Relationship Between Students’ Perceived Sense of Community and Satisfaction, Achievement, and Retention in an Online Course.” Quarterly Review of Distance Education, Vol. 9, Issue 3, 2008, pp. 267-284.

Göritz, A.S. “Incentives in Web Studies: Methodological Issues and a Review.” International Journal of Internet Science, 1 (1), 2006: pp. 58-70.

Hermans, C.M., D.L. Haytko, and B. Mott-Stenerson. “Student Satisfaction in Web-enhanced Learning Environments.” Journal of Instructional Pedagogies, Vol. 1, 2008.

O’Neil, K.M., S.D. Penrod, and B.H. Bornstein. “Web-Based Research: Methodological Variables’ Effects on Dropout and Sample Characteristics.” Behavior Research Methods, Instruments, & Computers, 2003, 35 (2): pp. 217-226.

Shih, T. and X. Fan. “Response Rates and Mode Preferences in Web-Mail Mixed-Mode Surveys: A Meta-Analysis.” International Journal of Internet Science, 2 (1), 2007: pp. 59-82.


