
Evaluation of Early Learning Mentor Coaches (ELMC) Grants

OMB: 0970-0399









The Evaluation of Early Learning Mentor Coaches (ELMC) Grants


Supporting Statement for OMB Clearance Request
Part A






June 2011, updated November 4, 2011





Part A

A. Justification

This section provides supporting statements for each of the eighteen points outlined in Part A of the OMB guidelines for the collection of information in the Evaluation of Early Learning Mentor Coaches (ELMC) Grants. The ELMC grants were awarded in September 2010 by the Office of Head Start to 131 Head Start (HS) and Early Head Start (EHS) grantees in 42 states and the District of Columbia.1 The aim of these grants is to support the use of a grantee-defined mentor-coaching model to improve early childhood teaching practices that lead to positive outcomes. The collection of information for the evaluation of the ELMC grants will be carried out immediately following OMB clearance (estimated November 2011) to (1) describe the implementation of the ELMC grants in HS and EHS programs, (2) examine the implementation quality of ELMC efforts, and (3) examine factors that appear to be related to successful mentor-coaching. Data collected from this evaluation will be an important first step in developing profiles of mentor-coaching approaches to inform policy, practice, and research examining mentor-coaching. Documenting the implementation of the ELMC grants will lay the groundwork for future evaluations of mentor-coaching initiatives. Finally, the information collected in this evaluation will contribute to the knowledge base of the early childhood field, both in research and professional development.


A1. Circumstances Making the Collection of Information Necessary

The Administration for Children and Families (ACF), Office of Planning, Research, and Evaluation (OPRE), in the U.S. Department of Health and Human Services (HHS), seeks Office of Management and Budget (OMB) clearance for the data collection activities described in this request. These activities support an evaluation of the new ELMC initiative being conducted for ACF under contract by the American Institutes for Research (AIR) and its subcontractors, MEF Associates, NORC, and a team of leading technical experts. The ELMC evaluation will produce descriptive information on ELMC grant implementation, examine implementation quality, and examine the factors that appear to be related to successful mentor-coaching in order to provide profiles of promising mentor-coaching programs. This submission requests clearance for a four-month period of data collection from 130 ELMC grantees. This section provides an overview of the evaluation and discusses its objectives and the need for the proposed information collection.


Many studies have shown that early childhood education programs have been effective in narrowing early achievement gaps,2 but to ensure program success, effective professional development for teachers – including training and mentor-coaching – is critical.3 Historically, mentor-coaching across early childhood settings, including both infant/toddler and preschool settings, has taken many forms. Research suggests that there is a great deal of variation in what is referred to as mentoring or coaching, that it can refer to any number of professional development strategies,4 and that there is no existing typology of an early childhood mentor-coach. Nevertheless, research has shown effects of various types of mentor-coaching on teaching5 and child outcomes.6 However, less attention has been paid to the implementation components, quality aspects, and successful factors of mentor-coaching programs. There is a need to understand the characteristics of professional development programs, characteristics of educators, and the contextual factors that might affect the effectiveness of professional development interventions like mentor-coaching programs within research and evaluation studies.7


The lack of a clear typology of mentor-coaching in early care and education makes both program implementation and evaluation challenging. The need for information about which methods of providing professional development are most promising for improving early childhood programs, teacher practice, and child outcomes is particularly great. In addition, information is needed to provide guidance and support for improving similar professional development efforts. There is an emerging consensus on a set of dimensions that may assist in defining mentor-coaching (see Appendix A1); understanding where a mentor-coaching initiative falls on these dimensions is important when programs vary across sites and settings, as with the ELMC grants. The variety among the ELMC grants reflects the urgent need to identify a typology, a conceptual framework, and strategies that define a mentor-coaching system to support early care and education teaching staff in further developing their skills. This evaluation is a first step toward building the knowledge that is needed for programs to achieve the overarching professional development goal of promoting positive, sustained outcomes for young children.


Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection at the discretion of the agency.


Evaluation Objectives

The overall objective of the ELMC evaluation is to identify the critical aspects of the ELMC grant initiative by: (1) describing the implementation of the ELMC grants in HS and EHS programs; (2) examining the implementation quality of ELMC efforts; and (3) examining factors that appear to be related to successful mentor-coaching. The evaluation will target current recipients and beneficiaries of the ELMC grants, including HS and EHS administrators, mentor-coaches, and staff who are mentor-coached, across the 130 ELMC grantees.


Research questions to be addressed by the evaluation include:

  • Implementation: What are key features of the mentor-coaching program model/approach? How is the grantee structuring the mentor-coaching initiative? What perceived effect did these grant funds have on the grantee, program, and staff who were mentor-coached? What are grantees doing to integrate and sustain mentor-coaching beyond the grant period?

  • Quality: Is mentor-coaching being implemented as expected (and according to the program model, if a particular model is being used)? To what extent are the nature and extent of implementation of mentor-coaching consistent with the planned approach? What is the quality of the mentor-coaching as it is being implemented on the ground? Are staff who are mentor-coached ready, willing, and able to participate in the mentor-coaching as expected? (How) Are staff who are mentor-coached changing their behavior as a result of coaching?

  • Important Factors: What were the successes and challenges in implementing the mentor-coaching? What role did characteristics of mentor-coaches, features of the mentor-coaching approach, the quality of the relationship between the mentor-coach and the staff who were mentor-coached, behaviors and attitudes of the staff who are mentor-coached, and contextual factors play in the implementation and the perceived success of the mentor-coaching? What do mentor-coaches and staff who are mentor-coached perceive to be the most critical elements of successful mentor-coaching regarding basic, structural, procedural, and quality dimensions? What did staff who are mentor-coached find the most and the least helpful about mentor-coaching?


A2. Purpose and Use of the Information Collection

Achieving the objectives of the ELMC evaluation requires a comprehensive data collection strategy across the administrative, mentor-coach, and staff who are mentor-coached layers of the ELMC grantees. To do this, we propose an evaluation employing the following key information collection activities:


  • Reviewing grantee proposals and gathering publicly-available Program Information Report (PIR) data on each grantee

  • Recruiting each of the 130 grantees to participate in the implementation evaluation

  • Administering a web-based grantee census survey to each of the 130 grantees

  • Administering a web-based mentor-coach census survey to each of the mentor-coaches across the 130 grantees

  • Conducting telephone interviews with a sampled selection of mentor-coaches and staff who are mentor-coached from 65 sampled grantees

  • Analyzing the data collected and preparing a report based on the results


OPRE requests clearance for the information collection activities outlined above. The data collection instruments are described in the Data Collection section of Part B of this package. The data gathered through the information collection will be used by ACF and the contractor staff to develop a report that describes the implementation of the grants, the quality of those implementation efforts, and the factors that may be related to successful mentor-coaching. Not collecting the information would diminish the degree to which we are able to explain the implementation of the grants and identify the important aspects of mentor-coaching in early childhood programs. The report will include a conceptual framework and typology of mentor-coaching in early childhood which will strengthen professional development policy and practice efforts and improve mentor-coaching research in the field.


A3. Use of Improved Information Technology and Burden Reduction

The grantee census survey and the mentor-coach census survey will be administered online using a secured, web-based software program (Vovicci), with the use of unique links for each grantee and each mentor-coach invited to participate. This technology enables respondents to complete the survey at a time and place that is most convenient for them, and also enables respondents to complete the survey in one or multiple sittings. The evaluation will also include telephone interviews with a sampled selection of mentor-coaches and staff who are mentor-coached. These telephone interviews will allow the evaluation team to ask questions that require more in-depth responses from respondents and give the opportunity to clarify ambiguous or conditional responses.


A4. Efforts to Identify Duplication and Use of Similar Information

Because the ELMC grant program is a new initiative, there have been no cross-site data collection efforts. Although there have been previous federal mentoring, coaching, and consultation initiatives, this federal implementation evaluation examines program-designed efforts to change teaching behavior through the use of mentor-coaches. No existing data sources can provide this information.


A5. Impact on Small Businesses or Other Small Entities

Not applicable. No small businesses or entities will be involved as respondents.


A6. Consequences of Collecting Information Less Frequently

All information collection activities will be conducted only once, during the brief project period of the ELMC grants. Not collecting this information would limit the utility of the ELMC grant initiative, because there would be no way to describe the implementation of the grants, the quality of the implementation or the factors that are associated with successful mentor-coaching initiatives. Identifying this information informs the design of future mentor-coaching initiatives in early childhood programs and provides the basis for evaluating such initiatives.


A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances for the proposed data collection.


A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to collect information as part of an implementation evaluation of the Early Learning Mentor-Coach Initiative. This notice was published on November 19, 2010, Volume 75, Number 223, page 70931, and provided a sixty-day period for public comment. A copy of this notice is included in Appendix A2. During the notice and comment period, the government received comments from two sources; comments provided by the first source did not relate to the instruments or data collection. Comments from the second source (source #2) and the contracting team’s responses are summarized below.


Source #2, Comment #1: This comment questioned whether the information collection was necessary and encouraged the use of information technology to minimize burden. Response to Source #2, Comment #1: Care was taken in the development of the data collection protocols to ensure that costs were minimized while maximizing the resulting information. Extant data will be reviewed to ensure that duplicated data are not collected. Additionally, care was taken to ensure that the respondent burden required to complete each protocol aligns with the information required to address the purposes of the evaluation and to develop profiles of mentor-coaching approaches to inform policy, practice, and research examining mentor-coaching. All surveys and the snapshot will be collected using a secured, web-based software program (Vovicci), and all interviews will be conducted by telephone.


Source #2, Comment #2: This comment concerned the quality of the measurement tools to be used, particularly regarding the measurement of relationship quality. Response to Source #2, Comment #2:

To ensure the evaluation team developed the most appropriate data collection protocols, they were vetted with the evaluation’s technical experts, the consultant group (which includes practitioners), and Federal staff from OHS and OPRE. Earlier versions of the protocols have also been pilot tested with Head Start and Early Head Start staff.


In addition to members of the evaluation team and OHS and OPRE staff, the technical experts and consultant group members of the evaluation of the ELMC grants were contacted for advice on aspects of the evaluation design and data collection instruments. Their feedback was obtained through web-based online meetings and telephone conversations. The affiliations of the technical experts and consultant group members are listed in Exhibit 1.


Exhibit 1. Technical Experts and Consultant Group Members of the Evaluation of the ELMC Grants

Technical Experts

| Name | Affiliation |
| Bridget Hamre | Associate Director of Center for Advanced Study of Teaching and Learning, University of Virginia |
| Carol Vukelich | Professor in Teacher Education and Director of Delaware Center for Teacher Education, University of Delaware |
| Samantha Wulfsohn | Early Childhood Consultant and Licensed Psychologist, New York City |

Consultant Group

| Name | Affiliation |
| Douglas Powell | Distinguished Professor of Developmental Studies, Purdue University |
| Helen H. Raikes | Professor of Child, Youth, and Family Studies, University of Nebraska-Lincoln |
| Carol M. Trivette | Co-Director, Orelena Hawks Puckett Institute |
| Barbara Hanna Wasik | Professor in School of Education and Director of Center for Home Visiting, University of North Carolina |
| Martha Zaslow | Senior Scholar, Child Trends, and Director, Society for Research in Child Development Policy and Communications Office |

To date, we have had two full meetings with the technical experts and consultant group members (May 2011, July 2011) to provide input and advice on the evaluation’s conceptual model, research questions, study design, methodology, and data collection protocols. These materials were developed by the evaluation team with input from OPRE, all of whom have substantial knowledge in the key constructs to be measured as a part of the ELMC evaluation. At least one more meeting is scheduled to occur in the remainder of the 18-month evaluation period to provide additional input and advice on data analysis and approaches for reporting findings as they emerge. Project staff also will use the technical experts individually for consultation on an as-needed basis.


A9. Explanation of Any Payment or Gift to Respondents

The ELMC evaluation is using gift certificates instead of cash because all contacts will be over the internet and by telephone. To boost response rates and to compensate respondents for their time, the ELMC evaluation will provide electronic gift certificates to the following respondents:

  • Grantee Census Survey: respondents will not receive an incentive

  • Mentor-Coach Census Survey: each respondent will receive a $20 electronic gift certificate

  • Mentor-Coach Interview: each respondent will receive a $25 electronic gift certificate

  • Staff Interview: each respondent will receive a $25 electronic gift certificate


We propose to deliver the incentives electronically from a company that offers supplies appropriate for infants, toddlers, and preschoolers that respondents can purchase for use in their work. Gifts are generally effective incentives for participants: response rates have been found to be significantly higher in surveys where a monetary or gift incentive was offered (Szelényi, Bryant, & Lindholm, 20058; Brick, Hagedorn, Montaquila, Roth, & Chapman, 20069). Gift certificates are widely accepted and have been found useful in survey research (e.g., Kay, Boggess, Selvavel, & McMahon, 200110). Additionally, gift certificates have served as a form of incentive in a wide range of survey research, including research with Head Start grantees (e.g., Head Start Impact Study Third Grade Follow-up; Head Start Impact Study Tracking Survey). We believe that gift certificates will be even more effective in this economic climate, because teaching staff are in great need of resources for their programs. Electronic gift certificates have grown in use among the general population as a gift-giving device (Offenberg, 200711). We are aware of survey research that supports the use of cash over other types of incentives, including non-monetary incentives, as a mechanism for increasing response rates; however, research comparing the effectiveness of electronic gift certificates to other types of gift certificates is in its infancy. At least one study found no significant difference in response rates between paper and electronic gift certificates (Birnholtz, Horn, Finholt, & Bae, 200412); however, that research did not consider electronic gift certificate incentives as an inducement for the Head Start/early childhood education population. Given the growing use of electronic gift certificates among the general public each year, it is probable that they are becoming a more effective incentive.
Electronic gift certificates are also more practical than either cash or paper gift certificates; distributing cash or paper gift certificate incentives would be impractical in our current design (each incentive would require an estimated .50 in staff hours and $0.44 in mailing costs). We also believe the gift certificates are pragmatic within this relatively small evaluation, and are necessary because of the quick start-up of the evaluation and the relatively short amount of time remaining in the ELMC grant period in which to complete all data collection activities. Our incentive approach is intended not only to improve response rates, but also to encourage rapid responses and to thank the Head Start and Early Head Start teaching staff and mentor-coaches for their valuable time.


The evaluation team believes that the use of monetary incentives above and beyond the methods that will be used to increase response rates (see Part B) is justified for a number of reasons:

  • Avoidance of non-response bias – the evaluation team is concerned with maximizing response rates because the purpose of the evaluation is to describe implementation, which requires obtaining a census of all grantees and mentor-coaches and response rates as high as possible on the selected sample across all of the important stratification variables (see Part B).

  • Improved coverage of specialized respondents – the evaluation team is collecting data from respondents that work directly with low-income families and children, and some of the programs are migrant or seasonal or serve other low-incidence populations such as American Indian/Alaska Native. These populations are typically excluded from evaluation studies, but will be selected with certainty for the ELMC evaluation (see Part B). The use of a monetary incentive should help to maximize responses from these respondents, especially because the monetary incentive is approximately equivalent to the level of effort required for their response.

  • Decreased amount of time spent doing follow-up with non-respondents – the evaluation team plans to conduct follow-up outreach with non-respondents; however, the use of a monetary incentive may decrease the amount of time spent in follow-up activity, thereby decreasing the overall costs of the information collection.


A10. Assurance of Confidentiality Provided to Respondents

All persons completing the grantee and mentor-coach census surveys and all persons interviewed by telephone for the ELMC evaluation will be assured that the information they provide will not be released in a form that individually identifies them. When respondents click the link to access a survey, they will see a consent screen (see page 1 of attachments 1 and 2). When a respondent begins a telephone interview, the consent information will be read aloud at the beginning of the interview and the interviewer will ask for verbal consent (see page 1 of attachments 3 and 4). Participants will be told that their conversations will be kept private to the full extent allowable by law, and that individually identifying information will not be attached to any public reports or data supplied to the U.S. Department of Health and Human Services or any other researchers.


To ensure that the data collected are not available to anyone other than authorized project staff, a set of standard confidentiality procedures will be followed:

  • All research evaluation staff will sign an assurance of confidentiality;

  • All research evaluation staff will keep completely confidential the names of all respondents, all information or opinions collected during the course of the study, and any information about respondents learned incidentally;

  • Reasonable caution will be exercised in limiting access to data collected only to persons working on the project who have been instructed in the applicable confidentiality requirements for the project (this includes following encryption requirements for electronic data and providing secure locations, with limited access, for hardcopy of data);

  • The Project Director will be responsible for ensuring that all contractor personnel involved in handling data on the project are instructed in these procedures and will comply with these procedures throughout the study; and

  • The Project Director will ensure that the data collection process adheres to provisions of the U.S. Privacy Act of 1974 with regard to surveys of individuals for the Federal government.


During the evaluation, all necessary information and documents will be kept in locked file cabinets (for hard copies) or password-protected files (for electronic copies) accessible only by project staff under the supervision of the Project Director. After the project is completed, the contractors will destroy all identifying information.


A11. Justification for Sensitive Questions

There are no personally sensitive questions in this data collection.


A12. Estimates of Annualized Burden Hours and Costs

This proposed information collection does not impose a financial burden on respondents. Respondents will not incur any expenses other than the time spent responding to the data collection protocols.


Exhibit 2 summarizes the estimates of reporting burden for each of the four data collection protocols submitted for OMB clearance: (1) grantee census survey; (2) mentor-coach census survey; (3) mentor-coach telephone interview; and (4) staff telephone interview. The total estimated annual burden is derived from the total number of completed data collection protocols and the time required to complete each protocol.


Exhibit 2. Time Burden for Respondents

| Instrument | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Average Hourly Wage | Total Annual Burden Hours | Estimated Monetary Cost of Burden |
| Grantee Census Survey | 130 | 1 | 0.5 | $28.12 | 65 | $1,827.80 |
| Mentor-Coach Census Survey | 400 | 1 | 0.5 | $21.74 | 200 | $4,348.00 |
| Mentor-Coach Telephone Interview | 65 | 1 | 1.0 | $21.74 | 65 | $1,413.10 |
| Staff Telephone Interview | 130 | 1 | 1.0 | $16.61 | 130 | $2,159.30 |
| Annual Estimate | | | | | 460 | $9,748.20 |


As shown in Exhibit 2, the total estimated burden for the ELMC evaluation is expected to be 460 hours, and the total estimated cost to respondents for participating in the ELMC data collection protocols is $9,748.20 ($1,827.80 for grantee administrative staff, $5,761.10 for mentor-coaches, and $2,159.30 for teaching staff). To compute the total estimated annual cost, the total burden hours were multiplied by the average hourly wage for education administrators, preschool and child care center/program ($28.12/hour), preschool and kindergarten teachers ($21.74/hour), and preschool teachers, except special education ($16.61/hour) (Bureau of Labor Statistics, National Compensation Survey, 2009).
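The burden arithmetic in Exhibit 2 can be checked with a short script (a verification sketch only; all figures are taken directly from the exhibit, and the wage rates are the Bureau of Labor Statistics categories cited above):

```python
# Verify the burden-hour and cost figures reported in Exhibit 2.
rows = [
    # (instrument, respondents, responses each, hours per response, hourly wage)
    ("Grantee Census Survey",            130, 1, 0.5, 28.12),
    ("Mentor-Coach Census Survey",       400, 1, 0.5, 21.74),
    ("Mentor-Coach Telephone Interview",  65, 1, 1.0, 21.74),
    ("Staff Telephone Interview",        130, 1, 1.0, 16.61),
]

total_hours = 0.0
total_cost = 0.0
for name, n, k, hours, wage in rows:
    burden_hours = n * k * hours          # total annual burden hours per instrument
    cost = burden_hours * wage            # estimated monetary cost of burden
    total_hours += burden_hours
    total_cost += cost
    print(f"{name}: {burden_hours:.0f} hours, ${cost:,.2f}")

print(f"Annual estimate: {total_hours:.0f} hours, ${total_cost:,.2f}")
# Annual estimate: 460 hours, $9,748.20
```

Each row reproduces the exhibit exactly (e.g., 130 respondents × 1 response × 0.5 hours × $28.12 = $1,827.80), confirming the $9,748.20 total.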


A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

Not applicable.


A14. Annualized Cost to the Federal Government

The annualized cost to the Federal Government for the period of data collection is estimated at $411,929 ($51,491 per month).


A15. Explanation for Program Changes or Adjustments

This submission to OMB is a new request for approval.


A16. Plans for Tabulation and Publication and Project Time Schedule

A16.1 Analysis Plan

The analyses will include a range of descriptive, thematic, and multivariate approaches, depending on the research question being addressed and the types of data being analyzed. All analyses will report non-response (missing data) and potential resulting bias. The analyses will include univariate descriptive statistics of quantitative data (i.e., central tendency, range, variance, frequencies) to describe how the mentor-coach models were implemented. Qualitative thematic analysis will be used for text data from open-ended survey responses and interviews. Although this addresses only one of the ten research questions and is a minor part of the evaluation, we will also conduct inferential analyses as appropriate for the types of data collected (i.e., categorical, correlational, and regression analyses) that link the grantee census survey data (independent variables) with relevant mentor-coach census survey data (outcome variables) to elucidate relationships between variations in grantees’ approaches and mentor-coach practices.


Nine of the ten evaluation research questions are largely exploratory in nature (rather than hypothesis-testing or confirmatory questions) and focus on describing the ELMC initiative. In addition to describing the ELMC initiative, as detailed in the justification of this data collection effort in Section A, a secondary objective of the evaluation team is to use the collected data to create a model of mentor-coaching, identify potential mentor-coaching typologies, and articulate testable hypotheses about the associations among grantee, mentor-coach, and staff variables that could be used to frame future programming and be empirically examined in another study. We plan to use both the survey and interview data to describe what the ELMC initiative looks like, from the perspectives of multiple respondents, triangulating the various types and sources of data in our analysis.


For the survey data, we plan to use primarily descriptive analysis methods, producing univariate statistics (e.g., distributions, central tendencies, frequencies, proportions). To address the research question concerning the role that different features played in the implementation and perceived success of the mentor-coaching, we will also conduct categorical and inferential analyses, as possible and appropriate for the data, to explore possible associations across a range of variables describing grantees, mentor-coaching approaches, and mentor-coach characteristics. For the interview data and open-ended survey data, we will conduct thematic analysis to identify and report patterns within the qualitative data. This includes enumerative (counting categories) and typological (classifying categories) analytical approaches. Thematic analysis will be used across the full interview data and open-ended survey responses to discover and categorize data representing predominant and important themes, providing information about the experiences and meanings of ELMC grantees, mentor-coaches, and program-level staff. Thematic analyses will use a semantic approach, in which themes are identified based on the explicit meanings of the data; in other words, what has been explicitly said in the interviews.
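The enumerative step described above amounts to tallying how often each coded theme appears across respondents. A minimal sketch of that tally is shown below; the theme labels and responses are hypothetical, invented for illustration, and are not drawn from the ELMC instruments:

```python
from collections import Counter

# Hypothetical coded open-ended responses: each inner list holds the
# theme codes assigned to one respondent's answer (labels are illustrative).
coded_responses = [
    ["feedback", "trust"],       # respondent 1
    ["feedback", "scheduling"],  # respondent 2
    ["trust"],                   # respondent 3
    ["feedback"],                # respondent 4
]

# Enumerative approach: count how many respondents mention each theme.
counts = Counter(theme for codes in coded_responses for theme in codes)
n = len(coded_responses)
for theme, c in counts.most_common():
    print(f"{theme}: {c} of {n} respondents ({c / n:.0%})")
# feedback: 3 of 4 respondents (75%)
```

The same counts would then feed the typological step, where categories are grouped into broader classes of mentor-coaching experience.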


A16.2 Time Schedule and Publications

The estimated time schedule for fielding, analyzing and reporting the data findings for the cross-site evaluation of the ELMC grants is as follows:


Grantee Census Survey (November 2011 – February 2012)

Mentor-Coach Census Survey (November 2011 – February 2012)

Mentor-Coach Interview (December 2011 – February 2012)

Staff Interview (December 2011 – February 2012)

Data Analyses (February 2012 – June 2012)

Report Development (March 2012 – September 2012)


One substantive report will document the results of the data analyses conducted on the data collected from the surveys and telephone interviews. This report will include the conceptual model that guided the evaluation, the research questions, a description of the study design, a description of the methodology and measures, and the analyses, results, and conclusions. This report is currently expected to be submitted in final version to ACF in September 2012.


A17. Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable. All instruments for the Evaluation of the ELMC Grants will display the expiration date for OMB approval.


A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary or requested for this information collection.

1 Note: One grantee returned its grant money to the Office of Head Start, so the grantee population is now 130.

2 For example, see Gormley, Phillips, & Gayer (2008); Camilli, Vargas, Ryan, & Barnett (2010); Heckman, Moon, Pinto, Savelyev, & Yavitz (2010).

3 Bogard & Takanishi (2005); Zaslow & Martinez-Beck (2006)

4 Edwards, 2003

5 For example, see Miller (1994); Neuman & Cunningham (2009); Villar & Strong (2007); Rudd, Lambert, Satterwhite, & Smith (2009).

6 For example, see Biancarosa, Bryk, Greenberg, Cor, Haertel, Fountas, Pinnell, Scharer, & Dexter (2010); Powell & Diamond (2009); Powell, Diamond, Burchinal, & Koehler (2010); Fukkink & Lont (2007).

7 Zaslow (2009)

8 Szelényi, K., Bryant, A. N., & Lindholm, J. A. (2005). What money can buy: Examining the effects of prepaid monetary incentives on survey response rates among college students. Educational Research and Evaluation, 11(4), 385-404.

9 Brick, J.M., Hagedorn, M.C., Montaquila, J., Brock Roth, S., Chapman, C. (2006). Impact of Monetary Incentives and Mailing Procedures: An Experiment in a Federally Sponsored Telephone Survey (NCES 2006-066). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

10 Kay, W.R., Boggess, S., Selvavel, K., & McMahon, M.F. (2001). The Use of Targeted Incentives to Reluctant Respondents on Response Rate and Data Quality. 2001 Proceedings of the annual Meeting of the American Statistical Association.

11 Offenberg, J.P. (2007). Markets: Gift Cards. Journal of Economic Perspectives, 21, 227-238.

12 Birnholtz, J. P., Horn, D. B., Finholt, T. A., & Bae, S. J. (2004). The effects of cash, electronic, and paper gift certificates as respondent incentives for a web-based survey of technologically sophisticated respondents. Social Science Computer Review, 22(3), 355-362.

