Master Generic Plan for Customer Surveys and Focus Groups

OMB: 1800-0011

Appendix D - Generic Clearances

DOCUMENTATION FOR THE GENERIC CLEARANCE

OF CUSTOMER SERVICE SATISFACTION COLLECTIONS



TITLE OF INFORMATION COLLECTION: Evaluation of the Relevance and Utility of National Center for Education Evaluation (NCEE) Products: Experts Bring Evidence to Practitioners (EEP) Events (10-day review request)


[X] SURVEY [ ] FOCUS GROUP [ ] SOFTWARE USABILITY TESTING


DESCRIPTION OF THIS SPECIFIC COLLECTION


  1. Intended Purpose


This submission requests approval of data collection activities that will support an evaluation of research dissemination events conducted by Regional Educational Laboratories (RELs) and sponsored by the U.S. Department of Education’s Institute of Education Sciences (IES) through its National Center for Education Evaluation and Regional Assistance (NCEE). Experts Bring Evidence to Practitioners (EEP) events are designed to bring practitioners the latest findings from IES evaluation research and reviews of “what works.” At the events, IES evaluation experts summarize the findings of an IES report and give practitioners a chance to hear expert reviews or summaries of rigorous new studies testing alternatives important to school improvement. The objective of the evaluation is to estimate the extent to which attendees perceive the EEP events as relevant and useful.


IES is conducting the evaluation as part of a larger contract, the Analytic and Technical Support for Advancing Education Evaluations, hereafter referred to as ATS. Mathematica Policy Research, Inc. (MPR) and its subcontractor CommunicationWorks are implementing the evaluation.


In this package, we are requesting approval to conduct three data collections:


  • A list of the names of EEP event registrants, collected from the REL sponsoring each event

  • A web survey of EEP attendees conducted one week after the event

  • A protocol for semi-structured telephone interviews of a sample of attendees conducted six months after the event


  2. Need for the Collection


Under the IES authorizing legislation, the Education Sciences Reform Act of 2002, Section 171(b) states that the NCEE mission shall be “To provide technical assistance; To conduct evaluations of Federal education programs administered by the Secretary (and as time and resources allow, other education programs) to determine the impact of such programs (especially on student academic achievement in the core academic areas of reading, mathematics, and science); To support synthesis and wide dissemination of results of evaluation, research, and products developed; and to encourage the use of scientifically valid education research and evaluation throughout the United States.”


The evaluation is essential to identifying the extent to which RELs are disseminating syntheses of findings from NCEE-supported research and to determining whether the information is perceived as relevant and useful by its intended audience: education practitioners, researchers, and policymakers. The goal of the Post–EEP Event Survey is to collect information from attendees within one week of each event to assess respondents’ perceptions immediately following their experience (Appendix A). The Followup EEP Event Survey will be conducted approximately six months after the event to learn whether attendees have used the information presented at the event, in what context, and for what purpose (Appendix B). Additional analyses will determine whether presentation formats or product factors are associated with greater dissemination and use, and whether any participant characteristics are correlated with perceptions of usefulness or reported usage.


The two surveys address the following primary research questions:


  • To what extent do attendees at REL-sponsored EEP events perceive the information presented as relevant and useful to their work immediately after attending the event?

  • To what extent do EEP event attendees report the use and sharing of information in the six months following the event?

The evaluation also will address the following secondary research questions:


  • To what extent do EEP attendees differ in the type of work that they do, their use of education research, and their reported knowledge of the topic presented at the EEP before attendance?

  • How does the perceived relevance and reported use of information vary by attendees’ job type and prior knowledge?

  3. Planned Use of the Data


The ATS study’s data collection will give ED and IES useful information for effectively targeting and meeting the needs of NCEE stakeholders. The survey data will permit IES and NCEE to determine which materials EEP event attendees find most effective and beneficial, how NCEE-supported research and products are used, and what research and products are needed in the future. The data will allow NCEE to better serve the informational needs of its target audiences by bringing the latest and best research and proven practices into school improvement efforts, especially in reading, mathematics, and science.


  4. Date(s) and Location(s)


Data will be collected for EEP events across the country. The timeline for this study is shown in the following table:

Schedule of Activities

Activity                            Schedule
Collect names of EEP attendees      April 2009–April 2010
Conduct Post–EEP Event Survey       April 2009–April 2010
Conduct Six-Month Survey            October 2009–October 2010
Analysis and report of findings     June 2011



  5. Collection Procedures


To answer the primary and secondary research questions, MPR will first develop a sampling frame of EEP event attendees. Before each event, MPR will contact the REL sponsoring the event and request a list of registrants’ names, email addresses, and telephone numbers. One week after the EEP event, MPR will email all attendees an invitation to participate in the survey. The invitation will include a link to a short, closed-ended, web-based questionnaire. The questionnaire used for the Post–EEP Event Survey will be standard across all events. To improve the response rate, MPR will send a followup email request to non-respondents two weeks after the initial email invitation; three weeks after each event, MPR will make limited followup telephone calls to non-respondents.


Six months after each event, MPR will conduct the Followup EEP Event Survey with a randomly selected subsample of approximately one-third of event attendees regardless of whether they responded to the initial web survey.
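
The memo does not specify the software or procedure used to draw the one-third subsample; purely as an illustration, a simple random draw from an event’s attendee list could look like the following sketch (the attendee names, fraction, and seed shown here are hypothetical).

    import random

    def draw_followup_subsample(attendees, fraction=1/3, seed=2009):
        """Illustrative only: randomly select roughly one-third of an event's
        attendees for the six-month Followup EEP Event Survey."""
        rng = random.Random(seed)                       # fixed seed makes the draw reproducible
        sample_size = round(len(attendees) * fraction)  # about one-third of the attendee list
        return rng.sample(attendees, sample_size)

    # Hypothetical example: 90 attendees at one event yields 30 followup cases.
    attendees = ["attendee_%03d" % i for i in range(90)]
    print(len(draw_followup_subsample(attendees)))      # 30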


The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. To conduct the Post–EEP Event Survey, we will use a web-based data collection method that will be programmed to accept only valid responses and to check for logical consistency across answers. Respondents will thus be able to correct any errors as they complete the survey, minimizing the need for later contacts to obtain missing data or clarify inconsistent data. An added advantage of web-based data collection is that respondents may complete the survey at their convenience. An email invitation sent to EEP attendees will contain a URL link to the web-based survey and a unique user ID and password (Appendix C).
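
The memo does not describe how the web instrument’s validation is implemented; purely as an illustration of the range and consistency checks mentioned above, a sketch along the following lines (with hypothetical question names and valid values) conveys the idea.

    def validate_response(answers):
        """Illustrative only: range and skip-logic checks for one survey response.
        `answers` maps hypothetical question IDs to the values a respondent entered."""
        errors = []
        # Accept only valid responses: ratings must fall on a 1-5 scale.
        if answers.get("relevance_rating") not in (1, 2, 3, 4, 5):
            errors.append("relevance_rating must be an integer from 1 to 5")
        # Logical consistency across answers: a respondent who reports not sharing
        # event materials should not also list audiences they shared them with.
        if answers.get("shared_materials") == "no" and answers.get("shared_with"):
            errors.append("shared_with must be empty when shared_materials is 'no'")
        return errors

    # Hypothetical example: an out-of-range rating and an inconsistent answer are
    # both flagged so the respondent can correct them before submitting.
    print(validate_response({"relevance_rating": 6, "shared_materials": "no",
                             "shared_with": ["colleagues"]}))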

Individuals who choose not to respond to the web-based survey will be able to print a Portable Document Format (PDF) version from the web for faxing or mailing to MPR. Respondents may also request participation through two other modes: (1) by standard mail and (2) by telephone. It is important to offer these other modes of response to make the survey as convenient as possible, thus increasing the response rate. Attendees who have not completed the survey will receive one email reminder encouraging them to respond; the names of subsequent non-responders will be sent to MPR’s Survey Operations Center for telephone followup. For respondents with questions about the study, all email communications will include access to Frequently Asked Questions (FAQ) (Appendix D) along with a project-specific email address and a toll-free telephone number.

  6. Number of Focus Groups, Surveys, Usability Testing Sessions


This request includes two data collections from event attendees: a web survey conducted one week after each EEP event and a followup telephone interview conducted six months after each event. Sample sizes are provided in the burden table below.

  7. Description of Respondents/Participants


The sample members for the surveys are EEP event attendees. The attendees are typically education practitioners and policymakers employed by federal or state education agencies, professional associations, school districts, colleges or universities, research companies, newspapers or other media producers, and public policy companies.

Copies of the two proposed surveys are attached.


AMOUNT OF ANY PROPOSED STIPEND OR INCENTIVE


No financial incentives or gifts will be offered to respondents.


BURDEN HOUR COMPUTATION (number of responses × estimated response or participation time in minutes ÷ 60 = annual burden hours):


Category of Respondent            No. of Respondents       Participation Time       Burden
Post-EEP Event Respondent         2,640 event attendees    10 minutes (.17 hour)    440 hours
Followup EEP Event Respondent     880 event attendees      30 minutes (.50 hour)    440 hours
Totals                            3,520 event attendees                             880 hours
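
The burden figures above follow directly from the formula stated in the heading (responses multiplied by participation time in minutes, divided by 60). Purely as an illustration, the table can be checked with a short script such as the sketch below, using the respondent counts and participation times from the table.

    # Illustrative check of the burden table: responses x minutes / 60 = annual burden hours.
    rows = [
        ("Post-EEP Event Respondent", 2640, 10),     # 2,640 attendees, 10-minute web survey
        ("Followup EEP Event Respondent", 880, 30),  # 880 attendees, 30-minute telephone interview
    ]
    total_hours = 0.0
    for category, respondents, minutes in rows:
        hours = respondents * minutes / 60
        total_hours += hours
        print("%s: %.0f hours" % (category, hours))  # 440 hours for each row
    print("Total: %.0f hours" % total_hours)         # 880 hours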



BURDEN COST COMPUTATION


There are no cost burdens to respondents.




REQUESTED APPROVAL DATE:


NAME OF CONTACT PERSON:


TELEPHONE NUMBER:


MAILING LOCATION:


ED DEPARTMENT, OFFICE:

PAPERWORK REDUCTION ACT

CHANGE WORKSHEET



Agency/Subagency: U.S. Department of Education / office name

OMB Control Number: 1800-0011 v. #


Enter only items that change                                Current Record    New Record

Agency form number(s)                                       NA                NA

Annual reporting and record keeping hour burden
  Number of respondents                                     70,000            70,000
  Total annual responses                                    70,000            70,000
  Percent of these responses collected electronically       80%               80%
  Total annual hours                                        25,000            25,000
  Difference                                                                  0
  Explanation of difference
    Program change                                                            0
    Adjustment                                                                0

Annual reporting and record keeping cost burden (in thousands of dollars)
  Total annualized capital/startup costs                    0                 0
  Total annual costs (O&M)                                  0                 0
  Total annualized cost requested                           0                 0
  Difference                                                                  0
  Explanation of difference
    Program change adjustment                                                 0

Other: ED is requesting approval of the "title" under the Customer Satisfaction Survey Master Plan, 1800-0011. The burden hours for this individual survey fall within the annual cap for 1800-0011.

Signature of Senior Official or designee:



Date:

For OIRA Use


_________________________________


_________________________________

**This form cannot be used to extend an expiration date

OMB 83-C




