Documentation for the Generic Clearance of Customer Service Satisfaction Collections


Master Generic Plan for Customer Surveys and Focus Groups


OMB: 1800-0011


DOCUMENTATION FOR THE GENERIC CLEARANCE
OF CUSTOMER SERVICE SATISFACTION COLLECTIONS

TITLE OF INFORMATION COLLECTION:

Doing What Works Initiative: User Feedback Survey

[ x ] SURVEY [ ] FOCUS GROUP [ ] SOFTWARE USABILITY TESTING

DESCRIPTION OF THIS SPECIFIC COLLECTION

  1. Intended purpose

Doing What Works (DWW) is a major initiative of the U.S. Department of Education that translates research-based practices into examples and tools to support improvements in practice. DWW represents collaboration across the Department in which research reviews are conducted under the Institute of Education Sciences (IES) to identify research-based recommendations. The translation of these practices into materials to support application is then carried out under the Office of Planning, Evaluation, and Policy Development (OPEPD).

The information on the DWW website is organized around a Learn-See-Do structure: Learn what the research says through media and interviews with experts; See how successful sites have used these practices through videos, interviews, and sample materials; Do what works by taking action with help from DWW downloadable tools and resources.

The website can be accessed directly by districts and schools. District administrators, school principals, coaches, teachers, and specialists can access and download free resources from the Learn What, See How, or Do What Works pages. In addition, they can read Ideas for Action tailored to users’ roles and use hyperlinks to access related resources.

The purpose of the survey is to measure users’ satisfaction with the DWW website. Using this survey, we will gather feedback about the quality, relevance, utility, and application of the DWW website. Specifically, website quality information will be collected on the extent to which educators find the site and its materials easy to use, of high quality, and responsive to current needs for educational resources. Relevance, utility, and application information will be collected on the extent to which educators find the materials useful to their everyday practice and have used the materials to help implement the evidence-based practices. The survey will also assess public awareness of the DWW website, such as how educators find and explore the website.



  2. Need for the collection

Each year, the U.S. Department of Education releases a number of Practice Guides that compile research evidence and expert knowledge to provide educators with the most up-to-date knowledge about potentially promising educational practices. To help educators and administrators make best use of those practice guides, the Doing What Works website provides resources that educators can use, especially through professional development, to:

  • Understand the research base in a given topic area;

  • Deepen their understanding of practices that the research finds to be effective at improving student achievement;

  • See examples of those practices in action in schools; and

  • Examine and modify their own practices through the use of tools and self-assessments.



However, in the absence of systematic data collection from website users, we cannot tell whether the website is meeting the needs of its audience. Specifically, this evaluation aims to address the following questions:

  1. How do DWW users access and use DWW resources?

  2. Do users find the website useful for their work? What characterizes those users in terms of role, topics of interest, products used, and purposes of use?

  3. What can be done to make the website more relevant to users' work?



  3. Planned use of the data

Results from the user feedback survey will help the Department better understand DWW website users’ needs for online educational resources. This feedback will be used to improve the quality of the website by designing and providing the types of materials and tools that educators need to support their implementation of evidence-based practices.

  4. Date(s) and location(s)

We plan to start the survey on April 15, 2010, and end it on April 15, 2011, or after 500 users have responded to the survey. The survey will be web-based and managed by the American Institutes for Research.

  5. Collection procedures

We will use the least intrusive way to recruit survey respondents. First, to inform the hundreds of daily visitors to the DWW website about the survey opportunity, a pop-up window with an invitation to take the survey will appear, along with the following three options, when users are browsing the website during the survey period: (a) Take the survey now, (b) Ask me again later, and (c) Do not ask me again. Based on users' responses, the window will appear again only for those selecting option (b). If users close the pop-up window without selecting an answer, the pop-up window will not appear again.

Second, a link to the web-based survey will appear in the "DWW News" section on the homepage of the website. Third, we will announce the launch of the survey to DWW updates subscribers.

In addition, we will also inform DWW users of the survey opportunity through emails to individuals who have participated in DWW presentations in conferences and Regional Education Laboratory trainings. A final communication method for reaching out to website users is through networking websites. We will post brief messages regarding survey opportunities on education blogs inviting those who have used the website and its resources in the past to take the survey and provide feedback.

  6. Number of focus groups, surveys, usability testing sessions

One user feedback survey will be administered.

  7. Description of respondents/participants

The proposed user feedback survey is a one-time data collection activity. Potential survey respondents are teachers, coaches, administrators, professional development providers, technical assistance providers, and individuals in related professions who visit the website. The extent to which the users are familiar with the website may vary on a continuum ranging from first-time users to frequent users. The survey includes 10 questions. However, those who indicate that this is their first visit to the website will answer only 4 of those questions, as they may not yet have explored the website content or considered how it could be useful for their work.

The data will be collected during a period of 12 months or until 500 users have responded to the survey. The first summary of the survey results will be completed by July 31, 2010. A copy of the survey is attached.

AMOUNT OF ANY PROPOSED STIPEND OR INCENTIVE

No stipend or incentive will be provided for completing the survey.

BURDEN HOUR COMPUTATION (number of responses × estimated response or participation time in minutes ÷ 60 = annual burden hours):

The survey takes 6 minutes to complete. The estimated annual burden is 50 hours across 500 respondents (500 responses × 6 minutes ÷ 60 = 50 hours).

Category of Respondent | No. of Respondents | Participation Time | Burden
User Feedback Survey   | 500                | 6 minutes          | 50 hours
Total                  | 500                | 6 minutes          | 50 hours


BURDEN COST COMPUTATION

The average hourly cost per respondent is estimated to be $30.00.

Category of Respondent | No. of Respondents | Hourly Rate | Total Burden Hours | Total Cost
User Feedback Survey   | 500                | $30.00      | 50 hours           | $1,500
Total                  | 500                | $30.00      | 50 hours           | $1,500
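The burden and cost figures above follow directly from the stated assumptions (500 respondents, 6 minutes per response, an estimated $30.00 average hourly cost). As a quick check of the arithmetic, a minimal sketch:

```python
# Burden computation for the DWW user feedback survey.
# All figures are taken from the tables above; nothing else is assumed.
respondents = 500
minutes_per_response = 6
hourly_rate = 30.00

# Annual burden hours: number of responses x response time in minutes / 60.
burden_hours = respondents * minutes_per_response / 60

# Burden cost: total burden hours x average hourly cost per respondent.
burden_cost = burden_hours * hourly_rate

print(burden_hours)  # 50.0 (hours)
print(burden_cost)   # 1500.0 (dollars)
```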


STATISTICAL INFORMATION

If statistical methods are to be used, fully describe the methodology, sample selection, expected response rates, and any other concepts needed to provide a full understanding of those methods.

The respondents of the DWW user feedback survey will be DWW users who visit the DWW website during the survey window and who are willing to take the survey, as well as those respondents who are recruited through any of the methods described in Section 5, Collection Procedures. We will analyze the survey data using descriptive statistics (e.g., percentages, means, standard deviations) to provide an overall picture of user satisfaction and to identify differences in responses by user groups. For example, survey data can be analyzed to determine the percentage of participants who find the website materials useful or very useful, or the average level of usefulness of the website across different user groups (e.g., teachers, administrators, professional development providers). Survey data can also be analyzed to examine how user needs and perceptions of the website differ on the basis of the frequency of user visits to the website.

REQUESTED APPROVAL DATE: April 20, 2010

NAME OF CONTACT PERSON: Jennifer Ballen Riccards

TELEPHONE NUMBER: 202-205-4274

MAILING LOCATION: 400 Maryland Ave. SW, Room 6W256, Washington, DC, 20202

ED DEPARTMENT, OFFICE: U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service


Author: Cindy Cai
