National Science Foundation Surveys to Measure Customer Satisfaction

Supporting Statement for NSDL Customer Satisfaction Survey

OMB: 3145-0157

Supporting Statement for Paperwork Reduction Act Submission for 3145-0157


National Science Digital Library/Distributed Learning (NSDL) Customer Satisfaction Survey



A. JUSTIFICATION


  1. CIRCUMSTANCES MAKING COLLECTION OF INFORMATION NECESSARY



On September 11, 1993, President Clinton issued Executive Order 12862, “Setting Customer Service Standards,” which set out his vision that Federal agencies put the public first. To accomplish this, President Clinton called for a “revolution within the Federal government to change the way it does business.” He expected this process to require continual reform of government practices and operations so that, “when dealing with the Federal agencies, all people receive service that matches or exceeds the best service available in the private sector.”


Section 1(b) of this E.O. requires agencies to “survey customers to determine the kind and quality of services they want and their level of satisfaction with existing services,” and Section 1(a) requires agencies to “survey front-line employees on barriers to, and ideas for, matching the best in business.” These Presidential requirements established an ongoing need for the National Science Foundation (NSF) to engage in an interactive process of collecting information and using it to improve program services and processes.


As efforts and investment in new forms of digital learning increase, findings from this information collection will provide a source of information to inform improvement of the resources, services, processes, and sustainability of ongoing and new cyberlearning initiatives. For this purpose, the International Organization for Standardization’s definition of usability is used: usability is “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” (ISO 9241-11, 1998).


The National Science Digital Library/Distributed Learning (NSDL) program began in 2000 in the Directorate for Education and Human Resources (EHR). Between 2000 and 2011, more than 250 grants were awarded to support the development of resources, collections, and technical tools and services for teachers and learners in Science, Technology, Engineering, and Mathematics (STEM). The program’s mission was to lay the foundation for a distributed digital library that would be widely available to the public through websites and would facilitate improvements in STEM teaching and learning for all. The goals of the program were (1) to create collections and develop services to promote the use of STEM education resources and materials; (2) to conduct research on innovative uses of digital content and services; and (3) to promote community-based processes for digital library development and classroom implementation.


New Data Collection. As a component of efforts to review and improve program services, resources, and tools, a usability experimental simulation was employed to assess satisfaction associated with the use of NSDL. The experiment compared information retrieval outcomes associated with NSDL to those associated with Google (a critical competitor), tracking the information retrieval strategies of participants (undergraduate and master’s students in education and library science) as they located science education resources and materials for middle school earth and physical science lesson planning. This data collection assessed user satisfaction and documented dissatisfaction in terms of authoritativeness, relevant resources found, presentation of search results, and use of metadata.



  2. HOW, BY WHOM, AND PURPOSE FOR WHICH INFORMATION IS TO BE USED


This effort was based on work by a team of researchers from Guardians of Honor, LLC (GOH) and the Justice, Infrastructure, and Environment division of the RAND Corporation, who designed the experimental simulation for undergraduate and master’s students in education and library science. Participants were asked to conduct a set of search tasks to find materials for planning a lesson for eighth grade science students. The usability experimental simulation is used to document user satisfaction with the ways NSDL supports teachers’ planning efforts, and it seeks to answer three overarching questions that focus on user satisfaction with the system.


  • How and how well do the NSDL portal and the pathways support teachers’ efforts to find appropriate STEM resources for use in the classroom?


  • Is the success in finding STEM resources affected by: (a) the user’s starting point (in NSDL.org versus a grade-level-specific or content-domain pathway), (b) the type of resource sought, or (c) content domain?


  • How does success in finding resources using NSDL sites compare with use of a general search engine (Google)?


In FY 2012, NSF announced its decision to cease funding the NSDL program beyond the 2012 fiscal year. This effort therefore focuses on lessons learned that could be useful to other programs: specifically, it identifies attributes of sustainable digital initiatives associated with user satisfaction and assesses the extent to which the NSDL program demonstrated these attributes. Findings will be of interest to other programs at NSF and to other organizations that fund related research and implementation programs in cyberlearning (e.g., the Department of Education, private non-profit institutions), as well as to the digital library community.


  3. USE OF AUTOMATION


The usability experimental simulation protocol is computer-based, with each participant using a computer and completing the tasks independently. The instructions, tasks, and responses are presented and collected online through a single SurveyMonkey interface. Participants are asked to imagine that they are eighth grade science teachers seeking good online resources with which to build a lesson plan that introduces students to plate tectonics (earth sciences) or Newton’s laws of motion (physical sciences). They are then led through the search tasks, starting with a practice task to gain familiarity with the website, followed by four experimental tasks. Participants are told that while they should start at their assigned website, the resources they find may come from linked websites. After finding resources they deem appropriate, participants copy the URLs for each web page into the online form and provide a label for each page. The tasks are the same in both domains and are assigned in the same order.


Participants are given a fixed amount of time, ranging from 5 to 8 minutes, for each task. To more closely simulate the goals and constraints of real teachers – the need for high-quality resources and limited time to find them – participants are encouraged to find good resources and provide best examples, and are given the option of indicating that they found no suitable resources. At the end of each experimental task, participants are asked to complete a set of questions about that specific type of search task. Participants are required to remain on task until the full time expires. The procedure requires approximately one hour.
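For illustration only, the task flow just described can be represented as a simple data structure; this is a sketch, not the study’s instrument. The per-task time limits below are placeholders within the stated 5-to-8-minute range, and the four task types anticipate those defined in Section B.2.

```python
# Illustrative sketch of the task sequence administered online: a practice
# task followed by four timed experimental tasks, in the same fixed order
# in both domains. Time limits are placeholders in the stated 5-8 min range.
from dataclasses import dataclass

@dataclass
class SearchTask:
    name: str                # short identifier (illustrative)
    description: str         # task type, per the protocol
    time_limit_minutes: int  # fixed limit; actual values fall between 5 and 8

TASK_SEQUENCE = [
    SearchTask("practice", "Practice task to gain familiarity with the assigned site", 5),
    SearchTask("task1", "Static text and images", 5),
    SearchTask("task2", "Animations or videos", 6),
    SearchTask("task3", "Interactive classroom activities", 7),
    SearchTask("task4", "Inquiry-based learning activities", 8),
]

for task in TASK_SEQUENCE:
    print(f"{task.name}: up to {task.time_limit_minutes} minutes")
```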



  4. EFFORTS TO IDENTIFY DUPLICATION


This data collection is based on a pilot trial and does not duplicate other NSF efforts. No other agency has an R&D program to establish a science digital library/distributed learning system.


  5. SMALL BUSINESS CONSIDERATIONS


Not applicable.


  6. CONSEQUENCES OF LESS FREQUENT COLLECTION


Not applicable.


  7. SPECIAL CIRCUMSTANCES FOR COLLECTION


Not applicable.

  8. FEDERAL REGISTER NOTICE


The agency’s notices, as required by 5 CFR 1320.8(d), were published in the Federal Register on February 25, 2014 (79 FR 10574), and on May 30, 2014 (79 FR 31145); no comments were received.


OUTSIDE CONSULTATION


The estimation of burden is based on the pilot experimental simulation conducted by Dr. Grace Agnew, Library Science, Rutgers University.


  9. GIFTS OR REMUNERATION


Not applicable.


  10. CONFIDENTIALITY PROVIDED TO RESPONDENTS


To provide participant confidentiality, the following questions appear at the beginning of the web-based survey instrument, which was revised after completion of the pilot study.


  • Please indicate that you understand that you cannot be identified as a participant in this survey. If you select “no,” you cannot participate further in the survey.

    • Yes, I understand that my participation is anonymous.

    • No, I don’t understand (this response concludes your participation).


  • Some questions provide opportunities for open-ended comments that may be used in the report to the National Science Foundation and its consultants, Guardians of Honor and the RAND Corporation, or in scholarly papers or presentations. Please indicate how quotes from your individual survey may be used. These quotes will be anonymous since participants cannot be identified.

    • Please quote verbatim, using quotation marks at the beginning and ending of the quotation.

    • Please paraphrase my response(s) rather than quoting my response(s) directly.


  • You will need to be eighteen years of age or older to participate in the survey. Please indicate your broad age range. This information is used only to determine eligibility to participate in the survey and is not retained or analyzed.

    • 18 years of age or older.

    • 17 years of age or younger (this concludes your participation).


  • Participants need to be engaged in a degree program that is relevant to the National Science Digital Library. Please select your degree program below. If more than one applies, please select your primary or most current degree program.

    • Master of Library and Information Science

    • Ph.D. Library and Information Science

    • Ed.M. Program in Biological Science Education

    • Ed.M. Program in Mathematics Education

    • 5-year Teacher Education Program: Mathematics (K-12)

    • Ed.M. Program in Physical Science or Physics Education

    • Ed.D. Program, Mathematics Education

    • Ph.D. Program, Mathematics Education

    • Ph.D. Program, Science Education

    • I am not engaged in a current degree program (this response concludes your participation).


Participation in this study is voluntary. You may choose not to participate by exiting the survey at any time. Incomplete surveys will not be retained or analyzed.


  11. QUESTIONS OF A SENSITIVE NATURE


No questions of a sensitive nature are asked.


  12. ESTIMATE OF BURDEN


Respondent Type                     | # of Respondents | Time per Response (Hours) | Total Time Burden (Hours)
Students (graduate & undergraduate) | 170              | 1.0                       | 170
Total                               | 170              | 1.0                       | 170


ANNUALIZED COST TO RESPONDENTS

Respondent Type | Hourly Salary¹ | Burden Time per Respondent (Hours) | Estimated Cost per Respondent | Number of Respondents | Estimated Annual Cost
Students        | $16.35         | 1.0                                | $16.35                        | 170                   | $2,779.50
Total           |                |                                    |                               | 170                   | $2,779.50
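As an arithmetic check, the estimated annual cost to respondents is the product of the number of respondents, the burden hours per respondent, and the hourly salary:

\[ 170 \ \text{respondents} \times 1.0 \ \text{hour} \times \$16.35/\text{hour} = \$2{,}779.50 \]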



  13. CAPITAL/STARTUP COSTS


Not applicable.


  14. ANNUALIZED COST TO THE FEDERAL GOVERNMENT


Federal Employee | Hourly Salary² | Burden Time per Employee (Hours) | Estimated Cost per Employee | Number of Employees | Estimated Annual Cost
Program Officers | $80.00         | 15.0                             | $1,200.00                   | 3                   | $3,600.00
Total            |                |                                  |                             | 3                   | $3,600.00
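The federal cost estimate follows the same arithmetic, applied per program officer and summed over the three officers involved:

\[ 3 \ \text{officers} \times 15.0 \ \text{hours} \times \$80.00/\text{hour} = \$3{,}600.00 \]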



  15. CHANGES IN BURDEN


Not applicable. The survey burden time fits within the allowance of the generic clearance.


  16. PUBLICATION OF COLLECTION


Not applicable.


  17. SEEKING APPROVAL TO NOT DISPLAY OMB EXPIRATION DATE


Not applicable.


  18. EXCEPTION(S) TO THE CERTIFICATION STATEMENT (ITEM 19) ON OMB FORM 83-I


Not applicable.



B. STATISTICAL METHODS


B.1. Universe and Sampling Procedures

Participants are students (primarily graduate and advanced undergraduate students in education or library science) recruited from universities. For this convenience sample, participants are recruited through email messages posted to lists of target student populations and through contact with relevant faculty, who are asked to publicize the study and encourage participation. Because there is no count of the total number of students contacted, an accurate participation rate cannot be estimated.


B.2. Survey Methodology


The experimental simulation is designed around a hypothetical lesson plan development effort for eighth grade science. The study uses a 2 (domain) X 4 (starting point) X 4 (task) design. Domain and starting point are between-subjects factors (i.e., each participant is assigned to only one domain and one starting point), and task is a within-subjects factor (i.e., all participants complete all four tasks). There are 20-23 participants in each domain X starting point condition.
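As a minimal illustration of this design, not drawn from the study’s actual materials, the sketch below shows one way to randomly assign participants to the eight between-subjects cells while keeping cell sizes balanced; all labels and the function name are illustrative.

```python
# Illustrative sketch of the 2 (domain) x 4 (starting point) x 4 (task)
# mixed design: domain and starting point are between-subjects, and every
# participant completes all four tasks in the same fixed order.
import itertools
import random

DOMAINS = ["earth_science", "physical_science"]                  # between-subjects
STARTING_POINTS = ["nsdl_portal", "domain_specific_site",
                   "grade_level_site", "general_search_engine"]  # between-subjects
TASKS = ["static_text_images", "animations_videos",
         "interactive_activities", "inquiry_based_activities"]   # within-subjects

def assign_conditions(n_participants, seed=0):
    """Assign each participant to one of the 8 between-subjects cells,
    cycling through the cells so group sizes stay balanced (about 20-23
    per cell for 170 participants), then shuffling the assignment order."""
    rng = random.Random(seed)
    cells = list(itertools.product(DOMAINS, STARTING_POINTS))  # 8 cells
    pool = (cells * (n_participants // len(cells) + 1))[:n_participants]
    rng.shuffle(pool)
    return [{"participant": i + 1, "domain": d, "starting_point": s,
             "tasks": TASKS}  # same fixed task order for everyone
            for i, (d, s) in enumerate(pool)]

if __name__ == "__main__":
    for row in assign_conditions(170)[:3]:  # preview the first few assignments
        print(row)
```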


Domain. Participants are randomly assigned to one of two domains: plate tectonics (earth sciences) or Newton’s laws of motion (physical sciences). These units were selected based on several criteria: the unit is part of a course typically required for middle school students; the unit is associated with learning objectives that typically appear in states’ science standards for that course; there is an NSDL collection pathway dedicated to that course unit; and there are sufficient and diverse resources addressing that unit in the chosen pathway and in the main portal. The first two criteria help ensure that the experimental simulation broadly represents the needs and practices of middle school teachers; the last two ensure that the topics are well represented in the sites searched, making the tasks feasible and appropriate.


Starting point. Participants are randomly assigned to one of four starting points for their searches: the NSDL portal, a domain-specific site, a grade-level-specific site, or a general search engine. Each participant is asked to complete all search tasks within the assigned domain.


Tasks. Each participant completes four search tasks: (1) static text and images; (2) animations or videos; (3) interactive classroom activities; and (4) inquiry-based learning activities, ideally involving hypothesis testing or active or experiential learning.


Procedure. The study follows the pilot trial, which was based on usability frameworks developed in the literature and on the study’s task framework and design decisions. The protocol is computer-based, with each participant completing the tasks independently. Participants are asked to imagine that they are eighth grade science teachers seeking good online resources. They are then led through the search tasks, starting with a practice task to gain familiarity with the website, followed by the four experimental tasks. After finding online resources they deem appropriate, participants copy the URLs for each web page into the online form and provide a label for each page. They are then asked a set of questions about the search task. The tasks are the same in both domains and are assigned in the same order. Participants are given a fixed amount of time, ranging from 5 to 8 minutes per task, to more closely simulate the goals and constraints of real teachers. Overall, the procedure requires approximately one hour.


The number of websites found is analyzed using a repeated measures analysis of variance (ANOVA) across the four tasks.³
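A minimal analysis sketch under stated assumptions is shown below, using the pingouin Python library. Because pingouin’s mixed_anova() accepts only a single between-subjects factor, domain and starting point are collapsed here into one eight-level condition factor, which departs from the full 2 X 4 model described in footnote 3; the file and column names are hypothetical, and a Holm adjustment stands in for the Tukey correction the study used.

```python
# Sketch only: repeated measures analysis of the number of websites (URLs)
# found per task. Assumes long-format data with one row per participant x task.
import pandas as pd
import pingouin as pg

df = pd.read_csv("nsdl_results.csv")  # hypothetical data file
# Collapse the two between-subjects factors into one 8-level factor,
# since pingouin's mixed_anova() supports only a single between factor.
df["condition"] = df["domain"] + "_" + df["starting_point"]

aov = pg.mixed_anova(
    data=df,
    dv="n_urls",           # dependent variable: websites found per task
    within="task",         # repeated-measures factor (the four tasks)
    subject="participant",
    between="condition",
    correction=True,       # sphericity correction for within-subjects effects
)
print(aov.round(3))

# Post hoc paired comparisons across tasks; the study used a Tukey
# correction, approximated here with a Holm adjustment.
posthoc = pg.pairwise_tests(data=df, dv="n_urls", within="task",
                            subject="participant", padjust="holm")
print(posthoc.round(3))
```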



B.3. Methods to Maximize Response


Achieving strong response rates begins with a well-designed, user-friendly instrument. For this web-based survey, the instrument was revised based on comments from students who participated in the pilot study and who provided suggestions for improving it during focus groups.


In addition, the web-based approach allows us to screen participants during the initial stage of the experimental simulation, using the following screening questions.

  • You will need to be eighteen years of age or older to participate in the survey. Please indicate your broad age range. This information is used only to determine eligibility to participate in the survey and is not retained or analyzed.

    • 18 years of age or older.

    • 17 years of age or younger (this concludes your participation).


  • Participants need to be engaged in a degree program that is relevant to the National Science Digital Library. Please select your degree program below. If more than one applies, please select your primary or most current degree program.

    • Master of Library and Information Science

    • Ph.D. Library and Information Science

    • Ed.M. Program in Biological Science Education

    • Ed.M. Program in Mathematics Education

    • 5-year Teacher Education Program: Mathematics (K-12)

    • Ed.M. Program in Physical Science or Physics Education

    • Ed.D. Program, Mathematics Education

    • Ph.D. Program, Mathematics Education

    • Ph.D. Program, Science Education

    • I am not engaged in a current degree program (this response concludes your participation).


B.4. Testing Procedures


The study team worked from a vetted logic model, which was informed by guidance from several NSF program officers and experts in digital libraries/distributed learning. The study team tested the experimental simulation during a pilot trial in which 30 pre-service teachers conducted search tasks for materials to plan a lesson on plate tectonics for eighth graders. Discussion with study participants also pointed to recommendations for a larger study, such as providing participants with a practice session to get acquainted with their starting site.


B.5. Contacts for Statistical Aspects of Data Collection

  • Dr. Grace Agnew, Associate University Librarian, Rutgers University Libraries

  • Dr. Susan G. Straus, RAND Corporation, Justice, Infrastructure, and Environment




¹ Salary estimates are based on education and profession figures provided by the U.S. Department of Education, National Center for Education Statistics (2010), The Condition of Education 2010 (NCES 2010-028).

² Salary estimates are based on education and profession figures provided by the U.S. Department of Education, National Center for Education Statistics (2010), The Condition of Education 2010 (NCES 2010-028).

³ We analyzed the number of URLs using a 2 (domain) X 4 (starting point) repeated measures ANOVA, repeated across tasks. Degrees of freedom for within-subjects effects were adjusted for non-sphericity using a Huynh-Feldt-Lecoutre correction. Post hoc paired comparisons were conducted using a Tukey correction.
