Supporting Statement

Generic Web Site Usability Information Collections

OMB: 2700-0129

A. Justification

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

Justification for conducting usability studies goes beyond customer requests or even Agency recommendations. The NASA internal usability site (http://www.hq.nasa.gov/pao/portal/usability/guidelines/index.htm) provides usability guidelines (http://www.hq.nasa.gov/pao/portal/affinityKit/styleguide/standards_index.pdf) and a link to OMB's document regarding Web guidelines (http://www.hq.nasa.gov/pao/portal/usability/resources/OMB_Web_guidelines.pdf), which states: "The recommendations and best practices published by the Interagency Committee on Government Information (http://www.webcontent.gov) will aid your implementation of the policies outlined in the attachment."


One of the best practices listed on the Interagency Committee on Government Information site (http://www.firstgov.gov/webcontent/usability/techniques.shtml) concerns usability, which is also considered a standard component of iterative design.


Our job is to provide quality information on NASA Education Web sites in a manner that requires the least possible customer frustration, effort, and time. NASA uses a Web survey, developed in partnership with the Treasury's Federal Consulting Group and ForeSee, for the overall NASA site, and from it we garner information pertinent to the portions of the site that we build and maintain; however, not all usability data can be gathered from a survey. It is crucial to test a site's or product's usability with actual users and to garner feedback after tasks are completed. In this way, the feedback is tailored to the site or product being tested and gives the Web team much better data about which elements or functions of a site or product need to be revised or added.


There are billions of Web pages, and we’re competing for users’ time. Usability field studies and testing are necessary to ensure our Web sites and educational products are as easy to use as possible and contain the type of information our target audiences seek. Without user data—gained through observation and feedback—we’re basically guessing about site usability and product pertinence. Also, we must be able to test the system to compare what the users say versus what they actually do when completing a task; the two often differ dramatically.

"A deep understanding of needs requires field studies of customers. Observe real customers as they actually use the site. A professional user-experience team will go to the customer's home, office, or place of employment, where they will access the site. Don't trust what customers say--trust what they do." (Jakob Nielsen and Donald A. Norman, "Usability On The Web Isn't A Luxury")


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

Usability data gathered using various methods and resources, including but not limited to those described below, will be used by Web and product design teams to enhance NASA Web sites and educational products, making them easier to use and more effective for users to access Agency information with the least amount of time, frustration, and effort.


Data from all instruments are reviewed and analyzed, and results are compiled into a usability report that is sent to NASA Education management and the NASA Web Editorial Board and is used by the design team to determine which elements and functions need to be added to or revised in a site or educational product, such as an Educator Guide. Teams at various NASA centers support different sections of the NASA Web site. The NASA Educational Technology Services team at Marshall Space Flight Center will do a large portion of the testing, analyzing, and reporting on usability projects.


Depending on the type of revisions or additions required to a site or product, a project schedule is developed for making revisions/additions. For instance, a total site redesign would take much longer to design and to implement than tweaking the name on a navigational button, which might be done immediately.


Candidate Screener - This is a pre-testing instrument required to screen respondents for suitability to participate in testing the usability of NASA Web sites and resources. Our audience consists of educators and students, with a heavy emphasis on the K-12 community, since that group represents our largest sector of users for the Education section of NASA.gov. No personally identifiable information is requested, and the Marshall Space Flight Center Privacy Act Manager has reviewed the screener. The information is not stored in a database; it is merely collected and reviewed to identify suitable candidates. Respondents may complete the screener electronically and e-mail or fax a completed screener to NASA [or its contractors]. The information on the screener enables the usability testing team to determine whether an educator teaches in subject areas that NASA is tasked to support—science, mathematics, technology, and engineering. We also review an educator's Web and computer experience, striving to test users ranging from novice to expert level. This instrument will be used at the beginning of most usability projects for various sections of NASA.gov; however, we probably would not use it for K-4 students but would instead rely on K-4 teachers to nominate students for testing, or we would observe a K-4 classroom of students as they worked on a site for general feedback and user-behavior information. Completion of this instrument is strictly voluntary, but we find the K-12 audience—particularly educators—eager to provide feedback. Estimated burden for completing the screener is 5 minutes.
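
To illustrate the screening step, the minimal Python sketch below filters respondents to NASA-supported subject areas and then selects a pool spanning the novice-to-expert range. The field names, the 1-5 experience scale, and the selection logic are illustrative assumptions, not the screener's actual contents or NASA's selection criteria.

    # Illustrative sketch only: field names, the 1-5 experience scale, and the
    # selection logic are assumptions, not the actual screener's contents.
    STEM = {"science", "mathematics", "technology", "engineering"}

    def stem_educators(candidates):
        """Keep respondents who teach at least one NASA-supported subject."""
        return [c for c in candidates if STEM & set(c["subjects"])]

    def spread_by_experience(candidates, per_level=2):
        """Select up to per_level candidates at each self-rated experience
        level (1 = novice ... 5 = expert) so the pool spans the range."""
        picked = []
        for level in range(1, 6):
            at_level = [c for c in candidates if c["web_experience"] == level]
            picked.extend(at_level[:per_level])
        return picked

    respondents = [
        {"name": "Educator A", "subjects": ["science"], "web_experience": 1},
        {"name": "Educator B", "subjects": ["history"], "web_experience": 4},
        {"name": "Educator C", "subjects": ["mathematics"], "web_experience": 5},
    ]
    pool = spread_by_experience(stem_educators(respondents))
    print([c["name"] for c in pool])  # ['Educator A', 'Educator C']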


User Observation - This technique is required continually to modify and improve various sections and content on NASA.gov. Usability research tells us that what users say and what they actually do often differ. Users are given tasks related to activities that they might normally complete on NASA.gov. Observing users as they complete these tasks provides valuable feedback regarding which issues users struggle with as they use the Web site, what paths they choose, and how they interact with the site's functions and search feature, and it provides insight into the user experience as a whole. Much of the feedback gained through observation is nonverbal (body language, gestures, and so on), but it is very telling. Feedback will help us modify and adjust this resource to meet users' needs. The information is used by NASA Education and its contractors to effectively maintain and improve various sections on NASA.gov. The usability specialist and select members of the Web design team observe users, take notes about user behavior, and discuss these elements after testing. Information from all users is then reviewed, analyzed, and summarized in a usability report that is presented to NASA Education at Headquarters. If a participant requests a report, he or she is provided a copy. Otherwise, the report is disseminated internally and referenced by team members to improve the Web site. The users are given 20 minutes to complete three tasks online. See the "NASAED_OMB_sampleusability.doc" Word document for samples of user tasks for various audiences.


Focus Groups – This technique is particularly useful for gauging the educational value of online products or educational Web sites. NASA Education staff or contractors would tailor five to six questions for discussion during the focus group session. Educators or students would be invited to review a particular site or educational product—such as an Educator Guide with lesson plans or activities—and then the product's value would be discussed in a focus group session lasting 60-90 minutes. The Web design team would be able to interact directly with users during these sessions, and this interaction would provide valuable information about how NASA should adapt a site or product to improve its pertinence and usability for customers. It is typical in the usability industry for focus group sessions to be recorded using audio or video equipment, which allows the design team to review user comments afterward. Sample questions are provided in the "NASAED_OMB_sampleusability.doc" Word document.


Questionnaires – This data collection instrument helps us gather targeted input from our audience; it is not an online survey or questionnaire. Questions will be usability related and will vary according to the product, site, and audience. Data will be collected from teachers and/or students, depending on the project, and the questionnaire will take 5-15 minutes to complete, depending on the audience. The point of Web usability is to ensure that a site performs its functions with as little user frustration, effort, and time as possible [The Web Usability Handbook, Mark Pearrow, 2000]. Questionnaires are a critical tool in a usability specialist's toolkit because they can be tailored by the design team to gather specific input from our audiences. The team would review the feedback, and the data would either be compiled into a summary report for internal dissemination or included in a final usability report, depending on the project group's request. Example questions that would be used for questionnaires appear in the "NASAED_OMB_sampleusability.doc" Word document. These examples are not exhaustive and may be revised to include additional questions as new functionality or projects are developed. However, the objective of the questions will continue to relate to the usability of systems and the educational value of products.


In-person interviews – This collection instrument is used in debriefing usability participants after they have completed usability testing online; respondents provide verbal feedback rather than completing a questionnaire. Questions are related to user experience, site architecture, flow, content and navigation, and recommendations for improvement. The questions are included in the Usability Report, along with a summary of respondents’ input. The Web team uses this information to revise the site and its contents to enable users to access Agency information with the least amount of effort, frustration, and time possible. This instrument is a key component of usability studies, allowing users to provide direct feedback regarding the Agency’s site and content. Time allotted for the interview is 20 minutes.


Consent form – This form records a usability participant's consent for NASA Education to video record the usability session and to review the recording to gather data that might have been missed during testing. A sample form is provided in the attached Usability Report document. The forms are not stored or referenced online. Time to complete this form is 1 minute.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

Candidates may fax or e-mail the Candidate Screener forms, which may be filled in electronically. The tasks observed during usability testing require no written response from participants. They merely complete the tasks by finding appropriate documents or program information on the NASA.gov Web site. The Consent Form is signed by participants before testing begins. Usability sessions are video recorded so that the Web design team may analyze user behavior and actions for what might have been missed while observing the tests. Minimal time is required by respondents for completing forms. Most data is gathered by the design team.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

No duplication is involved. Usability data cannot be generalized; user feedback for one site may not be relevant to another site, even though both sites may be sub-sites or pages within one large Agency site. User feedback and usability data gathered through testing are the only valid means for ensuring a site meets users’ needs.


5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.

n/a


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

Without conducting usability testing on Agency sites and garnering feedback on Educational products/programs, the Agency is merely guessing whether the Web site(s) meet users’ needs. In order to provide quality information that is easily accessible on a system that is easy to use, usability studies must be conducted.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

Usability studies would be conducted on an as-needed basis per project.

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

Participation in NASA Education usability studies is strictly voluntary. Depending on NASA Education customer requests for updates or redesign to portions of the NASA.gov Web site, users might be asked to participate in a study less than 30 days after being asked if they would like to participate; however, when possible, NASA Education would schedule studies in advance to provide ample time for participants to plan.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB.

n/a

Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.

n/a

Specifically address comments received on cost and hour burden.

n/a

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

NASA Educational Technology Services (NETS) researched appropriate usability methods recommended by industry usability specialists and researchers. The team devised a plan based on this research, and the plan was reviewed and approved by NASA Education management.


Data collected will be analyzed and compiled into a Usability Report, which will summarize tasks, questions, and responses from users. The report will be compiled by a member of the design/usability team and disseminated internally. See the sample Usability Report included towards the end of this document.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

Usability experts recommend not using the same users in testing iterations of the same site. Therefore, participants would not be contacted after the first round of feedback was gathered, unless an educator was contacted to see if students who had never participated in our usability studies would like to participate.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

n/a


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

The Consent Form, which the participant signs to allow us to video record the session, notes that the data will be used by NASA Education to make improvements to the Web site for that specific project, and may be referenced by the team (for comparison purposes and review of techniques) for future usability projects.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

n/a


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates.

The entire usability session and completion of related documents take a half-hour to 90 minutes, depending on the audience being tested. Each of these tools is often used in usability data collections. Breakouts, tallied in the sketch following the list, are:

  • Candidate screener—5 minutes

  • Observation of task completion (notes taken by Web team members)—20 minutes

  • Usability questionnaire—5-15 minutes, depending on the audience. Younger students may take longer to complete the form than educators. This form is not always used, such as when an educator is interviewed after being observed rather than filling out a questionnaire.

  • In-person interview—20 minutes; the team uses the last 20 minutes after task completion to gain users’ perspectives on the testing process and the site or product tested.

  • Consent form—1 minute (Parents would sign this form for students.)
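
As a sanity check on the overall estimate, the component times can simply be summed over whichever instruments a session actually uses; focus group sessions (60-90 minutes, per Question 2) account for the upper end of the range. Below is a minimal Python sketch of that tally; the instrument subsets chosen are illustrative assumptions.

    # Per-instrument respondent burden in minutes, from the breakout above.
    # The (min, max) range reflects the 5-15 minute questionnaire; which
    # instruments a session uses varies by project (an assumption made
    # explicit here for illustration).
    BURDEN_MINUTES = {
        "candidate_screener": (5, 5),
        "task_observation": (20, 20),
        "questionnaire": (5, 15),  # not always used
        "in_person_interview": (20, 20),
        "consent_form": (1, 1),
    }

    def session_burden(instruments):
        """Return (min, max) total minutes for the instruments administered."""
        lo = sum(BURDEN_MINUTES[i][0] for i in instruments)
        hi = sum(BURDEN_MINUTES[i][1] for i in instruments)
        return lo, hi

    print(session_burden(BURDEN_MINUTES))  # full session: (51, 61)
    print(session_burden(["task_observation", "in_person_interview",
                          "consent_form"]))  # no screener/questionnaire: (41, 41)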


Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.


* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.

0


13. Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

0


14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.

Estimated usability cost per year, based on trips to four regions in the U.S. with two staff members attending:

Labor: 192 hours @ $70/hour = $13,440

Travel: 4 trips x 2 staff members @ $1,500/trip = $12,000

Data Analysis/Reports: 120 hours @ $70/hour = $8,400

Estimated total cost per year: $33,840
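
These figures follow from straightforward multiplication; the short Python sketch below reproduces them from the document's rates and hours (only the variable names are ours).

    # Reproduces the annual cost estimate above from the document's figures.
    labor = 192 * 70       # testing labor: 192 hours at $70/hour -> $13,440
    travel = 4 * 2 * 1500  # 4 trips x 2 staff at $1,500 per trip -> $12,000
    analysis = 120 * 70    # analysis/reports: 120 hours at $70/hour -> $8,400

    total = labor + travel + analysis
    print(f"Estimated total cost per year: ${total:,}")  # $33,840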



15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.

New collection


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Data collected will be analyzed and compiled into a Usability Report, which will summarize tasks, questions, and responses from users. No complex statistical calculations are required, merely figuring percentages of users who complete or do not complete tasks and who respond to questions favorably or unfavorably (e.g., 47% of users found search difficult to use). The report will be compiled by a member of the design/usability team and disseminated internally. Schedules will vary, depending on customer requirements; however, a tentative schedule follows the tabulation sketch below.
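
Because the analysis is limited to simple percentages, tabulation amounts to counting outcomes per task or question. A minimal Python sketch follows; the task names and result records are hypothetical.

    # Hypothetical task outcomes: True means the user completed the task.
    results = {
        "find_educator_guide": [True, True, False, True, True, True, True],
        "use_search": [False, True, False, False, True, False, False],
    }

    for task, outcomes in results.items():
        completed = sum(outcomes)  # True counts as 1
        pct = 100 * completed / len(outcomes)
        print(f"{task}: {completed}/{len(outcomes)} completed ({pct:.0f}%)")
    # For use_search, 2 of 7 users (29%) completed the task; the report would
    # state the complement as "71% of users found search difficult to use."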


Usability Data Collections              FY07      FY08      FY09

ESMD Focus Groups                       Q3, Q4    --        --
NASA.gov/For Students (current site)    Q3, Q4    --        --
NASA.gov/For Educators (new site)       Q4        --        --
ESMD Product Usability Testing          --        Q2        --
NASA.gov/For Students (new site)        --        --        Q2
NASA Kids' Club (Phase 2)               --        --        Q3
Podcasting Focus Groups                 Q3, Q4    --        --
Podcasting Usability Testing            Q4        Q2        Q2
NASA DLN Focus Groups                   Q2, Q4    Q2, Q4    Q2, Q4


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

n/a


18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.

n/a



B. Collections of Information Employing Statistical Methods

The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When Item 17 on the Form OMB 83-I is checked, "Yes," the following documentation should be included in the Supporting Statement to the extent that it applies to the methods proposed:


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

50-100 respondents per usability project will be sought. We are not conducting a statistical survey, online or offline. Questionnaires may be used after usability task testing to gather input, but the questionnaires are not formal surveys; please see the Questionnaires description under Question 2 of this document for details. No complex statistical calculations are required, merely figuring percentages of users who complete or do not complete tasks and who respond to questions favorably or unfavorably (e.g., 47% of users found the search function difficult to use). We will not use weighted responses; each user's response will be given equal weight.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample

selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in

the justification,

* Unusual problems requiring specialized sampling

procedures, and

* Any use of periodic (less frequent than annual) data

collection cycles to reduce burden.

No complex statistical calculations are required, merely figuring percentages of users who complete or do not complete tasks and who respond to questions favorably or unfavorably (e.g., 47% of users found search difficult to use).


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Participation in NASA Education usability studies is strictly voluntary. Educators who have participated in NASA Education workshops or programs may be contacted to gauge their interest in participating in usability studies (e.g., NASA Educator Astronaut Teachers, who are located nationwide). Reliability of the data is not an issue, as much of the usability data gathered is based on users' ability to complete tasks on the NASA.gov system, as well as their feedback on the site and its information.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

Samples of the usability tasks and usability-related questions are provided for various target user groups. Additionally, a copy of a Usability Report compiled after seven K-12 educators conducted usability testing on the Educator section of NASA.gov is provided as a sample towards the end of this document.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Due to the simplicity of the calculations used (calculating percentages), no external consultation is needed. Contractors supporting usability testing efforts for NASA Education Agency-wide may participate, including but not limited to:

Dawn Gaddis (UNITeS/SAIC), Marshall Space Flight Center, Huntsville, AL

Deana Nunley (UNITeS/SAIC), Marshall Space Flight Center, Huntsville, AL

Jennifer Wall (UNITeS/SAIC), Marshall Space Flight Center, Huntsville, AL



Attachments:

Collection Schedule

Collection Abstracts

Burden Table

Usability Justification (additional information provided to NASA Headquarters)

User-Centered Design

Sample Usability Instruments

Sample Usability Report
