Information Program on Clinical Trials: Maintaining a Registry and Results Databank (NLM)

Agency response to public comment

OMB Control No. 0925-0586


DEPARTMENT OF HEALTH & HUMAN SERVICES

National Institutes of Health

Bethesda, Maryland 20892


March 24, 2009


TO: Office of Management and Budget (OMB)


Through: Project Clearance Chief, NIH

Project Clearance Liaison, NLM


FROM: Jerry Sheehan, Assistant Director for Policy Development, NLM

SUBJECT: Agency Response to Public Comments


Provided below are the agency’s responses to the issues raised in the single set of comments received by OMB in response to the 30-day Federal Register Notice (74 FR 448, January 6, 2009) announcing the National Institutes of Health’s request for clearance of the Information Program on Clinical Trials: Maintaining a Registry and Results Databank (OMB Control No. 0925-0586). The responses are organized under the three topics identified in the Federal Register Notice, which are the same topics addressed by the commenter.



1. Comments on burden estimate

The commenter suggests that the burden associated with updates to the recruitment status of individual sites is not included in the burden estimate for registration. In fact, the registration burden estimate does include the time required to update changes in recruitment status; that time is counted among the 8 updates accounted for in the burden statement. While some large clinical trials with many sites may require more than 8 recruitment-status updates over the lifetime of the trial, many more studies will require fewer updates because they involve fewer sites or are completed on a shorter timeframe (many trials run for only a few weeks or months). The estimate of 8 updates per trial reflects experience to date with the registry. Even among trials registered since FDAAA was enacted in September 2007 (and therefore subject to the requirement to update any changes in recruitment status monthly), an examination of a small sample of trials showed that most had a total of only 4 to 5 updates (including site status updates), even when they involved a large number of sites.

The commenter also states that the time for initial submission of results is substantially underestimated. It must be recognized that clinical trials vary considerably in their size, complexity, and number of outcome measures; the time to report results will therefore vary considerably from one trial to another, more so than the time for registration. The burden statement included an estimated 10 hours for an initial submission, plus 5 hours for each of two updates. These figures were based on pilot studies conducted by the agency and on feedback from a large number of organizations that tested preliminary versions of the results database during the summer of 2008. That experience suggested that initial submission of results information required as much as 6 to 8 hours when using the interactive process to enter data directly into the system (as opposed to XML upload), and that the time to report results decreases significantly as users become more familiar with the data entry system. The estimate explicitly includes time for verification and quality control by data submitters, though considerable data analysis and verification would be expected to occur as part of the trial itself, in keeping with the study protocol, and not exclusively for purposes of reporting under FDAAA.
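Taken together, these figures imply the following total per-trial estimate for results reporting; this is simply a restatement of the numbers above, not an additional analysis:

    Initial submission:            10 hours
    Two updates at 5 hours each:   2 x 5 = 10 hours
    Estimated total per trial:     10 + 10 = 20 hours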

We do not believe this comment justifies a modification to the burden estimate at this time. A more accurate assessment could be made at a later date, when there is more experience upon which to draw.

2. Comments on enhancing the quality, utility, and clarity of the information to be collected

The commenter makes a number of suggestions for further improving the quality, utility, and clarity of the information collection through changes to pick-lists, character limits, and similar features. The agency received many such comments during development of the system (in particular when preliminary versions were made available for public comment during the summer of 2008) and considered them in the current system design. We anticipate that periodic improvements will continue to be made to the data collection instrument to streamline the data submission process, based on user feedback such as this, and we thank the commenter for these suggestions.

The commenter makes a few suggestions that deserve a specific response. For example, the commenter suggests eliminating the requirement that data submitters describe the scale used for measuring outcomes, or, alternatively, developing a glossary of common scales. The agency believes that including descriptive information about measurement scales is essential to making results information understandable to patients, clinicians, and other researchers who are not experts in the field in which a particular scale is used. Such scales are highly domain specific, and without explanatory information many users will not understand what outcome is being measured in a particular study (e.g., that GOG Performance Status refers to the Gynecologic Oncology Group performance measure) or which group of subjects fared better (e.g., is a score of 5 on the GOG index better or worse than a score of 1?). The agency is considering developing a glossary of commonly used outcome measurement scales (in consultation with the clinical trial community), but prefers to wait until more scale information has been submitted to the databank and it becomes more apparent which scales are commonly used and how often they are modified to suit the needs of particular studies. Similarly, with regard to adverse events, the agency is considering additional information it could include in the databank to help system users and patients understand and interpret adverse event information; it is anticipated that such information can be added before adverse event reporting becomes mandatory. As for adaptive trial designs, the current system can accommodate a wide variety of trial designs, including adaptive trials, and assistance is available to help data submitters find appropriate ways to submit their information.



3. Comments on how to minimize the burden of the collection of information

The commenter makes several suggestions for minimizing the burden of the data collection, including the use of XML uploads, pulling data from FDA data sets, and the use of global data exchange standards. As noted by the commenter, ClinicalTrials.gov has developed an XML schema to facilitate data uploads. We expect that more data submitters will transition to XML uploads for submitting information once the information collection is approved and the associated regulations are promulgated. Unfortunately, the ability to pull in data from FDA is extremely limited: data submitted to FDA do not correspond to the statutory requirements established for this data collection (FDA does not generally receive summary information), nor can FDA data be made available to the public, as is necessary for the databank. Regarding data standards, the agency consulted broadly with the health data standards community (in which it plays a significant role) and found that there are no existing standards for summary clinical trial information. Indeed, the development of the ClinicalTrials.gov results reporting system has stimulated efforts to develop standardized approaches for aggregating more detailed clinical trial information into summary reports.
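As an illustration of the kind of structured record an XML upload conveys, the sketch below assembles a minimal study record in Python. The element names used here (study_record, brief_title, overall_status) are hypothetical placeholders chosen for illustration only and are not drawn from the actual ClinicalTrials.gov XML schema.

    # Illustrative sketch only: element names are hypothetical placeholders,
    # not the actual ClinicalTrials.gov XML upload schema.
    import xml.etree.ElementTree as ET

    def build_study_record(brief_title, overall_status):
        """Assemble a minimal, hypothetical study record and return it as XML bytes."""
        root = ET.Element("study_record")                      # hypothetical root element
        ET.SubElement(root, "brief_title").text = brief_title
        ET.SubElement(root, "overall_status").text = overall_status
        return ET.tostring(root, encoding="utf-8", xml_declaration=True)

    if __name__ == "__main__":
        record = build_study_record("Example Trial of Drug X", "Recruiting")
        with open("study_record.xml", "wb") as f:              # file would then be submitted through the XML upload interface
            f.write(record)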
