Office of Nuclear Material Safety and Safeguards
State Agreement Procedure Approval

Reviewing the Common Performance Indicator,
Status of Materials Inspection Program

Interim State Agreements (SA) Procedure SA-101

Issue Date: December xx, 2019
Review Date: December xx, 2024

Michael C. Layton, Director, NMSS/MSST and Rulemaking Programs
Signature: _________________________ Date: __________

Paul Michalak, Branch Chief, NMSS/MSST/SALB
Signature: _________________________ Date: __________

Joe O’Hara, Procedure Contact, NMSS/MSST/SALB
Signature: _________________________ Date: __________

MLXXXXXXXX
NOTE
Any changes to the procedure will be the responsibility of the NMSS Procedure Contact. Copies of NMSS procedures are available through the NRC Web site at https://scp.nrc.gov
I. INTRODUCTION
This document describes the procedure for conducting reviews of the U.S. Nuclear Regulatory Commission (NRC) and Agreement State radiation control programs using the common performance indicator, Status of Materials Inspection Program [Management Directive (MD) 5.6, Integrated Materials Performance Evaluation Program (IMPEP)].
II. OBJECTIVES
A. To verify that initial inspections and inspections of Priority 1, 2, and 3 licensees are performed at the frequency prescribed in NRC Inspection Manual Chapter (IMC) 2800, Materials Inspection Program.
B. To verify that candidate licensees working under reciprocity are inspected in accordance with the criteria prescribed in IMC 2800 or compatible policy developed by Agreement State programs using a similar risk-informed performance-based approach.
C. To verify that deviations from inspection schedules are approved by program management and that the reasons for the deviations are documented.
D. To verify there is a plan to perform any overdue inspections and reschedule any missed or deferred inspections. To determine that a basis has been established for not performing any overdue inspections or for not rescheduling any missed or deferred inspections.
E. To verify that inspection findings are communicated to licensees within 30 calendar days (45 calendar days for a team inspection) after inspection completion, as specified in IMC 0610, Nuclear Material Safety and Safeguards Inspection Reports, and IMC 2800.
III. BACKGROUND
Periodic inspections of licensed activities are essential to ensure that activities are conducted in compliance with regulatory requirements and consistent with good safety and security practices. Inspection frequency, designated by a priority code, is based on the relative risk of the radiation hazard of the licensed activity. For example, a Priority 1 licensee presents the greatest risk to health and safety of workers, members of the public, and the environment; therefore, Priority 1 licensees require the most frequent inspections. Information regarding the number of overdue inspections is a significant measure of the status of a radioactive materials inspection program.
IV. ROLES AND RESPONSIBILITIES
A. Team Leader
1. In coordination with the IMPEP Program Manager, determines which team member is assigned lead review responsibility and assigns other team members to provide support, as necessary.
2. Communicates the team’s findings to Program Management and ensures that the team’s findings are in alignment with Management Directive (MD) 5.6, Integrated Materials Performance Evaluation Program (IMPEP).
B. Principal Reviewer
1. Reviews relevant documentation, conducts management and staff discussions, and maintains a summary of all statistical inspection information received.
2. Calculates the percentage of overdue Priority 1, 2, and 3 and initial inspections in accordance with Appendix A of this procedure.
3. Verifies that reciprocity inspections are completed in accordance with the NRC’s Inspection Manual Chapter 2800, Materials Inspection Program.
4. Reviews inspection communications sent to licensees to verify that findings are communicated to the licensee in accordance with the NRC’s Inspection Manual Chapter 2800, Materials Inspection Program.
5. Informs the Team Leader of their findings throughout the review.
6. Completes their portion of the IMPEP report for the performance indicator reviewed.
7. Attends the IMPEP Management Review Board meeting and is prepared to discuss their findings (this can be done either in-person or via teleconference).
V. GUIDANCE
A. Scope
1. The IMPEP Team should follow the guidance provided in SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP), regarding discussions related to this indicator with inspectors, supervisors, and managers. If performance issues are identified by the reviewer(s) that lead to programmatic weaknesses, the IMPEP Team should seek to identify the root cause(s) of the issues, which can be used as the basis for developing recommendations for corrective actions. As noted in Section II.A.3 of SA-100, that procedure contains criteria regarding the development of recommendations by the IMPEP team.
2. In terms of general guidance for the IMPEP review team:
a. A finding of "satisfactory" should be considered when none, or only a few or a small number, of the cases or areas reviewed involve performance issues/deficiencies (e.g., inspection, licensing, staffing).
b. A finding of "satisfactory, but needs improvement" should be considered when more than a few or a small number of the cases or areas reviewed involve performance issues/deficiencies in high-risk-significant regulatory areas, but not to such an extent that the finding would be considered unsatisfactory.
c. A finding of "unsatisfactory" should be considered when a majority or a large number of the cases or areas reviewed involve performance issues/deficiencies, especially if they are chronic, programmatic, and/or of high-risk significance.
3. This procedure evaluates the quantitative performance of routine Priority 1, 2, and 3 and initial inspections of the NRC or Agreement State program, and inspections of Priority 1, 2, and 3 reciprocity candidate licensees, since the last IMPEP review.
4. This procedure does not apply to the status of inspections related to the non-common indicators, i.e., the uranium recovery, sealed source and device evaluation, and low-level radioactive waste disposal programs. Refer to SA-108 through SA-110 for specific instructions applicable to non-common indicator reviews.
5. If performance deficiencies are identified, review team members should consider whether the root causes of these deficiencies affect more than the Status of Material Inspection Program Indicator. Issues impacting one performance indicator could also have a negative impact on performance with respect to other indicators. As a general matter, a performance deficiency, and associated root causes, should be assigned to only the most appropriate indicator and not counted against multiple indicators.
B. Evaluation Process
1. The principal reviewer should refer to Part III, Evaluation Criteria, of MD 5.6 for specific evaluation criteria. As noted in MD 5.6, the criteria for a satisfactory program are as follows:
a. Less than 10 percent of initial and high priority licensees (Priority 1, 2, and 3) are inspected at frequencies greater than those prescribed in IMC 2800 or a compatible Agreement State procedure.
b. Inspection findings are communicated to the licensee according to the criteria prescribed in IMC 2800 or a compatible Agreement State procedure.
c. Reciprocity inspections are performed in a manner that meets the requirements identified in IMC 2800 and applicable guidance, or compatible Agreement State procedures; or the Agreement State program has developed and successfully implemented an alternative policy for reciprocity inspections in lieu of IMC 2800 and applicable guidance, using a similar risk-informed, performance-based approach for determining which reciprocity licensees are candidates for inspection.
2. The principal reviewer should examine any information on the status of routine Priority 1, 2, and 3 and initial inspections, and Priority 1, 2, and 3 reciprocity inspections, completed by the NRC or Agreement State program during the review period.
a. If available, the principal reviewer should examine the inspection information contained in the program’s database. Information can be obtained through Web Based Licensing (WBL) by running a query against the licensing actions and inspection activities; or
b. If the program does not have a database, or such data cannot be easily retrieved or provided, the reviewer should examine a representative number of Priority 1, 2, and 3 and candidate reciprocity inspection records, as well as other relevant documents involving inspection findings, to cross-reference and verify information, using the following guidance:
i. All inspections performed since the last IMPEP review are candidates for review.
ii. The principal reviewer should perform a risk-informed sample of the program’s inspections based on safety and security significance. The selected inspection casework should focus on the program’s highest-risk licensees. The use of risk-informed sampling, rather than random sampling, maximizes the effectiveness of the review of casework. The sampling should also ensure inclusion of the full range of Priority 1, 2, and 3 modalities licensed by the NRC and Agreement States (e.g., industrial, medical, academic), as well as a representative sample of security inspections of high-risk IAEA Category 1 and 2 sources and service provider licensees.
3. As part of the evaluation criteria for this indicator, the principal reviewer will determine the percentage of overdue Priority 1, 2, and 3, and initial inspections for the review period. Appendix A contains guidance for the overdue inspection calculation with a sample worksheet for use by the principal reviewer.
a. Routine inspections are considered overdue if they exceed the IMC 2800 frequencies plus the following applicable maximum window:
i. Priority 1 inspections completed greater than six months past the inspection due date;
ii. Priority 2 inspections completed greater than one year past the inspection due date; and
iii. Priority 3 and 5 inspections, and telephonic contacts of Priority T licensees, completed greater than one year past the inspection or contact due date.
b. Initial inspections are normally considered overdue if the inspections are performed greater than 12 months after the date of issuance of the license; however, if the licensee does not yet possess licensed materials or has not yet performed any principal activities, the initial inspection may be rescheduled to within 18 months of license issuance.
c. Reciprocity inspections are evaluated separately and should not be included in the calculation.
d. The principal reviewer should use the information and definitions in IMC 2800 when determining the status of inspections. If the NRC or Agreement State defines overdue inspections using different definitions, a reasonable attempt should be made to make the calculation using the information and definitions from IMC 2800. This may be achieved by reviewing inspection casework files and applying the information to the worksheet in Appendix A. If the reviewer is unable to calculate the status of inspections using the information and definitions in IMC 2800, the reviewer may use the NRC or Agreement State's data or information but must note the differences in terminology or definitions in the IMPEP report.
4. The principal reviewer should attempt to ascertain the reason(s) for any overdue inspections. This can be accomplished through discussions with individual inspectors as well as Program management.
5. The review should include an assessment of the issuance of inspection findings. In most cases, inspection findings should be provided to licensees within 30 days of the exit meeting with the licensee (45 days for a team inspection), or within the time period specified in the compatible Agreement State procedure.
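The timeliness check is simple date arithmetic. The following is a minimal sketch in Python (illustrative only, not part of the procedure; the function names are invented, and it assumes the exit meeting date is the inspection completion date recorded on the Appendix A worksheet):

from datetime import date, timedelta

def findings_deadline(exit_meeting: date, team_inspection: bool = False) -> date:
    # 30 calendar days after the exit meeting, or 45 for a team inspection.
    return exit_meeting + timedelta(days=45 if team_inspection else 30)

def days_late(findings_issued: date, exit_meeting: date,
              team_inspection: bool = False) -> int:
    # Positive result = days past the deadline; zero or negative = on time.
    return (findings_issued - findings_deadline(exit_meeting, team_inspection)).days

# Second sample row of the Appendix A worksheet: inspection completed 7/3/14,
# findings issued 8/20/14 -> 18 days past the 30-day deadline.
print(days_late(date(2014, 8, 20), date(2014, 7, 3)))  # prints 18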
6. The performance of reciprocity inspections of Priority 1, 2, and 3 candidate licensees should be evaluated in comparison to the criteria in IMC 2800 or an alternative compatible Agreement State procedure.
7. While this indicator primarily focuses on quantitative performance, review of this indicator should also include a qualitative evaluation of an Agreement State Program’s inspection frequencies. If the Agreement State Program deviates from the frequencies established in IMC 2800, the principal reviewer should evaluate what, if any, health, safety, and/or security impacts have occurred as a result of the deviation. Additionally, the principal reviewer should ensure that documentation exists justifying the deviation in inspection frequency.
8. In applying the criteria, the review team may exercise flexibility in determining the rating for this indicator. The review team should consider the status of the program and any mitigating factors that may have prevented the program from conducting inspections during the review period. The review team’s assessment should include the examination of plans to perform any overdue inspections or reschedule any missed or deferred inspections. The principal reviewer should determine that a basis has been established by the program for not performing any overdue inspections or rescheduling any missed or deferred inspections.
C. Review Guidelines
1. The response generated by the NRC or Agreement State to relevant questions in the IMPEP questionnaire should be used to focus the review.
2. The principal reviewer should be familiar with IMC 2800, which prescribes inspection frequencies. The principal reviewer should also be cognizant of any additional inspection guidance, such as Temporary Instructions, that may describe deviations in inspection frequencies.
3. The principal reviewer should use inspection data provided in the questionnaire and information provided during the on-site review. An Agreement State program should not be penalized for failing to meet internally developed inspection schedules that are more aggressive (i.e., licensees or license types inspected more frequently) than those specified in IMC 2800.
4. To evaluate the status of materials and security inspections, the principal reviewer should evaluate the following:
a. The number of Priority 1, 2, and 3, and initial inspections completed overdue during the review period and overdue at the time of the review;
b. The amount of time past the applicable inspection due dates for any Priority 1, 2, and 3, and initial overdue inspections;
c. The reason Priority 1, 2, and 3, and initial inspections were completed overdue or are overdue at the time of the review;
d. The safety or security significance of not performing or deferring any overdue inspections;
e. The timeliness of issuance of inspection findings to licensees;
f. The inspection frequencies used by an Agreement State. The principal reviewer should verify that they are at least as frequent as those listed in IMC 2800. The principal reviewer should document any Agreement State inspection frequencies that are less frequent than those specified in IMC 2800, the Program’s rationale for the longer inspection intervals, and any impacts to health, safety, security, or the environment; and
g. The performance of Priority 1, 2, and 3 reciprocity inspections in accordance with the guidance in IMC 2800, or the details of and justification for the NRC or Agreement State’s alternative reciprocity inspection policy.
D. Review Information Summary
At a minimum, the summary maintained by the principal reviewer should include the following information (a sketch of one way to record these items follows the list):
1. Number of Priority 1, 2, and 3 inspections that were completed on time during the review period;
2. Number of Priority 1, 2, and 3 inspections that were completed overdue during the review period, and the range of time past due the inspections were completed;
3. Number of Priority 1, 2, and 3 inspections that are overdue at the time of the review, and the range of time past due the inspections are at the time of the review;
4. Number of initial inspections that were completed on time during the review period;
5. Number of initial inspections that were completed overdue during the review period, and the range of time past due the inspections were completed;
6. Number of initial inspections that are overdue at the time of the review, and the range of time past due the inspections are at the time of the review;
7. Number of reciprocity licensees that were candidates for inspection for each year of the review period and the number of reciprocity inspections of candidate licensees that were completed during each year of the review period; and
8. Number of inspection findings from Priority 1, 2, and 3 and initial inspections that were issued to the licensees more than 30 days (45 days for a team inspection) after the inspection exit meeting was held, and the amount of time past the 30/45-day deadline that the late inspection findings were sent or remain outstanding. The principal reviewer should also document the reason any inspection findings were issued late.
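One convenient way to keep this summary is as a single record with one field per item above. A minimal Python sketch (the class and field names are illustrative only, not prescribed by this procedure):

from dataclasses import dataclass, field

@dataclass
class InspectionStatusSummary:
    # Items 1-6: counts of Priority 1, 2, and 3 and initial inspections.
    priority_on_time: int = 0            # item 1
    priority_completed_overdue: int = 0  # item 2
    priority_overdue_now: int = 0        # item 3
    initial_on_time: int = 0             # item 4
    initial_completed_overdue: int = 0   # item 5
    initial_overdue_now: int = 0         # item 6
    # Item 7: year -> (reciprocity candidates, candidate inspections completed).
    reciprocity_by_year: dict = field(default_factory=dict)
    # Item 8: days past the 30/45-day deadline for each late finding.
    late_findings_days_over: list = field(default_factory=list)
    # Ranges of time past due, reasons, and other notes for the report.
    notes: list = field(default_factory=list)

The six counts feed directly into the overdue-percentage formula in Appendix A.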
E. Discussion of Findings with Radiation Control Program
The IMPEP team should follow the guidance given in SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP), for discussion of technical findings with staff, supervisors, and management. If performance issues are identified that lead to programmatic weaknesses, the team should seek to identify the root cause(s) of the issues which can be used as the basis for developing recommendations for corrective actions.
VI. APPENDIXES
A. Overdue Inspection Calculation Worksheet
B. Frequently Asked Questions
C. Examples of Less than Satisfactory Findings of a Program’s Performance from Previous IMPEP Reviews
VII. REFERENCES
Management Directives (MD) available at https://scp.nrc.gov.
NMSS SA Procedures available at https://scp.nrc.gov.
NRC/Agreement State Working Groups available at https://scp.nrc.gov.
Appendix A
Overdue Inspection Calculation Worksheet
Guidance for calculating the number of overdue inspections:
1. Inspections considered in the calculation are Priority 1, 2, and 3 inspections and all initial inspections. An inspection will be considered overdue if it falls under one of the following cases:
a. A Priority 1 inspection completed greater than 6 months past the inspection due date (18 months since the start of the last inspection);
b. A Priority 2 inspection completed greater than 12 months past the inspection due date (36 months since the start of the last inspection);
c. A Priority 3 inspection completed greater than 12 months past the inspection due date (48 months since the start of the last inspection); and
d. An initial inspection completed greater than 12 months from the date of license issuance, or greater than 18 months if the licensee did not possess licensed material in the first 12 months.
2. Inspections are always compared to NRC Priorities in IMC 2800.
3. Multiple overdue inspections for the same licensee are counted as a single event. Depending on the Priority, there may be more than one inspection for a specific licensee conducted during the review period. However, if more than one inspection is significantly overdue and/or not yet completed, the principal reviewer should count them as one missed or overdue inspection but should note examples of the overdue ranges for the IMPEP report.
For example, if only one inspection was conducted for a Priority 1 licensee during a four-year period, for the purpose of the overdue inspection calculation, this would be considered one (1) overdue inspection and the reviewer should note the number of months exceeding the 18-month period. Even though the inspection could be overdue 30 months, it would still be counted as one (1) overdue inspection.
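The counting rules in items 1 through 3 can be expressed compactly. The following is a minimal Python sketch (illustrative only; the thresholds are the month totals from item 1, measured from the start of the last inspection, or from license issuance for initial inspections, and item 3's one-event-per-licensee rule is applied):

# Months that may elapse before a routine inspection is counted overdue
# (IMC 2800 frequency plus the maximum window, per item 1 above).
OVERDUE_THRESHOLD_MONTHS = {"P1": 18, "P2": 36, "P3": 48}

def is_overdue(priority: str, months_elapsed: int,
               possessed_material: bool = True) -> bool:
    # Initial inspections: 12 months from license issuance, or 18 months if
    # the licensee did not possess licensed material in the first 12 months.
    if priority == "Initial":
        return months_elapsed > (12 if possessed_material else 18)
    return months_elapsed > OVERDUE_THRESHOLD_MONTHS[priority]

def count_overdue(inspections) -> int:
    # 'inspections' is an iterable of (licensee, priority, months_elapsed).
    # Multiple overdue inspections of one licensee count as a single event.
    overdue_licensees = {lic for lic, pri, months in inspections
                         if is_overdue(pri, months)}
    return len(overdue_licensees)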
4. The percentage of overdue inspections during the review period should be calculated as follows:

% overdue = 100 x (number of Priority 1, 2, and 3 and initial inspections not completed on time per NRC IMC 2800) / (number of Priority 1, 2, and 3 and initial inspections that should have been completed)

To determine the numerator and denominator:

% overdue = 100 x (PCO + PU + ICO + IU) / (PCO + PU + ICO + IU + PC + IC)
Where:
PCO = number of Priority 1, 2, and 3 inspections completed overdue during the review period
PU = number of Priority 1, 2, and 3 inspections overdue at the time of the review
PC = number of Priority 1, 2, and 3 inspections completed on time during the review period
ICO = number of initial inspections completed overdue during the review period
IU = number of initial inspections overdue at the time of the review
IC = number of initial inspections completed on time during the review period
5. The following is a sample calculation:
Say the Program performed 80 Priority 1, 2, and 3 inspections on time during the review period, and ten (10) Priority 1, 2, and 3 inspections were performed overdue during the review period. Additionally, at the time of the review there were two (2) Priority 1, 2, or 3 inspections still overdue. The Program performed ten (10) initial inspections on time during the review period and performed five (5) initial inspections overdue during the review period. At the time of the review, there was one (1) initial inspection still overdue.
PCO = 10
PU = 2
PC = 80
ICO = 5
IU = 1
IC = 10
So:

% overdue = 100 x (PCO + PU + ICO + IU) / (PCO + PU + ICO + IU + PC + IC)
= 100 x (10 + 2 + 5 + 1) / (10 + 2 + 5 + 1 + 80 + 10)
= 100 x 18/108
= 16.7%
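As a cross-check, the formula can be scripted. A minimal Python sketch that reproduces the sample calculation (the function and argument names are illustrative only):

def percent_overdue(pco: int, pu: int, ico: int, iu: int,
                    pc: int, ic: int) -> float:
    # Numerator: inspections completed overdue or still overdue (PCO+PU+ICO+IU).
    # Denominator: all inspections that should have been completed.
    overdue = pco + pu + ico + iu
    total = overdue + pc + ic
    return 100.0 * overdue / total

# Sample calculation from item 5: 18 of 108 inspections -> 16.7 percent.
print(round(percent_overdue(pco=10, pu=2, ico=5, iu=1, pc=80, ic=10), 1))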
INSPECTION STATUS
REVIEWER WORKSHEET
STATE/NRC______________________
Time Period covered by IMPEP Review _____________________________
One entry per inspection
Entry | Licensee Name | License Number | Priority (1, 2, 3, or Initial) | Last Inspection Date (License Issue Date for Initials) | Date Due | Due Date Plus Window (50% window for Priority 1 and 2; 1-year window for Priority 3; no window for initials) | Date Performed | Amount of Time Overdue | Date Inspection Completed | Date Findings Issued | Findings Issued Within 30 Days (45 for Team Inspection)? If Not, Days Over | Notes
0 | Sample company | 12-2345 | 1 | 1/1/13 | 1/1/14 | 7/1/14 | 9/1/14 | 2 months | 9/1/14 | 9/15/14 | Yes |
0 | Sample company | 23-4567 | Initial | 5/1/13 | 5/1/14 | N/A | 7/1/14 | 2 months | 7/3/14 | 8/20/14 | No; 18 days |
(Additional blank rows as needed.)
Appendix B
Frequently Asked Questions
Q1: Is there any leniency in counting overdue inspections of Priority 1 and 2 licensees beyond the NRC IMC 2800 frequency plus 50 percent?
A1: No. Priority 1 and 2 inspections completed beyond the 50 percent window should be considered overdue and documented as such in the calculation. Review teams may take other mitigating factors into consideration and describe them in the narrative portion of the report as appropriate.
Q2: If a program inspects a Priority 1 licensee only once in a 3-year period, why do we only count that as one overdue inspection?
A2: IMPEP policy is to credit the program for the inspections they perform yet keep track of how late overdue inspections were eventually conducted. Thus, inspections that “should have been performed” are not double or triple counted in the calculation, but the reviewer should document how late the overdue inspection was performed or if it is still overdue at the time of the review.
Q3: How important is the overdue inspection calculation to the rating for this indicator? For example, what if the number of overdue inspections turns out to be just under or over 25 percent?
A3: The overdue inspection calculation is just one piece of information that the review team uses to determine the appropriate rating for this indicator. Regardless of how close a calculation is to 25 percent (or 10 percent), the review team should take the program’s overall performance involving the other aspects of this indicator, the root cause of the overdue inspections, and the program management’s actions to address the issues into account when determining an appropriate rating for this indicator.
Q4: What if the data necessary to perform the overdue calculation is not easy to get or determine?
A4: In this case, the review team should sample as many inspections as possible to help determine the rating for this indicator and note in the report that only a sampling was performed. This means that the team members will need to pull files and review information from inspection reports. The review team will need to document in the report the values and assumptions used for the overdue calculation based on the sampling. If possible, the review team should include in the report the total number of Priority 1, 2, and 3 and initial inspections conducted by the program during the review period, as well as the number that were overdue for inspection at the time of the review.
Q5: What if a State deviates from the inspection frequencies prescribed in IMC 2800?
A5: Overdue inspections are not determined based on the inspection frequencies established by any Agreement State. The inspection frequencies in IMC 2800 are used as the baseline metric for determining whether an inspection is overdue. A number of Agreement States have more aggressive inspection schedules than those prescribed in IMC 2800. However, in cases where an Agreement State's inspection frequency is less stringent than IMC 2800, the review team should note the difference(s) and determine if there are performance issues as a result. Several Agreement States have set less stringent frequencies for certain categories of licensees. The State needs to have a documented rationale for the difference(s), and the Management Review Board will make the final determination of whether public health and safety are jeopardized by the difference(s).
Q6: What if a State conducted many Priority 1, 2, and 3 and initial inspections overdue during the review period as a result of staff turnover, but has caught up on all the overdue inspections at the time of the review?
A6: If a State has no overdue inspections at the time of the review and has addressed the root cause of the overdue inspections, then there may not be any performance issue and as such, a finding of satisfactory may be appropriate (also taking into consideration the other factors for this indicator). However, if the State has not addressed the root cause of the overdue inspections or has not developed a management plan or other effort to address the issue, then a rating of satisfactory, but needs improvement, or unsatisfactory may be appropriate (also taking into consideration the other factors for this indicator). Additionally, review teams may make specific recommendations to address these types of performance issues.
Q7: For the initial inspections, are only Priority 1, 2, and 3 licensees counted in the calculation?
A7: No. When determining the number of initial inspections performed or overdue, all initial inspections must be included. This includes initial inspections of all priority codes, including Priority 5.
Q8: If the State has an alternative policy for reciprocity inspections, what criteria should the IMPEP team use to determine if the alternative policy is acceptable?
A8: The alternative policy allows States maximum flexibility in crafting a policy unique to their own circumstances to ensure a risk-informed, performance-based approach to inspecting reciprocity licensees. Given the unique circumstances that each State faces, the NRC concluded that setting fixed numerical minimum requirements was not a practical approach.
If an Agreement State does have an alternative policy, the review team should review the State’s policy and consider the following items:
Does the policy have clear criteria to select the candidate licensees that operate under reciprocity?
Does the policy have performance measures to demonstrate that the State is conducting a sufficient number and type of reciprocity inspections to ensure safety and security are protected and that the policy does not create any gaps or duplications in the National Materials Program?
If the State does not meet its performance measure under the alternative policy, was an evaluation conducted and corrective action taken?
Does the policy require a representative sample of high-risk (Priority 1 through 3) licensees, IAEA Category 1 and 2 sources, and service providers to be inspected?
Does the policy take into account the compliance history of the reciprocity licensee?
Does the policy achieve consistency in the selection process by ensuring that the right number of licensees is inspected, rather than a specified percentage of license types?
The IMPEP team should review an Agreement State's alternative policy in the same manner as for States that use the criteria in IMC 2800. The questions above can serve as a guide, but the team should be open to innovative approaches used by the Agreement States to select candidate licensees.
Appendix C
Examples of Less than Satisfactory Findings of a Program’s
Performance from Previous IMPEP Reviews
NOTES:
The effectiveness of a program is assessed through the evaluation of the criteria listed in Section III, Evaluation Criteria, of MD 5.6. These criteria are NOT intended to be exhaustive but provide a starting point for the IMPEP review team to evaluate this indicator. The review team should also take into consideration other relevant mitigating factors that may have an impact on the program’s performance under this performance indicator. The review team should consider a less than satisfactory finding when the identified performance issue(s) is/are programmatic in nature, and not isolated to one aspect, case, individual, etc. as applicable.
This list is not all inclusive and will be maintained and updated in the IMPEP Toolbox on the state communications portal website.
Consideration should be given to a finding of “satisfactory, but needs improvement” when a review demonstrates the presence of one or more of the following conditions:
1. More than 10 percent, but less than 25 percent, of Priority 1, 2, and 3 and initial inspections were completed at intervals exceeding the frequencies prescribed in IMC 2800 or a compatible Agreement State procedure.
2. Inspection findings of non-compliance are not issued to the licensee according to the criteria specified in this procedure or a compatible Agreement State procedure in more than a few, but less than most, of the cases reviewed. (Example: The team identifies that a Program issued 5 of the 30 inspection reports greater than 30 days after the inspection exit. All inspections except one were clear inspections. The team determined that the 4 clear inspection findings were issued late due to a backlog of work on the Program Supervisor’s desk.)
3. A program does not meet the reciprocity inspection criteria defined in IMC 2800 or compatible Agreement State procedure, or its own alternative policy, in one or more calendar years during the review period.
Consideration should be given to a finding of “unsatisfactory” when a review demonstrates the presence of significant performance issues that are determined to be related to one or more of the following conditions:
1. More than 25 percent of Priority 1, 2, and 3 and initial inspections were completed at intervals exceeding the frequencies identified in IMC 2800 or a compatible Agreement State procedure.
2. Inspection findings are not issued to the licensee according to the criteria specified in this procedure or a compatible Agreement State procedure in most cases reviewed. (Example: The team identifies that a Program issued 18 of the 30 inspection reports greater than 30 days after the inspection exit. All inspections except one were clear inspections. The team determined that the 17 clear inspection findings were issued late due to a backlog of work on the Program Supervisor’s desk.)
3. A program does not meet the reciprocity inspection requirements as defined in IMC 2800 or compatible Agreement State procedure, or its own alternative policy, in three or more calendar years during the review period.
Note: This list is not all inclusive and will be maintained and updated in the IMPEP Toolbox on the state communications portal at https://scp.nrc.gov.