Supporting Statement A for
National Institute on Drug Abuse (NIDA)
Adolescent Brain Cognitive Development℠ Study (ABCD Study®) – Audience Feedback Teams
OMB Control Number 0925-0781; Exp 03/31/2027
This is a revision to the original submission, and all changes are highlighted in yellow.
Date: September 12, 2024
Check off which applies:
New
X Revision
Reinstatement with Change
Reinstatement without Change
Extension
Emergency
Existing
Federal Government Employee Information:
Name: Kimberly LeBlanc
Address: 3WFN Room 09C75 MSC 6021, Gaithersburg, MD
Telephone: 301-827-4102
Fax: 301-443-9127
Email: [email protected]
Table of Contents
A. ABSTRACT
A.1 Circumstances Making the Collection of Information Necessary
A.2 Purpose and Use of the Information Collection
A.3 Use of Improved Information Technology and Burden Reduction
A.4 Efforts to Identify Duplication and Use of Similar Information
A.5 Impact on Small Businesses or Other Small Entities
A.6 Consequences of Collecting the Information Less Frequently
A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.9 Explanation of Any Payment or Gift to Respondents
A.10 Assurance of Confidentiality Provided to Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Hour Burden Including Annualized Hourly Costs
A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers
A.14 Annualized Cost to the Federal Government
A.15 Explanation for Program Changes or Adjustments
A.16 Plans for Tabulation and Publication and Project Time Schedule
A.17 Reason(s) Display of OMB Expiration Date is Inappropriate
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions
Attachments
Att A – Parent/Caregiver Phone Screener and Invitation
Att A – Parent/Caregiver Phone Screener and Invitation (Spanish)
Att B – Parent/Caregiver Consent
Att B – Parent/Caregiver Consent (Spanish)
Att C – Parent or Guardian Permission for Teens
Att C – Parent or Guardian Permission for Teens (Spanish)
Att D – Teen Phone Screener
Att E – Teen Assent
Att F – Teen 18+ Consent
Att G – Teen Web Survey
Att H – Parent/Caregiver Web Survey
Att I – Teen Virtual Discussion Group Guide
Att J – Parent/Caregiver Virtual Interview Guide
Att K – Teen Online Bulletin Board Guide
Att L – Parent/Caregiver Online Bulletin Board Guide
Att M – Parent/Caregiver “At-Home” Materials Review
Att N – Sample Recruitment Email
Att O – Privacy Act Memo
Att P – Privacy Impact Assessment
A. ABSTRACT

This document reflects a revision to a package (0925-0781) that was originally submitted for approval in March 2024. The Adolescent Brain Cognitive Development℠ Study (ABCD Study®) is the largest long-term study of brain and cognitive development in children across the United States. Findings from the Study will greatly increase our understanding of the environmental, social, and genetic factors that affect brain and cognitive development. One of the Study’s key missions is to collect data that will inform just and equitable policies to elevate the quality of life for all youth, especially those who have been historically marginalized.
Using surveys, virtual interviews, and virtual discussion groups, this collection will continue to gather feedback on the clarity of, and comfort with, questions and sample collection processes from youth and from parents/caregivers of youth who are peers of the longitudinal ABCD Study cohort (similar age, from racial and ethnic groups represented in the Study). Respondents provide feedback on the test questions only; they are not asked to answer them. This activity is essentially a preliminary pilot test, or “pre-test.” We recruited 36 youth and 15 parent/caregiver team members. For 36 youth feedback team members, we anticipated the need to screen up to 72 parents and 72 youth. The 15 parent/caregiver feedback team members were recruited from the 72 parents/caregivers screened during identification of youth feedback team members. We will continue to conduct up to two information collection activities per feedback team member per year, for a total burden of 172 hours (inclusive of screening).
Recommendations from these pre-test findings continue to help improve the protocol for more successful data collection in the ABCD Study. Pre-testing to ensure effective versions of questions and sample collection procedures means that the Study can gather high-quality data on the links between teens’ experiences and their effects on developing brains and bodies.
The ABCD Study will require up to two rounds of pre-testing per year in order to enhance the participant experience and ensure that data collection is not negatively affected by incomplete or missing responses or by unclear instructions and communications. When feedback on certain stimuli is complete, those stimuli will be removed from the data collection instruments (Attachments G–M) and replaced with new or revised stimuli to reflect the ABCD Study’s evolving pre-testing needs.
The revision to this information collection denotes the plan to update the stimuli included in data collection instruments (web surveys, group discussion guides, online bulletin board guides, or at-home materials review) up to twice per year. This includes adding new stimuli when pre-testing needs are identified, updating stimuli to test revisions to the stimuli, and removing and replacing stimuli when feedback is complete.
Questions included in the instruments assess the following factors, as needed, for each stimulus:
Factors Assessed | Research Questions
Readability & Clarity | Are survey questions, instructions, or study communications clearly written? Are there any survey questions that would be hard to answer as written? Are translations of questions, instructions, or communications clear and accurate? Did revisions (based on feedback) improve readability and clarity?
Inclusiveness | Are survey questions and response choices inclusive of diverse life experiences? Are survey questions and study communications appropriate for teens and parents/caregivers from all backgrounds and identities?
Comprehension | Do feedback team members understand the content of survey questions, instructions, or study communications? Did revisions (based on feedback) improve comprehension?
Acceptability | Is there anything in the survey questions, instructions, or study communications that may be unacceptable to the target audiences (teens, parents/caregivers)? Are the examples and definitions provided complete and relevant to the target audiences? Do study communications appeal to the target audiences?
Revisions will not increase the burden estimate for any instrument (as described in section A.12).
A.1 Circumstances Making the Collection of Information Necessary

The potential for public benefit to be achieved through research data from the ABCD Study is significant. The extensive information collected by the ABCD Study and subsequently made available provides a rare and valuable resource to the scientific community. The National Institutes of Health aims to continue to collect feedback and recommendations for improving a selection of ABCD Study survey questions and Study communications materials. Feedback from individuals on the questions themselves (rather than answers to the questions) informs the ABCD Study protocol, refining questions to provide a better experience for ABCD Study participants and stronger Study results.

Authority for the collection of the information requested from recipients comes from the authorities regarding the establishment of the National Institutes of Health, its general authority to conduct and fund research and to provide training assistance, and its general authority to maintain records in connection with these and its other functions (42 U.S.C. 203, 241, 289l-1 and 44 U.S.C. 3101), and Sections 301 and 493 of the Public Health Service Act.
A.2 Purpose and Use of the Information Collection

The ABCD Study team identified that some questions about important events that influence cognitive development for adolescents can be potentially sensitive or challenging to answer (for example, experiencing violence or discrimination). The team also identified parts of the data collection process that could benefit from feedback on how to improve clarity or address questions/concerns Study participants may have (for example, obtaining consent to collect biological or genetic samples). This feedback activity will continue to solicit feedback from youth and parents/caregivers of youth who are peers of the Study cohort (similar age, from the racial and ethnic groups represented in the Study) on clarity of and comfort with questions and communications planned for the upcoming Study year.
Recommendations from this feedback help the Study team improve their protocol for a more successful data collection in the larger Study. For example, suggestions from past audience feedback have made questions about experiences with discrimination more inclusive of the types and instances of discrimination teens can face. Audience feedback can also be used to understand why some topics/questions have lower response rates and subsequent missing data (when Study participants declined to answer) and to generate ideas on better ways to collect that information. Finally, this audience feedback approach helps the Study include perspectives on questions and topics from members of various racial and ethnic groups by intentionally recruiting individuals from those backgrounds.
Given the utility of audience feedback for the ongoing implementation of the Study, this information collection approach continues to support a cohort of audience feedback team members who are familiar with the ABCD Study and participate in multiple feedback activities. By using a cohort approach, the overall burden required to familiarize new participants with the purpose of the Study and the expectations for audience feedback is minimized.
Audience feedback activities include a mix of asynchronous and scheduled, live data collection: web-based survey activities, virtual discussion boards, individual interviews, and discussion groups. Building on experience from past audience feedback exercises1, this approach maximizes the benefits of self-guided and moderator-guided data collection: participants are briefed on background and materials during asynchronous data collection and provide in-depth feedback during moderator-guided, scheduled sessions.
Feedback team members are not asked to provide any personal information in response to the questions in the protocol; instead, they give feedback on things to consider when the ABCD Study protocol is being implemented. (For example, whether questions are missing potential answer choices or if question wording may be offensive or unclear.) Insights from these discussions are then used to refine the fielding of the upcoming ABCD Study protocol.
Adults who care for a child the same age as the cohort of ABCD Study participants were recruited for the parent/caregiver feedback teams. Youth feedback team members were recruited through parents/caregivers, since parents or legal guardians needed to give permission for any youth under age 18 to participate. A recruitment firm (Hagen-Sinclair), using an opt-in panel of potential participants, identified and recruited participants reflecting a mix of genders (e.g., male, female, another gender identity), racial and ethnic groups, caregiving roles (e.g., mother, father, legal guardian), household incomes, parental education levels, and geographic areas. This sample is not intended to be representative of the U.S. population or of the individuals who take part in the ABCD Study. It is responsive to characteristics of the Study population (such as the large percentage of mothers who participate in the parent/caregiver component) but intentionally over-samples groups that may make up a small proportion of Study participants.
Following OMB approval in March 2024, NIDA fielded a teen survey and held qualitative discussions with teens and with parents/caregivers. Findings from that research informed updates to the tested stimuli. Several other groups of survey questions are no longer being asked of ABCD Study participants and thus no longer need to be tested. New survey questions were identified that would benefit from pre-testing with the intended audiences. This revision updates Attachments G through M.
Up to twice a year, we will need to update stimuli in Attachments G through M to test new survey questions and study materials. We will submit change memos to OMB up to twice a year to communicate about the non-substantive revisions to the information collection instruments.
A.3 Use of Improved Information Technology and Burden Reduction

All data collection takes place virtually to allow participation from any part of the U.S. and minimize the burden needed to join a discussion. To reduce burden from screening, we recruited a cohort of audience feedback teams in the first year of this information collection (Spring 2024). Feedback team members will continue to take part in no more than two (2) audience feedback activities (e.g., discussion group, interview, online bulletin board) per 12 months. If any team members later decide to leave the audience feedback team, we may screen additional individuals to ensure enough participants are available for that year’s audience feedback activities.
Any web survey feedback on the protocol questions to be tested before group discussions will be collected using Qualtrics, a secure, FedRAMP-authorized, internet-based survey platform. Qualtrics provides enterprise-grade security features, including data encryption, redundancy, continuous network monitoring, and single sign-on (SSO). Qualtrics is designed to work with both mobile devices and desktops, enabling users to complete the survey at their convenience. The instrument and survey software use computer-generated skip patterns to reduce respondents’ overall burden, ensuring they only see questions that are relevant to them.
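To illustrate how skip patterns limit respondents to only the questions relevant to them, the minimal sketch below shows generic branching logic in Python; the question identifiers and display conditions are hypothetical examples and are not drawn from the actual instruments or the Qualtrics survey flow.

```python
# Minimal illustration of skip-pattern logic: a respondent only sees questions
# whose display condition is satisfied by their earlier answers.
# Question IDs and conditions are hypothetical examples, not ABCD instrument content.

def remaining_questions(answers: dict) -> list:
    """Return IDs of unanswered questions whose display conditions are met."""
    flow = [
        ("Q1_role", lambda a: True),                                      # always shown
        ("Q2_teen_item_feedback", lambda a: a.get("Q1_role") == "teen"),
        ("Q3_parent_item_feedback", lambda a: a.get("Q1_role") == "parent"),
        ("Q4_clarify_wording", lambda a: a.get("Q2_teen_item_feedback") == "unclear"),
    ]
    return [qid for qid, shown_if in flow if qid not in answers and shown_if(answers)]

if __name__ == "__main__":
    # A parent respondent is never shown the teen-only items, reducing burden.
    print(remaining_questions({"Q1_role": "parent"}))  # ['Q3_parent_item_feedback']
```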
Individual interviews and group discussions are held on ZoomGov, a FedRAMP-authorized videoconferencing provider. ZoomGov integrates with the commercially available video conferencing software Zoom, allowing feedback team members to join the secure videoconference from app- or web-based platforms on their own device. All interviews and groups are recorded so that transcripts can be made available for data analysis. Assessing body language and facial expressions (as captured by note-takers during live sessions) is also part of the data gathering and observation process, and feedback team members are asked to turn on their video cameras as part of their participation. The live video stream is terminated immediately at the end of each session.
Asynchronous qualitative data collection uses virtual discussion boards. Participants in these asynchronous discussions answer questions over a fielding period (generally 48-72 hours) using their desktop, laptop, mobile, or tablet device. Questions include multiple-choice polls, open-ended text responses, and mark-up activities where participants can add comments to images or text excerpts. Feedback team members can answer questions, reply to moderator probes, and comment on others’ responses at times of their own choosing, with the total participation time not to exceed the burden estimate.
A privacy impact assessment was conducted in March 2023.
A.4 Efforts to Identify Duplication and Use of Similar Information

These discussion groups collect information not captured elsewhere. The Study determines the inclusion of questions for each year of the protocol, and this is the only systematic audience feedback activity that tests multiple parts of the protocol ahead of fielding with members of the Study cohort. OMB-approved information collection requests were also reviewed to identify potential areas of duplication across the federal government. The National Opinion Research Center (NORC) at the University of Chicago, under contract to ABCD, is conducting an audience feedback data collection activity related to sexual and gender minority (SGM) terminology. To avoid duplication, our data collection activity will not include exploration of sexual and gender minority questions.
A.5 Impact on Small Businesses or Other Small Entities

This data collection does not involve small businesses.
A.6 Consequences of Collecting the Information Less Frequently

For each year of the ABCD Study, the Study Team determines which measures, protocols, and topics are included in that year’s data collection and which of those items should be tested with the audience feedback teams. Audience feedback activities must take place at least once per year in order to meet the schedule of the ABCD Study data collection.

There are no legal obstacles to reducing the burden.
A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This information collection fully complies with 5 CFR 1320.5(d)(2). No special circumstances are associated with this information collection that would be inconsistent with the regulation.
A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A.8.1 Federal Register Notice

In accordance with 5 CFR 1320.8(d), a 60-day notice soliciting public comment was published in the Federal Register on October 2, 2023 (88 FR 67775). No public comments were received.
A.8.2 Efforts to Consult Outside the Agency
Outside of the agency, the chairs of the ABCD Study Working Groups (made up of participating investigators from Study sites across the country) have been consulted on instrument development. The full list of ABCD Study Working Groups can be found on the ABCD Study website at: https://abcdstudy.org/scientists/workgroups/.
A.9 Explanation of Any Payment or Gift to Respondents

To encourage participation and convey appreciation for contributing to this important project, parent/caregiver and youth feedback team members receive the following tokens of appreciation, based on the type of activity completed:
Audience Feedback Activity | Token Amount
Teen Web Survey Activity (30 minutes) | $20
Teen Virtual Discussion Group or Online Bulletin Board (60 minutes) | $50 (re-engagement bonus: $5)
Parent/Caregiver Web Survey Activity (30 minutes) | $40
Parent/Caregiver Virtual Interview (30 minutes) | $50 (re-engagement bonus: $10)
Parent/Caregiver Online Bulletin Board (60 minutes) | $100 (re-engagement bonus: $10)
Parent/Caregiver “At-Home” Materials Review (15 minutes) | $15
Tokens of appreciation are disbursed in the form of electronic payments equivalent to a paper check. These amounts are consistent with amounts offered for similar, ongoing information collection efforts with parents/caregivers2,3 and youth.3,4 To incentivize participation over time, the project team offers a re-engagement bonus of $5 and $10 for youth and parent/caregiver team members, respectively, for any subsequent audience feedback activity after their first. (This strategy has been used in previous information collection with several follow-up surveys.5)
The project team worked with a professional recruitment vendor to recruit participants to take part in the audience feedback activities. This professional recruitment vendor builds and manages its own database of thousands of potential participants, each of whom has voluntarily opted-in to be part of the vendor's panel. In joining the panel, each of these individuals has agreed to be contacted about upcoming projects of potential interest. Based on past audience feedback activities for the ABCD Study6, the project team may also engage with non-profit organizations to help with recruitment of individuals from groups that may be difficult to reach through opt-in panels (such as American Indian and Alaska Native youth and parents). Once engaged, those individuals are referred to the recruitment vendor who will manage the consent process, reminders, and disbursing tokens of appreciation.
A.10 Assurance of Confidentiality Provided to Respondents

When participants are recruited for these audience feedback teams, they are made aware that the teams include multiple feedback opportunities over several years and that their contact information will need to be retained for as long as they are interested in participating. During any data collection activity, participants are allowed to skip any question (i.e., limit the data they share). Participants receive reminders only for the purpose of completing any pre-discussion activity (for youth), attending their scheduled session, or participating in the discussion board (for parents/caregivers). Feedback team members are only re-contacted with notifications about upcoming feedback opportunities.
Access to personally identifiable information is limited only to those staff who need access to carry out recruitment, scheduling, obtaining and verification of consent/assent forms, and disbursement of incentives. Similarly, access to project data (even de-identified) is limited to the vendor and NIH/NIDA project staff. Feedback team members are made aware during the consent process that the information they share will be used by the ABCD Study team, and that NIH/NIDA project staff may wish to observe a data collection session (but only with participant permission). During real-time data collection activities, the moderator will remind feedback team members to only use first names or nicknames and avoid mentioning potential identifiers (such as the name of their town or school). Any potentially identifying information mentioned in the data is redacted in the dataset, transcripts, and notes.
Given the national recruitment reach, it is extremely unlikely that any team members know each other outside of the audience feedback teams.
A.11 Justification for Sensitive Questions

The ABCD Study asks about topics that may be sensitive (such as sexual behavior, substance use, and experience of violence) because these topics can affect or be affected by the cognitive development of adolescents. Soliciting feedback to create the most effective versions of these questions and Study communications materials means that the Study can gather high-quality data on the links between those experiences and their effects on developing brains and bodies.
For this audience feedback activity, we do not ask any participants to answer Study questions. Instead, we ask for their thoughts and feelings about the questions themselves, if any changes need to be made to those questions, and if they would be comfortable responding. Although we are not asking participants to share their personal experience with things like discrimination, violence, or crime, there is the chance that seeing a mention of those subjects may bring up negative memories of those experiences. All feedback collection instruments (e.g., consent forms) explain the types of questions feedback team members will be asked to review, why those questions are being asked, and that team members can choose to decline to provide feedback on specific questions or topics with no penalty.
A.12 Estimates of Hour Burden Including Annualized Hourly Costs

Based on past audience feedback activities, we recruited 36 youth and 15 parent/caregiver team members to ensure that sufficient individuals are available to participate in any data collection session. The recruitment vendor invited parents who are members of an existing panel and who responded to a call for parents of 17- and 19-year-olds. The vendor asked for permission to collect demographic information and to contact the youth respondent. For 36 youth feedback team members, we needed to screen up to 72 parents and 72 youth. The 15 parent/caregiver feedback team members were recruited from the 72 parents/caregivers screened during identification of youth feedback team members.
Table 12-1 Estimated Annualized Burden Hours
Form Name | Type of Respondents | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in hours) | Total Annual Burden Hours
Att A – Parent or Caregiver Phone Screener and Invitation | Individuals | 72 | 1 | 5/60 | 6
Att B – Parent or Caregiver Consent | Individuals | 15 | 1 | 5/60 | 1
Att C – Parent or Guardian Permission for Teen Participation | Individuals | 36 | 1 | 5/60 | 3
Att D – Teen Phone Screener | Individuals | 72 | 1 | 5/60 | 6
Att E – Teen Assent or Att F – Teen 18+ Consent | Individuals | 36 | 1 | 10/60 | 6
Att G – Teen Web Survey | Individuals | 36 | 2 | 30/60 | 36
Att H – Parent or Caregiver Web Survey | Individuals | 15 | 2 | 30/60 | 15
Att I – Teen Virtual Group Discussion Guide or Att J – Teen Online Bulletin Board Guide | Individuals | 36 | 2 | 1 | 72
Att K – Parent or Caregiver Virtual Interview Guide | Individuals | 15 | 1 | 30/60 | 8
Att L – Parent or Caregiver Online Bulletin Board Guide | Individuals | 15 | 1 | 1 | 15
Att M – Parent or Caregiver “At-Home” Materials Review | Individuals | 15 | 1 | 15/60 | 4
Total | | | 450 | | 172
Table 12-2. Annualized Cost to Respondents
Type of Respondent | Total Burden Hours | Hourly Wage Rate* | Total Respondent Costs**
Parent/Caregiver | 52 | $31.48 | $1,637
Youth | 120 | $7.25 | $870
Total | 172 | | $2,507
*Source: Parents/Caregivers: Bureau of Labor Statistics, May 2023 National Occupational Employment and Wage Estimates, mean Hourly Wage Rate for “All Occupations” - https://www.bls.gov/oes/current/oes_nat.htm#00-0000; Youth: Federal Minimum Hourly Wage rate
** Total rounded to the nearest whole dollar.
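The respondent cost figures in Table 12-2 follow directly from multiplying total burden hours by the applicable wage rate and rounding to whole dollars; a minimal check of that arithmetic, using only the values shown above, is sketched below.

```python
# Illustrative check of the annualized respondent cost arithmetic in Table 12-2.
# Hours and wage rates are copied from the tables above; no other values are assumed.

burden = {
    "Parent/Caregiver": {"hours": 52, "wage": 31.48},  # BLS mean hourly wage, all occupations
    "Youth": {"hours": 120, "wage": 7.25},             # federal minimum hourly wage
}

total_hours = sum(row["hours"] for row in burden.values())
total_cost = sum(round(row["hours"] * row["wage"]) for row in burden.values())

print(total_hours)  # 172
print(total_cost)   # 1637 + 870 = 2507
```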
A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers

No capital, start-up, operating, or maintenance costs are associated with this information collection.
A.14 Annualized Cost to the Federal Government

The total estimated annualized cost to the Federal Government for this information collection is $22,261, as shown in the table below (rounded to the nearest whole dollar).
Cost Descriptions | Grade/Step | Salary* | % of Effort | Fringe (if applicable) | Total Cost to Gov’t
Federal Oversight | | | | |
Scientific Program Manager | GS-14/3 | $148,689 | 3% | | $4,461
Contractor Cost | | | | | $17,800
Travel | | | | |
Other Cost | | | | |
Total | | | | | $22,261
* The salary in the table above is cited from https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2024/DCB.pdf
A.15 Explanation for Program Changes or Adjustments

To allow for biannual revisions of stimuli as pre-tests are completed and new pre-tests are needed, Attachments G-L have been revised to include more generic questions that apply across stimuli of the same type (visuals, informational study materials and directions, survey questions). The survey instruments (Attachments G and H) have been updated with new instructions for the text- and graphic-highlight tools, which better facilitate respondent feedback while minimizing burden.
Across instruments that included race or ethnicity (Attachments A, D, G, and H), questions have been updated to reflect the combined race/ethnicity question with revised answer choices (as per OMB Statistical Policy Directive No. 15). IRB-stamped versions of the permission, consent, and assent forms (Attachments B, C, E and F) are also submitted in this revision.
Some parent/caregiver research may take place in Spanish. The screener (Attachment A), consent forms (Attachments B and C), and the parent/caregiver virtual interview guide (Attachment K) are available in Spanish as well as English.
To prepare for Fall 2024 information collection and future rounds of information collection, the instruments have been updated as follows:
Instrument | Revisions
Parent or Caregiver Phone Screener and Invitation REVISED (Att. A) |
Parent/Caregiver Consent REVISED (Att. B); Parent/Guardian Permission for Teens REVISED (Att. C) | Submitted updated version with OMB number/expiration date and IRB stamp.
Teen Phone Screener and Invitation REVISED (Att. D) |
Teen Assent REVISED (Att. E); Teen Consent REVISED (Att. F) | Submitted updated version with OMB number, expiration date, and IRB stamp.
Teen Web Survey REVISED (Att. G) | Stimuli updates; additions; other revisions.
Parent/Caregiver Web Survey REVISED (Att. H) | Stimuli updates; additions; other revisions.
Teen Virtual Discussion Group Guide REVISED (Att. I) | Stimuli updates; other revisions.
Parent/Caregiver Virtual Interview Guide REVISED (Att. J) |
Teen Online Bulletin Board Guide REVISED (Att. K) |
Parent/Caregiver Online Bulletin Board Guide REVISED (Att. L) |
Parent/Caregiver “At-Home” Materials Review REVISED (Att. M) | Added OMB number and expiration date.
A.16 Plans for Tabulation and Publication and Project Time Schedule

The project schedule is shown in the table below. Future development and activities are dependent on the timely completion of the proposed information collection.
Table A.16-1 Project Time Schedule
Activity | Time Schedule
Solicit Items from ABCD Study Team for Fall Audience Feedback | June – September
Fall Audience Feedback Data Collection | Approximately October – November
Solicit Items from ABCD Study Team for Spring Audience Feedback Activities | November – January
Spring Audience Feedback Data Collection | Approximately February – March
Analysis & Reporting | April – May
A.17 Reason(s) Display of OMB Expiration Date is Inappropriate

NIDA is not requesting an exemption from displaying the OMB expiration date and is not seeking approval to exclude it from this information collection. The OMB approval number and expiration date will be displayed on the relevant materials associated with the information collection.
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.
1 Audience Feedback to Inform the 7th Year Follow-Up Protocol of the Adolescent Brain and Cognitive Development (ABCD) Study℠ (2023), OMB No. 0925-0766 (Exp. 9/30/2026)
2 Investment Fraud Conversation Starter Kit Testing – Online Panel (Bulletin Board) (2019), OMB No. 3038-0107 (Exp. 5/31/2026)
3 Audience Feedback to Inform the 7th Year Follow-Up Protocol of the Adolescent Brain and Cognitive Development (ABCD) Study℠ (2023), OMB No. 0925-0766 (Exp. 9/30/2026)
4 Survey of Youth Transitioning from Foster Care (2020), OMB No. 0970-0546 (Exp. 4/30/2022)
5 Supporting Youth to be Successful in Life (SYSIL) Study (2023), OMB No. 0970-0574 (Exp. 7/31/2024)
6 Audience Feedback to Inform the 7th Year Follow-Up Protocol of the Adolescent Brain and Cognitive Development (ABCD) Study℠ (2023), OMB No. 0925-0766 (Exp. 9/30/2026)