Apprenticeship Evidence-Building Portfolio Evaluation

OMB: 1290-0041


PART A: JUSTIFICATION FOR APPRENTICESHIP EVIDENCE-BUILDING PORTFOLIO EVALUATION

OMB CONTROL NO.: 1290-0NEW

OMB EXPIRATION DATE: TBD


PART A: JUSTIFICATION

The Chief Evaluation Office of the U.S. Department of Labor (DOL) commissioned the high-priority Apprenticeship Evidence-Building Portfolio evaluation contract to build evidence on apprenticeship, including apprenticeship models, practices, and partnership strategies in high-growth occupations and industries. DOL’s initiatives to expand access to apprenticeship opportunities support the Presidential Executive Order “Expanding Apprenticeships in America.” The portfolio of initiatives addressed by the evaluation includes the Scaling Apprenticeship Through Sector-Based Strategies grants, the Closing the Skills Gap grants, the Youth Apprenticeship Readiness grants, and other DOL investments. The Urban Institute and its partners, Mathematica Policy Research and Capital Research Corporation, were contracted to conduct the study of these efforts.


This package requests clearance for nine data collection instruments for three different studies under the aforementioned Apprenticeship Evidence-Building Portfolio evaluation: 1) an implementation evaluation of the Scaling Apprenticeship and Closing the Skills Gap grants programs to develop typologies of apprenticeship models and practices, identify perceived promising strategies across the portfolio, and better understand the implementation of models to help interpret impact evaluation findings; 2) an assessment of registered apprenticeship state systems and partnerships to provide important information on their capacity to develop, design, modify, implement, sustain, expand/scale up, and evaluate apprenticeship strategies and models; and 3) an implementation evaluation of the Youth Apprenticeship Readiness grant program to understand service delivery design and implementation, and perceived challenges and promising practices. The nine instruments are:

  1. Scaling Apprenticeship and Closing the Skills Gap Grants survey of grantee staff

  2. State System Capacity Assessment semi-structured interview protocol for state staff

  3. State System Capacity Assessment semi-structured interview protocol for local lead organization staff

  4. State System Capacity Assessment semi-structured interview protocol for local partner staff

  5. State System Capacity Assessment semi-structured interview protocol for employer partner staff

  6. Youth Apprenticeship Readiness Grant survey of program staff

  7. Youth Apprenticeship Readiness Grant semi-structured interview protocol for program staff

  8. Youth Apprenticeship Readiness Grant semi-structured interview protocol for program partners

  9. Youth Apprenticeship Readiness Grant semi-structured interview protocol for follow-up with program staff



1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The Department of Labor and industry have invested billions of dollars over the past decade to encourage, develop, and expand industry-driven apprenticeship training nationwide. Much of the federal investment is through program grants and technical assistance. The breadth of apprenticeship investments has resulted in a diverse sectoral, geographic, and institutional mix of apprenticeship programs and projects. This project will build the evidence base on apprenticeship in three ways: a careful review of existing evidence and information; a rigorous implementation study to specify apprenticeship typologies and models that include a range of work-based training; and development of rigorous impact evaluation design options to analyze impacts of various models and strategies.


The Scaling Apprenticeship Through Sector-Based Strategies grants ($183.8 million) and the Closing the Skills Gap grants ($100 million) are the two largest recent federal apprenticeship investments and a primary focus of the proposed project. The Scaling Apprenticeship grant awards, announced in June 2019, focus on accelerating expansion of apprenticeships to more sectors with high demand for skilled workers, namely occupations and industries applying for H-1B worker visas. Closing the Skills Gap awards, announced in fall of 2019, are intended to promote apprenticeship as a method for closing the gap between employer skill demands and the skills of the workforce. The source of funding for both grant programs is fee revenue from Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998, and a substantial portion of grant funds are required to be spent on training activities. In addition, the Youth Apprenticeship Readiness grants ($42.3 million) were awarded in June 2020.


Although the evidence base on apprenticeship in the U.S. is growing, there are still several key knowledge gaps that are ripe for rigorous evaluations and evidence-building. Policymakers, researchers, evaluators, and practitioners are generally persuaded that apprenticeship has positive net benefits, but more evidence is needed on which models work in specific occupational contexts and for particular subgroups of apprentices. Implementation evaluations are needed to better understand what apprenticeship models and components are most effective for apprentices in various industries and occupations. In addition, the implementation evaluation of the Scaling Apprenticeship and Closing the Skills Gap grants complements and will inform the impact evaluation of these grants that the study team is simultaneously conducting for DOL under the same contract.


Citation of sections of laws that justify this information collection: The Scaling Apprenticeship Through Sector-Based Strategies grants, Closing the Skills Gap grants, and Youth Apprenticeship Readiness grants and subsequent evaluations are funded by a portion of H-1B visa fees, which are authorized under Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998, which states that “the Secretary of Labor shall . . . award grants to eligible entities to provide job training and related activities for workers to assist them in obtaining or upgrading employment in industries and economic sectors . . . projected to experience significant growth and ensure that job training and related activities funded by such grants are coordinated with the public workforce investment system (29 USC 3224(a)).”


This is a new collection request associated with the Apprenticeship Evidence-Building Portfolio.

This package requests clearance for nine data collection activities that need to start in June 2022. A timely start to the information collection is critical for conducting the evaluations.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The data collected through the activities summarized in this request will inform: (1) an implementation evaluation of the Scaling Apprenticeship and Closing the Skills Gap grants programs to develop typologies of apprenticeship models and practices, identify perceived promising strategies across the portfolio, and better understand the implementation of models to help interpret impact evaluation findings; (2) a study of registered apprenticeship state systems and partnerships to assess their capacity to develop, design, modify, implement, sustain, expand/scale up, and evaluate apprenticeship strategies and models; and (3) an implementation evaluation of the Youth Apprenticeship Readiness grant program to understand service delivery design and implementation, and perceived challenges and promising practices.


The Apprenticeship Evidence-Building Portfolio evaluation will address the following research questions, organized by study:


Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation


  1. What apprenticeship components, models, partnerships, and strategies have the Scaling Apprenticeship and Closing the Skills Gap grantees designed and/or expanded?

  2. How have the grantees implemented the components, models, partnerships, and strategies?

  3. What components, models, partnerships, and strategies appear promising for supporting positive outcomes for apprentices, businesses, and systems?


State System Capacity Assessment


  1. What is the capacity and structure of state systems to coordinate the design and implementation of Registered Apprenticeship Programs (RAPs) (including pre-apprenticeship)?

  2. How do national and local initiatives support or hinder the capacity of state systems to expand RAPs?

  3. What partnerships do states engage in to support expansion of RAPs? What partnerships seem successful in supporting expansion of RAPs? What are the advantages and challenges in developing and maintaining these partnerships?

  4. How did states change or adapt their RAP models, strategies, and practices during the COVID-19 crisis? What were the perceived successes and challenges states experienced during this time?

  5. What perceived promising strategies are states using to recruit and place individuals into RAPs?

  6. How do states measure the success of RAPs? What outputs and outcomes for apprentices, employers, and other partners are important?

  7. How do states engage employers and industry to expand RAPs? What are the engagement strategies that show promise for expanding RAPs?

  8. What data infrastructure do states have for RAPs? How are data being used for continuous improvement, cost analysis, or evaluating programs for effectiveness?

  9. What strategies are states developing to ensure the sustainability of the RAPs, especially as federal grant funding sunsets? What sustainability plans seem most likely to support long-term operations?


Youth Apprenticeship Readiness Grant Evaluation


  1. How are youth apprenticeship programs designed and implemented by Youth Apprenticeship Readiness grantees?

  2. What specific activities/strategies are sites using to assist youth in learning about, searching for, and securing apprenticeships?

  3. What are the patterns of placement in apprenticeship opportunities (including pre-apprenticeship), and related successes and challenges?

  4. What types of industries are participating in youth apprenticeship and pre-apprenticeship?

  5. What types of financial incentives, if any, are being used?

  6. How do programs plan for long-term operations with and without future federal funding? What are lessons learned that could inform sustainability?


The evidence generated by the evaluations will be relevant not only to the sites and their partners participating in the DOL initiatives, but also to DOL policymakers and administrators assessing current and future apprenticeship initiatives, and to employers, training institutions, and workforce development partners seeking knowledge and evidence about effective models, practices, partnerships, and strategies to improve and scale their apprenticeship systems.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


Information technology, specifically the software program Qualtrics, will be used to program and administer the two survey data collections. This survey software offers a user interface that is modern, secure, and easy to navigate for respondents. The software will also facilitate generation of tabulations of responses as surveys are completed by grantees and processed.


The surveys will be hosted on the Internet via a live secure web-link. To reduce burden, the surveys will employ the following: (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so respondents only see those questions that apply to them (including those based on answers provided previously in the survey), and (4) logical rules for responses so respondents’ answers are restricted to those intended by the question.
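The dynamic-question and skip-pattern behavior described above can be sketched in a few lines of routing logic. This is an illustrative example only; the question IDs and the routing rule are hypothetical, and the actual surveys are programmed in Qualtrics rather than in custom code.

```python
# Minimal sketch of automated skip-pattern routing. Question IDs (Q1-Q6)
# and the rule below are hypothetical, not the actual survey content.
from typing import Optional

ORDER = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6"]


def next_question(current: str, answers: dict) -> Optional[str]:
    """Return the next question to display, applying skip rules."""
    # Dynamic rule: a respondent reporting no employer partners at Q4
    # never sees the (hypothetical) employer-engagement follow-up, Q5.
    if current == "Q4" and answers.get("Q4") == "No":
        return "Q6"
    i = ORDER.index(current)
    return ORDER[i + 1] if i + 1 < len(ORDER) else None


assert next_question("Q4", {"Q4": "No"}) == "Q6"   # follow-up skipped
assert next_question("Q4", {"Q4": "Yes"}) == "Q5"  # follow-up shown
assert next_question("Q6", {}) is None             # end of survey
```

In Qualtrics, the same effect is achieved through its built-in survey flow and display/skip logic; the sketch only illustrates why respondents see just the questions that apply to them.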


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


All nine data collection instruments are collecting new data that are not available through alternative sources.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The information collection does not target small businesses or entities; however, the evaluation grantees could be small organizations, such as businesses or nonprofit organizations. If small businesses are involved, only the minimal amount of data needed for this study will be collected.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


If these one-time data are not collected, DOL will not be able to determine the effectiveness of its apprenticeship investments and the various models, programs, components, and strategies being used. Implementation evaluations of the Scaling Apprenticeship, Closing the Skills Gap, and Youth Apprenticeship Readiness grant programs will provide important information on ways to improve apprenticeship models and approaches. In addition, the implementation evaluation of the Scaling Apprenticeship and Closing the Skills Gap grants complements and will inform the impact evaluation of these grants. An assessment of registered apprenticeship state systems and partnerships will provide important information on their capacity to develop, design, modify, implement, sustain, expand/scale up, and evaluate apprenticeship strategies and models. The evidence generated by the study will benefit DOL and its apprenticeship grantees, as well as federal policymakers and administrators assessing current and future apprenticeship initiatives, and employers, training institutions and workforce development partners seeking knowledge and evidence about effective models, practices, partnerships and strategies to improve and scale their systems.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* Requiring respondents to report information to the agency more often than quarterly;


* Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;


* Requiring respondents to submit more than an original and two copies of any document;


* Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;


* In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;


* Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;


* That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or


* Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


There are no special circumstances for the proposed data collection.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


The 60-day notice to solicit public comments was published in the Federal Register on February 2, 2021 (86 FR 7881). No public comments were received.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


The project includes a Technical Working Group (TWG) to provide substantive feedback throughout the project period. Members of the TWG are listed in Table A.1. They have expertise in research methodology as well as on programs and populations similar to those being served in the apprenticeship grant initiatives.


Table A.1. Technical Working Group Members

Carolyn Heinrich

Patricia and Rodes Hart Professor of Public Policy, Education, and Economics, Vanderbilt University


Susan Helper

Frank Tracy Carlton Professor of Economics at the Weatherhead School of Management, Case Western Reserve University


Chris Magyar

Chief Apprenticeship Officer, Techtonic Inc.


Mary Alice McCarthy

Director of the Center on Education & Skills, New America


Jeffrey Smith

Paul T. Heyne Distinguished Chair in Economics and Richard Meese Chair in Applied Econometrics, University of Wisconsin-Madison


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There are no payments or gifts to program and partner staff, as activities are expected to be carried out in the course of their employment, and no additional compensation will be provided outside of their normal pay.

10. Describe any assurance of privacy provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


All respondents taking part in data collection activities are assured that information collected will be kept private to the extent permitted by law. Depending on the data collection activity, respondents will read a privacy statement at the start of the survey (see Attachments A and F) or be read a privacy statement at the start of phone, virtual, or in-person interviews (see Attachments B through E and G through I). In each activity, participants are informed that all data will be used for research purposes only, will be kept securely, and that individually identifiable data will not be shared with program staff or the Department of Labor. They are also assured that no one will ever publish their name in connection with the information collected, but that their information will be combined with data from other respondents across the study so researchers can describe the overall program effects, participants’ experiences, and program implementation. Further, all respondents are assured that participation is completely voluntary and are given the option of not answering any individual question. The evaluation team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, the evaluation is being conducted in accordance with all relevant regulations and requirements, including those set out by the Urban Institute IRB.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


These studies do not include any questions of a sensitive nature; however, the Urban Institute’s Institutional Review Board (IRB) requires approval for all data collection activities involving surveys and interviews.


12. Provide estimates of the hour burden of the collection of information.

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”


Table A.2 provides the annualized burden estimates for the data collection activities for which this package requests clearance. The evaluation is requesting clearance for a period of three years. Burden estimates are based on the study team’s experience conducting similar data collections.


For the Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation, we will conduct a survey with one grantee staff member for each of the 23 Scaling Apprenticeship grants and 28 Closing the Skills Gap grants. The total number of survey respondents will be 51 grantee staff. The number of these respondents is annualized over three years in Table A.2.


For the State System Capacity Assessment, we will conduct virtual semi-structured interviews with sites across approximately 15 states participating in the study. For each site, we will interview approximately 5 staff in state programs, 6 staff in lead organizations, 10 staff in local partner organizations, and 2 staff with employers. In total, we will interview approximately 75 staff in state programs, 90 staff in lead organizations, 150 staff in local partner organizations, and 30 staff with employers. The number of these respondents is annualized over three years in Table A.2.


For the Youth Apprenticeship Readiness Grant Evaluation, we will conduct a survey with one program staff member for each of the 14 grants. We will also conduct in-person semi-structured interviews with program staff and partner staff of approximately 9 grants selected from those participating in the study. If in-person visits are not possible, the interviews will be conducted virtually. For each grant, we will conduct interviews with approximately 4 program staff and 6 program partner staff, and follow-up interviews with approximately 2 program staff. In total, we will interview 36 program staff and 54 program partner staff, and conduct follow-up interviews with 18 program staff. The number of these respondents is annualized over three years in Table A.2.


Hourly wage for program staff and partners reflects the May 2020 mean hourly wage estimate for “social and community service managers” as reported by the U.S. Department of Labor, Bureau of Labor Statistics.¹



Table A.2. Estimated annualized respondent hour and cost burden

| Type of Instrument | Number of Respondents | Number of Responses Per Respondent | Total Number of Responses | Average Burden Per Response (in hours) | Estimated Burden Hours | Average Hourly Wage¹ | Annual Burden Costs |
|---|---|---|---|---|---|---|---|
| Scaling Apprenticeship and Closing the Skills Gap Grants survey – grantee staff | 17² | 1 | 17 | 3.0 | 51 | $36.13 | $1,842.63 |
| State System Capacity Assessment interview protocol – state staff | 25³ | 1 | 25 | 2.0 | 50 | $36.13 | $1,806.50 |
| State System Capacity Assessment interview protocol – local lead organization staff | 30⁴ | 1 | 30 | 1.0 | 30 | $36.13 | $1,083.90 |
| State System Capacity Assessment interview protocol – local partner staff | 50⁵ | 1 | 50 | 1.0 | 50 | $36.13 | $1,806.50 |
| State System Capacity Assessment interview protocol – employer partner staff | 10⁶ | 1 | 10 | 1.0 | 10 | $36.13 | $361.30 |
| Youth Apprenticeship Readiness Grant survey – program staff | 5⁷ | 1 | 5 | 1.5 | 7.5 | $36.13 | $270.98 |
| Youth Apprenticeship Readiness Grant interview protocol – program staff | 12⁸ | 1 | 12 | 1.0 | 12 | $36.13 | $433.56 |
| Youth Apprenticeship Readiness Grant interview protocol – program partners | 18⁹ | 1 | 18 | 1.0 | 18 | $36.13 | $650.34 |
| Youth Apprenticeship Readiness Grant interview protocol – follow-up with program staff | 6¹⁰ | 1 | 6 | 1.0 | 6 | $36.13 | $216.78 |
| Total | 173 | | 173 | | 234.5 | | $8,472.49 |

¹ Hourly wage for program staff and partners reflects the May 2020 mean hourly wage estimate for “social and community service managers” as reported by the U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates, 2020, “May 2020 National Occupational Employment and Wage Estimates United States” (accessed January 11, 2022: https://www.bls.gov/oes/2020/may/oes_nat.htm).

² Assumes 23 Scaling Apprenticeship and 28 Closing the Skills Gap grantee staff, annualized over 3 years.

³ Assumes interviews with 5 staff in state programs in up to 15 states, annualized over 3 years.

⁴ Assumes interviews with 6 staff in lead organizations in up to 15 states, annualized over 3 years.

⁵ Assumes interviews with 10 staff in local partner organizations in up to 15 states, annualized over 3 years.

⁶ Assumes interviews with 2 staff with employers in up to 15 states, annualized over 3 years.

⁷ Assumes a survey of 14 program staff, annualized over 3 years.

⁸ Assumes interviews with 4 program staff in 9 sites, annualized over 3 years.

⁹ Assumes interviews with 6 program partner staff in 9 sites, annualized over 3 years.

¹⁰ Assumes follow-up interviews with 2 program staff in 9 sites, annualized over 3 years.
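The annualized figures in Table A.2 follow mechanically from the per-instrument counts, burden hours, and the $36.13 hourly wage. As an illustrative cross-check (not part of the information collection itself), the table's totals can be recomputed from its rows:

```python
# Recompute the Table A.2 totals from its per-instrument rows.
# Figures are taken directly from the table; Decimal with half-up
# rounding mirrors conventional currency rounding.
from decimal import Decimal, ROUND_HALF_UP

WAGE = Decimal("36.13")  # May 2020 BLS mean wage, social and community service managers

# (annualized respondents, average burden hours per response)
instruments = [
    (17, "3.0"),  # Scaling Apprenticeship / Closing the Skills Gap survey
    (25, "2.0"),  # State System Capacity Assessment - state staff
    (30, "1.0"),  # State System Capacity Assessment - lead organization staff
    (50, "1.0"),  # State System Capacity Assessment - local partner staff
    (10, "1.0"),  # State System Capacity Assessment - employer partner staff
    (5,  "1.5"),  # Youth Apprenticeship Readiness survey
    (12, "1.0"),  # Youth Apprenticeship Readiness interviews - program staff
    (18, "1.0"),  # Youth Apprenticeship Readiness interviews - partners
    (6,  "1.0"),  # Youth Apprenticeship Readiness follow-up interviews
]

respondents = sum(n for n, _ in instruments)
hours = sum(Decimal(h) * n for n, h in instruments)
cost = sum(
    (Decimal(h) * n * WAGE).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
    for n, h in instruments
)

assert respondents == 173
assert hours == Decimal("234.5")
assert cost == Decimal("8472.49")
```

Each instrument's count is one response per respondent, so annualized respondents and total responses coincide (173).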



13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There are no direct costs to respondents.


14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


The total cost to the Federal government over three years is $1,737,681, and annualized cost to the federal government is $579,227. Costs result from the following two categories:


  1. The annualized cost to the federal government for the evaluation contractor, the Urban Institute and its partners Mathematica and Capital Research Corporation (Contract Number: DOL-1605DC-19-F-00312), to carry out this evaluation is $566,311. The total cost of the data collection is $698,147 for the base contract and $1,000,787 for the State System Capacity Assessment and Youth Apprenticeship Readiness Evaluation data collection over 3 years. Therefore, the annualized cost is ($698,147 + $1,000,787) / 3 = $566,311.


  2. The annualized cost for federal technical staff to oversee the evaluation is $12,916. This is calculated as follows: an annual level of effort of 200 hours for one Washington, DC-based Federal GS-14 step 4 employee earning $64.58 per hour (see Office of Personnel Management 2021 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2021/DCB_h.pdf). Therefore, the annualized cost is 200 hours x $64.58 = $12,916.


The total annualized cost to the federal government is $579,227 ($566,311 + $12,916= $579,227).


15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


This is a new information collection.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Data collection will begin in June 2022 and will end in September 2024. After data collection, data will be presented in summary formats, tables, charts, and graphs to illustrate the results. The final report for the Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation will be submitted in 2023. Special topics briefs for the State System Capacity Assessment will be submitted in 2022. The final report for the Youth Apprenticeship Readiness Grant Evaluation will be submitted in 2024.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The OMB approval number and expiration date will be displayed or cited on all forms completed as part of the data collection.


18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”


No exceptions are necessary for this information collection.

1 See the U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates, 2020, “May 2020 National Occupational Employment and Wage Estimates United States” (accessed January 11, 2022: https://www.bls.gov/oes/2020/may/oes_nat.htm).
