ERS-MSU cc 1-ss Part A 28Apr23

Generic Clearance for Survey Research Studies

OMB: 0536-0073

SUPPORTING STATEMENT A


PAYING FOR COVER CROPS: DOES EXPERIENCE CHANGE FARMER INCENTIVES? COGNITIVE INTERVIEWS


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The authorizing statute for the Generic Clearance, OMB Control No. 0536-0073 (expiration date 04/30/2025), is 7 U.S.C. 2204 (General duties of Secretary; advisory functions; research and development).


The mission of the USDA Economic Research Service (ERS) is to conduct high-quality, objective economic research to inform and enhance public and private decision making. This Information Collection Request (ICR) is part of a larger project supported by an ERS Strategic Priority Grant that aims to provide new information about farmers’ cover crop practices and willingness to participate in cover crop contracts such as those administered by the Natural Resources Conservation Service (NRCS) and other federal, state, and local agencies. The recent Partnerships for Climate-Smart Commodities invested $2.8 billion in projects to implement climate-smart practices in agricultural production, including incentives to expand long-term adoption of cover crops.


Little is known about the behavioral responses that farmers may have to changes in contracts, such as payment rates, contract length, and the cover crop practice standard. Additional research is needed to better understand how attributes of contract design and participants’ history of program participation affect the payments required to induce new or continued cover cropping. As part of this study, a new survey instrument will be developed to estimate farmers’ preferences for alternate contracts or contract extensions for cover crops. This will allow us to develop estimates of trade-offs between different types of contracts and estimate the potential supply of land for cover crops.


Before implementing a new survey, additional investigation is needed to assess the quality, effectiveness, and accuracy of information collected through the questions. This proposed IC is for conducting cognitive interviews that will be used to pretest a draft survey instrument and support the development of a final instrument. The main focus of the interviews will be on farmers’ perceptions of the alternate contract scenarios to be evaluated in the choice experiment questions. However, the interviews will also assess respondents’ understanding of questions about farm management, cover crop practices, environmental attitudes and values, and demographics as well as identify any difficulties associated with them in order to reduce cognitive burden of the final survey instrument.


Participants will be recruited by contacting NRCS officials.[1] The NRCS officials will provide contact information for farmers who meet eligibility requirements based on experience with cover crops. Additional details on the sample, eligibility requirements, and recruitment are found in Supporting Statement B.


The results of the first set of cognitive interviews will be used to inform subsequent sets of interviews with updates to the survey questions that are being tested, following a non-substantive change to the information collection for each set of interviews. The questionnaire will be finalized through an iterative process of 8-9 rounds of interviews, with 5-6 interviews per round and no more than 45 interviews total. Each interview will be conducted by a member of the project team and attended by at least one other member of the team. At the end of the interview process, we expect to have a nearly final draft of a survey instrument.



  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The purpose of this proposed information collection is to conduct cognitive interviews to pretest a new draft survey instrument in development that will be used to estimate farmers’ willingness to enroll in cover crop contracts. The cognitive interviews will be used to assess respondents’ understanding of survey questions, identify potential difficulties, and ensure that the survey will provide the intended information while minimizing respondent burden.[2] In particular, we will use the interviews to assess understanding of various features of cover crop contracts and the impact of alternate methods of presenting contracts to the respondents.


Our target population is farmers who have adopted or may be likely to adopt cover crops. We will interview primarily row-crop farmers who are engaged with NRCS, and the criteria used to stratify our sample will include prior experience with the Environmental Quality Incentives Program (EQIP) and Conservation Stewardship Program (CSP), as well as prior experience with cover crops. We will aim to conduct cognitive testing with farmers who have used cover crops under EQIP or CSP, farmers who are familiar with NRCS programs but have not used cover crops, and farmers who are not familiar with NRCS programs or cover crops. This will ensure that the survey instrument is meaningful to the entire population of respondents.


The research team will conduct a series of cognitive interviews that will iteratively improve questionnaire design. After each round of individual cognitive interviews, results will be assessed, and non-substantive changes will be made to the instrument.


These qualitative cognitive interviews will be conducted with a convenience sample of row-crop farmers that represent a range of experience with cover crops and a range of experience with federal conservation programs such as EQIP and CSP. The study team will work with NRCS officials to identify a sample of farmers in six Midwestern states selected for their dominant cropping systems and ability to increase cover cropping:

  • Iowa (predominantly corn-soy cropping systems)

  • Indiana (predominantly corn-soy cropping systems)

  • Missouri (significant corn-soy production but with greater crop variability)

  • Michigan (corn-soy cropping systems with significant livestock production)

  • Wisconsin (corn-soy cropping systems with significant livestock production)

  • Kansas (significant corn-soy production with some wheat rotations conducive to adding a cover crop)


We focus on these states because the survey asks about cover cropping in corn-soy systems, and we expect that the ability to use cover crops for livestock forage has a significant impact on adoption. These states are also representative of the geographic area that we will sample for the full information collection (Midwestern states with significant corn-soy production). Farmers will be contacted by telephone and by email for participation in the cognitive interviews. The interviews will be conducted online using screen-sharing technology on ERS computers, and for each individual this will be a one-time collection.


The result of this information collection will be a final survey instrument, to be used in a subsequent information collection following OMB review and approval. Information will be shared with the research team under a cooperative agreement at Michigan State University and with partners at NRCS. Cognitive interview results will not be published and will be referenced in final publications only as a pretesting technique.



  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


Cognitive interviews will be conducted online via screen-sharing technology (Microsoft Teams or Zoom) on ERS computers. The draft survey will be hosted on Qualtrics, a FedRAMP certified and approved platform. Pre-testing respondents will go through a shortened, draft version of the study questionnaire on Qualtrics while sharing a screen with the interviewer. Throughout the interview, the respondent will answer interview questions verbally through the Teams or Zoom call. Allowing interviews to take place online will reduce respondent burden by allowing respondents to choose the time and place to participate in the interview, and it will allow more efficient collection of information from multiple respondents.





  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.



There are several federal surveys that collect information on cover crops and other conservation practice adoption. The research team is not aware of any federal studies that have collected information on farmer preferences for the different features of cover crop contracts using trade-off methods such as choice experiments. Most existing federal surveys that collect cover crop information are not targeted at existing or likely cover crop adopters; rather, they collect broader information about cropping practices and farm management with limited space devoted to cover crop management questions. Since cover crop adoption remains relatively rare, the share of cover crop adopters in most federal agricultural surveys is small, which limits the ability to analyze drivers of cover crop adoption. More importantly, these surveys collect only observational data, restricting what is known to existing contracts, and variation in the drivers of cover crop adoption is not random. The Agricultural Resource Management Survey (ARMS) Phases II and III are annual surveys that collect details on field- and farm-level practices, respectively. However, space devoted to cover crop practices is limited, and only roughly 5-10 percent of total responses report cover crops. The Conservation Effects Assessment Project (CEAP) Survey also collects field-level information on practices but is not conducted annually and also has a small percentage of respondents engaged in cover crops. The Conservation Practice Adoption Motivations Survey (CPAMS) is a farm-level survey of a broader population that explores motivations for adoption and disadoption of various conservation practices. No existing federal information collection focuses specifically on cover crop adopters, and no information collection allows for the estimation of behavioral responses to changes in cover crop contracts.



  5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


For this information collection we will survey individuals who own and operate farms that qualify as small businesses. We will minimize respondent burden by utilizing an online version of the survey, which will allow respondents to choose the time and place for participating in the interview. The online survey will utilize streamlined options and question formatting to reduce time in the survey. Interviews will also be limited to 60 minutes to reduce time burden. In order to ensure that interviews do not exceed 60 minutes, the research team may prioritize and test only sub-sections of the survey. Research on program participation and adoption of cover crops will contribute to broader knowledge of farmer preferences, potential barriers to adoption, and means to increase adoption.





  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Qualitative pretesting of survey instruments through cognitive interviews follows established best practices for development of economic preference studies[3][4] and supports the production of useful statistics from federal data collections.[5]


This information collection will follow federal statistical policy directives on cognitive interviews to assess respondents’ understanding of choice scenarios, ensure the feasibility of meeting study objectives through design of the survey questions, and identify other potential obstacles to implementation of the final survey in a subsequent information collection.


Pre-testing the survey is key to ensure the effectiveness of the final survey and is a common practice in survey methodology. Failure to conduct pretesting through cognitive interviews may result in unnecessary respondent burden, respondent confusion, excessive expenditures, and duplication of efforts. If this study is not conducted, the USDA would lack information about responses to changes in incentives when considering changes to or expansions of current conservation programs, or the development of new programs.



  7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

requiring respondents to report information to the agency more often than quarterly;

requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

requiring respondents to submit more than an original and two copies of any document;

requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


This information collection does not involve any special circumstances. All responses will be one-time responses.



  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.




Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


An FRN is not required for this information collection. ERS and NRCS have an inter-agency agreement, currently being routed for signature by NRCS Chief Terry Cosby, that outlines cooperation and input on the design of the survey instrument.


This study is conducted under a cooperative agreement with researchers at Michigan State University (MSU). Members of the study team at MSU are experts on choice survey design, cognitive interview techniques, and development and implementation of farmer surveys.


  • Dr. Frank Lupi, Professor, Department of Agricultural, Food, and Resource Economics and Fisheries and Wildlife Department, Michigan State University

  • Dr. Scott Swinton, Chairperson and University Distinguished Professor, Department of Agricultural, Food, and Resource Economics, Michigan State University

  • Dr. Matthew Gammans, Assistant Professor, Department of Agricultural, Food, and Resource Economics, Michigan State University

  • Dr. Ying Wang, Postdoctoral Researcher, Department of Agricultural, Food, and Resource Economics, Michigan State University


The project team has consulted with NRCS, including:


  • Amanda Branham, Director, Soil Health Division

  • Betsy Dierberger, USDA NRCS National Agronomist

  • Noller Herbert, Deputy Chief of Science and Technology

  • Julie Suhr-Pierce, National Economist

  • Luis Tupas, Deputy Chief, Soil Science and Resource Assessment

  • Mark Xu, Director, Resource Inventory and Assessment Division



  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Following standard practices in the literature, cognitive interview participants will be provided with payments of $50. Participants will be given a choice to be paid by PayPal or by check from Michigan State University. Incentive payments are recommended by Dillman et al. (2014)[6] to increase response rates during pretesting. This helps to ensure that the pretests are as representative as possible of the target population. Payments of $50 correspond to the estimated respondent cost per hour using the U.S. Bureau of Labor Statistics Occupational Employment and Wages for Farmers, Ranchers, and Other Agricultural Managers.



  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. If the collection requires a system of records notice (SORN) or privacy impact assessment (PIA), those should be cited and described here.


Respondent data will be protected by the Confidential Information Protection and Statistical Efficiency Act of 2018 (CIPSEA). Participants will consent to the study at the start of the cognitive interview. They will be allowed to refuse to provide specific information or terminate the interview at any time. Personal information that could be used to identify individuals will only be used within the cognitive interviews. Names, phone numbers, and emails of respondents will only be used to recruit participants, and they will be destroyed at the end of the study. Contact information for respondents will be treated as PII, and email correspondence with potential respondents will be conducted from a federal email address. Respondents will use the FedRAMP-certified Qualtrics environment configured for the USDA, and all analyses will take place in limited-access folders on CIPSEA-approved USDA servers. Responses will be removed from the Qualtrics environment at the end of the project and kept according to ERS data storage and retention guidelines. In addition, all responses to this study will only be used to improve a survey instrument and will not be referenced in any publication or report.


The following pledge will be placed on all instruments:


The information you provide will be used for statistical purposes only. Your response will be kept confidential and any person who willfully discloses ANY identifiable information about you or your operation is subject to a jail term, a fine, or both. This survey is conducted in accordance with the Confidential Information Protection and Statistical Efficiency Act of 2018, Title III of Pub. L. No. 115-435, codified in 44 U.S.C. Ch. 35 and other applicable Federal laws. 



  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The survey instrument will include a question on gross cash farm income. There will be three broad response categories for this question (less than $350,000; $350,000 to $999,999; $1,000,000 or more). The purpose of this question is to provide information about the circumstances under which cover crop adoption occurs and how small, medium, and large operations may approach decisions differently; the three response categories follow the designations used by USDA to describe farms of these sizes.


This ICR only covers cognitive interviews for pre-testing survey questions. All questions will be asked only for the purpose of ensuring understanding, minimizing future respondent burden, and improving questionnaire design for the final survey instrument which will be implemented pending future OMB review and approval. No information from this collection will be reported or published.


We will provide these explanations to respondents and obtain their written consent prior to the cognitive interviews. They may choose not to answer any question or terminate the interview at any time. All data will be maintained in accordance with CIPSEA and relevant statutes, as well as Departmental and Agency policies.



  12. Provide estimates of the hour burden of the collection of information. The statement should:

Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under ‘Annual Cost to Federal Government’.


Table 1. Estimated Burden Hours for Respondents and Non-Respondents

                                              Respondents                          Non-Respondents
Stage                             Sample Size  Count  Min./Response  Burden Hrs.   Count  Min./Response  Burden Hrs.   Total Burden Hrs.
Participant recruitment                   450     90              5          7.5     360              2         12                 19.5
Invitation to survey                       90     60              5          5        30              2          1                  6
Follow-up (reminder) invitation            60     50              5          4.17     10              2          0.33               4.5
Link to interview                          50     46              2          1.53      4              2          0.13               1.67
Informed consent                           46     45             10          7.5       1              2          0.03               7.53
Interview                                  45     45             60         45         0              0          0                 45
Total                                                            87         70.7                                13.5                84.2


Table 2. Estimated Burden Cost

Group                               Count  Min./Response  Hr./Response  Total Burden Hrs.  Est. Cost Per Hour ($/Hr.)  Total Burden Cost ($)
Respondents                            45             87          1.45              65.25                       47.82               3,120.26
Non-respondents (recruitment)         360              2          0.03              12                          47.82                 573.84
Non-respondents (invitation)           30              7          0.12               3.5                        47.82                 167.37
Non-respondents (follow-up)            10             12          0.2                2                          47.82                  95.64
Non-respondents (link)                  4             17          0.28               1.13                       47.82                  54.20
Non-respondents (informed consent)      1             19          0.32               0.32                       47.82                  15.14
Total                                 450                                           84.2                                            4,026.44

Note: Cost per hour for the sample was derived by using U.S. Bureau of Labor Statistics Occupational Employment and Wages, May 2020, 11-9013 Farmers, Ranchers, and Other Agricultural Managers.



A sample of 450 potential respondents will be identified by NRCS officials from farmers with whom they have an existing relationship. Estimated burden for the recruitment contact is 5 minutes for respondents (90) and 2 minutes for non-respondents (360).


For the 90 respondents to recruitment, the research team will extend an invitation to the interview as well as a follow-up or reminder invitation if necessary. Estimated burden is 5 minutes for respondents to the invitation (60) and 5 minutes for respondents to the follow-up (50 maximum; for the purposes of respondent burden calculation, we estimate 50). For those who drop out during the invitation process, we estimate a total burden of 7 minutes per non-respondent to the invitation (30) and 12 minutes per non-respondent to the follow-up (10).


The 50 respondents to the invitations would receive an email link to a video call to participate in the survey. Estimated burden is 2 minutes for respondents (46). For those that do not join their scheduled interview (4), we estimate a burden of 17 minutes total.


Estimated burden is 10 minutes for respondents to informed consent (45) and for those who choose to terminate the interview before informed consent we estimate a total burden of 19 minutes (1).


In total, the estimated response rate is 10 percent (a sample size of 450 and 45 participants). Respondents would participate in a 60-minute cognitive interview in which the interviewer asks probe questions about the survey instrument to gauge their understanding and the effectiveness of the survey questions. Total burden for each of the 45 respondents across all stages of recruitment and participation is 87 minutes.


The total estimated burden hours for respondents and non-respondents who drop out at each stage of recruitment and invitation are shown in Table 2.


Our recruitment process will take place iteratively. This will reduce respondent burden in two ways: first, by limiting the number of interviews being scheduled at one time, we will ensure that all interviews are scheduled soon after the invitation is sent. Second, we will stop recruitment for interviews once the survey instrument is finalized, so the estimate of 45 interviews represents a maximum.


Respondent cost per hour for the farmer population was derived by using U.S. Bureau of Labor Statistics Occupational Employment and Wages, May 2020, 11-9013 Farmers, Ranchers, and Other Agricultural Managers.[7]


The mean hourly wage for Farmers, Ranchers, and Other Agricultural Managers, as measured by the Bureau of Labor Statistics, is $36.93 (May 2020). Fringe benefits for all private industry workers are an additional 29.5 percent, or $10.89, resulting in a total of $47.82 per hour. The estimated respondent cost is $4,026.44, including recruitment burden for both responses and non-responses.

We do not expect to contact anyone who is not eligible, given the nature of the sampling; therefore, all burden estimates are for those who are either (a) eligible with a completed interview or (b) eligible and not interviewed.
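
For transparency, the figures in Tables 1 and 2 can be reproduced from the per-stage counts and minutes described above. The following is a minimal sketch of that arithmetic; it uses only the values stated in this section, and the script itself is illustrative rather than part of the collection.

```python
# Illustrative check of the burden-hour and cost arithmetic in Tables 1 and 2.
# Stage: (respondent count, minutes per respondent, non-respondent count, minutes per non-respondent)
stages = {
    "Participant recruitment":         (90, 5, 360, 2),
    "Invitation to survey":            (60, 5, 30, 2),
    "Follow-up (reminder) invitation": (50, 5, 10, 2),
    "Link to interview":               (46, 2, 4, 2),
    "Informed consent":                (45, 10, 1, 2),
    "Interview":                       (45, 60, 0, 0),
}

HOURLY_WAGE = 36.93   # BLS mean hourly wage, May 2020, occupation 11-9013
FRINGE_RATE = 0.295   # fringe benefits for all private industry workers
COST_PER_HOUR = round(HOURLY_WAGE * (1 + FRINGE_RATE), 2)  # = 47.82

minutes_per_respondent = sum(m for _, m, _, _ in stages.values())            # = 87 minutes
resp_hours = sum(n * m for n, m, _, _ in stages.values()) / 60               # ~70.7 hours
nonresp_hours = sum(n * m for _, _, n, m in stages.values()) / 60            # ~13.5 hours
total_hours = resp_hours + nonresp_hours                                     # ~84.2 hours

print(f"Cost per hour: ${COST_PER_HOUR:.2f}")
print(f"Minutes per respondent across all stages: {minutes_per_respondent}")
print(f"Respondent burden hours: {resp_hours:.1f}")
print(f"Non-respondent burden hours: {nonresp_hours:.1f}")
print(f"Total burden hours: {total_hours:.1f}")
print(f"Estimated total respondent cost: ${total_hours * COST_PER_HOUR:,.2f}")  # ~$4,026.44
```

Small rounding differences between summing the rounded per-group costs in Table 2 and multiplying total hours by the hourly cost account for any cents-level discrepancy.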



  13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.



  14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


The total cost to the Federal Government for this study is approximately $25,325. This includes $20,000 of cooperative agreement costs with Michigan State University. $2,250 of the cooperative agreement cost is for participant payments (45 x $50), and the remainder is for personnel who are designing and implementing the study. In addition, there is a cost to the USDA of approximately $5,325 (80 hours of ERS staff time for design of the survey instrument, coordination with NRCS, participation in cognitive interviews, and analysis at $66.56 per hour based on the 2023 General Schedule, Grade 13, Step 3, Kansas City locality with 29.9 percent in fringe benefit costs.)
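
As a minimal worked check of these figures, the components stated above can be combined as follows; the script is illustrative only and uses no values beyond those in the preceding paragraph.

```python
# Illustrative breakdown of the estimated cost to the Federal Government.
COOPERATIVE_AGREEMENT = 20_000    # total MSU cooperative agreement cost
PARTICIPANT_PAYMENTS = 45 * 50    # $2,250 of the agreement is participant payments
ERS_HOURS = 80                    # ERS staff time (design, coordination, interviews, analysis)
ERS_HOURLY_COST = 66.56           # 2023 GS-13 step 3, Kansas City locality, incl. 29.9% fringe

ers_cost = ERS_HOURS * ERS_HOURLY_COST        # $5,324.80, rounded to approximately $5,325
total = COOPERATIVE_AGREEMENT + ers_cost      # approximately $25,325

print(f"Participant payments (within agreement): ${PARTICIPANT_PAYMENTS:,}")
print(f"ERS staff cost: ${ers_cost:,.2f}")
print(f"Total cost to the Federal Government: ${total:,.2f}")
```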



  15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


This is a one-time information collection that does not include any program changes or adjustments.



  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Statistics and reports resulting from this information collection will be rudimentary and used internally to inform a final version of the questionnaire. These statistics and reports will not be shared with the public through publications. The final questionnaire will be used in a subsequent information collection pending OMB review and approval.



  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The agency plans to display the expiration date for OMB approval of the information collection on all instruments.



  18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”


The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.

[1] We will work with the NRCS Soil Health Division Director Amanda Branham and NRCS regional soil health specialists Drexel Atkisson, David Doctorian, Candy Thomas, and Stan Boltz to identify potential interview participants. We will also work with other NRCS officials only if necessary to increase our sample size.

[2] 81 FR 29107.

[3] Kaplowitz, M., F. Lupi, and J. Hoehn. "Multiple-methods for developing and evaluating a stated preference survey for valuing wetland ecosystems." Ch. 24 in Questionnaire Development, Evaluation, and Testing Methods (S. Presser et al., eds.), 503-524. Wiley: New Jersey, 2004.

[4] Johnston, Robert J., Kevin J. Boyle, Wiktor Adamowicz, Jeff Bennett, Roy Brouwer, Trudy Ann Cameron, W. Michael Hanemann, et al. "Contemporary guidance for stated preference studies." Journal of the Association of Environmental and Resource Economists 4, no. 2 (2017): 319-405.

[5] 81 FR 29107.

[6] Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons, 2014.

[7] https://www.bls.gov/oes/2020/may/oes119013.htm
