
OMB Number: 1845-0136 Revised: 10/1/2015

RIN Number: XXXX-XXXX (if applicable)

SUPPORTING STATEMENT

FOR PAPERWORK REDUCTION ACT SUBMISSION

Gainful Employment Recent Graduates Employment and Earnings Survey Pilot Test


A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a hard copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information, or you may provide a valid URL link or paste the applicable section.[1] Specify the review type of the collection (new, revision, extension, reinstatement with change, reinstatement without change). If revised, briefly specify the changes. If a rulemaking is involved, make note of the sections or changed sections, if applicable.


The National Center for Education Statistics (NCES) of the U.S. Department of Education (ED) is required by regulation to develop an earnings survey to support gainful employment program evaluations (see 34 CFR 668.406, as specified in final regulations published in the Federal Register in October 2014). NCES is responsible for developing the survey and the technical standards to which programs must adhere in its administration. The first round of cognitive interviews for this development effort and recruitment for the second round were approved separately in January 2015 (#1850-0803 v. 123 and v. 125, respectively); a separate request to conduct the second round of cognitive testing was approved in March 2015 (#1850-0803 v. 130). This request is to conduct a pilot test of the Recent Graduates Employment and Earnings Survey (RGEES). The data collection for this study is being carried out under contract to NCES by RTI International (contract # ED-IES-13-C-0070).


Postsecondary programs subject to the gainful employment regulations may appeal program-level debt-to-earnings ratios calculated by ED (34 CFR Parts 600 and 668). The earnings component of the debt-to-earnings ratio (D/E ratio) is provided by the Social Security Administration, but institutions may calculate an alternative earnings measure by administering a survey to program graduates. Institutions that choose to submit alternate earnings appeal information will survey all students from programs who graduated during the same period that ED used to calculate the D/E ratios, or a comparable period as defined in 668.406(b)(3) of the regulations. The survey will provide an additional source of earnings data for ED to consider before deciding on final D/E ratios for programs subject to the gainful employment regulations. Programs with final D/E ratios that fail to meet the minimum threshold may face sanctions, including the possible loss of Title IV (federal financial aid) program funds.
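For illustration, a minimal sketch of a debt-to-earnings calculation of the kind described above follows. The two-rate structure, the annual amortization terms, and all figures are assumptions drawn from the 2014 gainful employment regulations, not values taken from this statement.

```python
# Minimal sketch of a debt-to-earnings (D/E) calculation; the amortization
# terms, the discretionary-income variant, and the example figures are
# illustrative assumptions, not values from this supporting statement.

def annual_loan_payment(median_debt: float, rate: float = 0.06,
                        years: int = 10) -> float:
    """Amortize median program debt into a level annual payment."""
    return median_debt * rate / (1 - (1 + rate) ** -years)

def de_rates(median_debt: float, annual_earnings: float,
             poverty_guideline: float) -> tuple:
    """Return (annual earnings rate, discretionary income rate)."""
    payment = annual_loan_payment(median_debt)
    earnings_rate = payment / annual_earnings
    discretionary = annual_earnings - 1.5 * poverty_guideline
    disc_rate = payment / discretionary if discretionary > 0 else float("inf")
    return earnings_rate, disc_rate

# Hypothetical program: $9,500 median debt, $24,000 annual earnings.
er, dr = de_rates(9_500, 24_000, poverty_guideline=11_770)
print(f"earnings rate {er:.1%}, discretionary income rate {dr:.1%}")
```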


The regulations specify that the Secretary of Education will publish in the Federal Register a pilot-tested earnings survey and the standards required for its administration. The draft standards are being published for public comment in a separate announcement.

In preparation for this submission, NCES reviewed existing person-level surveys conducted regularly by the United States Census Bureau (Census) and the Bureau of Labor Statistics (BLS) for data collection approaches and item sets, and organized meetings that included these agencies. Representatives from Census included Charles Nelson (Assistant Division Chief of the Economic Characteristics Division) and Alfred Gottschalck (Chief of the Labor Force Statistics Branch). Anne Polivka, Chief of the Statistics Employment Research Division, represented BLS. In addition, Katharine Abraham, former Commissioner of BLS and a recent member of the President's Council of Economic Advisers, was instrumental in revisions to items that are included in this submission. Appendix 1 below provides item source annotations referencing parallel items from the March Current Population Survey (CPS), which is developed and fielded jointly by Census and BLS and used for official poverty statistics, and from the National Longitudinal Survey of Youth (NLSY), fielded by BLS.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The RGEES pilot test will measure unit response rates and enable comparisons to earnings data collected through other surveys and in administrative records. The pilot study results will be used to compare median earnings collected through the survey to median earnings for graduates from comparable programs, based on a match to Social Security Administration records as part of the 2012 gainful employment informational rates. The results of the pilot will also be compared to earnings estimates in the CPS and the American Community Survey (ACS).


If needed based on the pilot test results and public comment, the RGEES will be revised prior to posting on ED’s website on December 15, 2015.


The National Student Loan Data System (NSLDS) is the sample frame for this study. A total of 3,400 sample members will be chosen from among the universe of gainful employment program and for-profit institution graduates who completed their program between July 1, 2009 and June 30, 2011.[2] To facilitate comparisons to the 2012 gainful employment informational rates, sample members will be selected in four categories from the Classification of Instructional Programs (CIP), based on the type of program they completed: cosmetology and related personal grooming services (12.04); somatic bodywork and related therapeutic services (51.35); practical nursing, vocational nursing, and nursing assistants (51.39); and all others.[3]
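The stratified selection just described can be sketched as follows. The NSLDS record fields, the equal allocation across the four strata, and the seed are hypothetical assumptions; the statement does not specify an allocation.

```python
# Hypothetical sketch of drawing the 3,400-member sample from an NSLDS
# extract, stratified by the four CIP-based categories named above.
import random

STRATA = ("12.04", "51.35", "51.39", "other")

def stratum(cip_code: str) -> str:
    """Map a six-digit CIP code (e.g. '12.0401') to its four-digit stratum."""
    prefix = cip_code[:5]
    return prefix if prefix in STRATA else "other"

def draw_sample(frame: list, total: int = 3_400, seed: int = 2015) -> list:
    """frame: records (dicts) with 'id' and 'cip' keys; returns sampled ids."""
    rng = random.Random(seed)
    by_stratum = {s: [] for s in STRATA}
    for rec in frame:
        by_stratum[stratum(rec["cip"])].append(rec["id"])
    per_stratum = total // len(STRATA)  # assumed equal allocation
    sample = []
    for ids in by_stratum.values():
        rng.shuffle(ids)
        sample.extend(ids[:per_stratum])
    return sample
```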


Overall unit response rates for the RGEES are expected to be at least 50 percent of an identified cohort and, in keeping with the draft standards for its administration, a nonresponse bias analysis (NRBA) will be required when unit response rates are less than 80 percent. At the completion of the pilot study, the NRBA will compare respondents and non-respondents within the targeted program areas. The demographic and other characteristics of respondents and non-respondents needed for the NRBA will be obtained from the NSLDS, from which the sample will have been selected.
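A minimal sketch of the planned respondent/non-respondent comparison follows, assuming hypothetical NSLDS frame variables.

```python
# Illustrative nonresponse bias check of the kind described above: compare
# frame characteristics, which NSLDS supplies for every sampled member,
# between respondents and nonrespondents. Variable names are hypothetical.
from statistics import mean

def nrba_summary(sample: list, responded: set, covariates: list) -> list:
    resp = [r for r in sample if r["id"] in responded]
    nonresp = [r for r in sample if r["id"] not in responded]
    rows = []
    for cov in covariates:
        r_mean = mean(r[cov] for r in resp)
        n_mean = mean(r[cov] for r in nonresp)
        rows.append((cov, r_mean, n_mean, r_mean - n_mean))
    return rows  # large respondent/nonrespondent gaps flag potential bias

# e.g. nrba_summary(sample, responded, ["age", "total_debt", "pell_ever"])
```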


Appendix 2 contains the RGEES questionnaire and Appendix 3 contains respondent contact materials.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or forms of information technology, e.g. permitting electronic submission of responses, and the basis for the decision of adopting this means of collection. Also describe any consideration given to using technology to reduce burden.


The pilot test will be conducted using a mailed pencil and paper instrument and a mobile-friendly web survey. A fully automated survey control system will monitor both the paper and the web survey instruments from initial mailing. High-volume mail preparation hardware and software will automate the mailing process, and hard copy questionnaires returned for processing will be receipted and managed by the automated receipt control and document management system. Once receipted, hard copy surveys will be keyed for data entry, then destroyed. Web survey data will reside within the control system SQL database in RTI’s Enhanced Security Network.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The purpose of this request for clearance is to pilot test a survey of graduates that will be used by postsecondary programs subject to the gainful employment regulation as part of an appeal of the earnings estimate used in the debt-to-earnings ratio. This is a new option provided to programs as part of the final regulations published in October 2014. This is the first time that these items have been tested for this purpose.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden. A small entity may be (1) a small business which is deemed to be one that is independently owned and operated and that is not dominant in its field of operation; (2) a small organization that is any not-for-profit enterprise that is independently owned and operated and is not dominant in its field; or (3) a small government jurisdiction, which is a government of a city, county, town, township, school district, or special district with a population of less than 50,000.


This information collection does not involve small businesses or other small entities.


6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


This information collection is for the pilot of the survey that is included in the regulations at 34 CFR 668.406 (D/E rates alternate earnings appeal). If this pilot is not performed, institutions will not be able to use the regulatory options available to them under the debt-to-earnings rate appeal.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or that unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


None of the special circumstances listed in the instructions for completing the supporting statement apply to the RGEES.


  8. As applicable, state that the Department has published the 60- and 30-day Federal Register notices as required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instruction and record keeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years – even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


ED is requesting and will review comments received from the public during the public comment period. This is the request for the 30-day public comment period.


During the 60-day comment period, 4 anonymous public comments were received. None offered alternatives to the burden estimates, although 2 stated that the estimate provided was insufficient. Three commenters expressed concern about obtaining survey information from graduated students. One commenter stated that ED already has this wage information and should work to stop Pell Grant abuse. One commenter suggested that any student who receives student financial aid should be required to report earnings directly to ED as a condition of the funding. One commenter expressed concern that some institutions may not have the technology necessary to perform this type of survey.


ED is grateful for the thoughtful comments that were provided and offers the following response to the anonymous commenters. This pilot survey project is required by regulation and will lead to the publication of the required survey form and survey standards. No institution will be required to administer the final survey; rather, the survey is one of two optional activities allowed under the regulations for institutions that wish to appeal the program-level debt-to-earnings ratios calculated by ED.


Also, ED will provide a survey platform to facilitate the administration of the survey, the processing of the data, and the analysis and reporting requirements that apply to programs electing to use the survey approach in an appeal. If an institution does not have the technological capacity to support the survey platform, it may be supported in a secure cloud environment or through another third-party vendor with appropriate privacy protections and agreements in place. Because this is a new program, we do not have firm data on the number of institutions or the actual time required for completion of this optional activity, and no alternate burden calculations were provided in these anonymous comments. ED has not changed its estimates of either participation or burden based on these comments.


ED received an additional 4 comments with very specific concerns and challenges. The full responses to these comments are attached in separate documents.


BACKGROUND

The initial survey and plans for its testing were developed with input from a Technical Review Panel (TRP) that met on December 2, 2014. The results of the first round of cognitive testing, which was conducted in January and February 2015, were shared with a second panel of earnings experts convened by the National Institute of Statistical Sciences (NISS) on February 27, 2015. The most significant recommendation from the first round of testing was to combine all income questions into a single multi-part question in order to help respondents correctly distinguish sources of income without double counting. Neither the time burden imposed by the survey nor the sensitivity of the items was reported as a concern by participants. The most significant recommendation from the NISS panel was to revamp the questions to help respondents understand the need to report income from all sources and to help them recall their earnings information from two years prior to data collection. To improve recall, the revised survey asked respondents to focus first on the job they held the longest (per approaches tested previously for the CPS), and then on all other jobs. This approach was designed to help respondents think about their actual employment situation in 2013, with the goal of eliciting total earnings without counting earnings from the longest job twice. A second round of cognitive testing, conducted in April 2015, found that respondents had a good understanding of the intent of the questions and that double counting was minimized in the new format. A report covering both rounds of cognitive testing is included in Appendix 4.


RGEES Technical Review Panel Members (December 2, 2014 – Washington, DC)


Geri Anderson

Special Assistant to the President for External Affairs

Aims Community College

5401 West 20th Street

Greeley, CO 80632

Angela Bell

Senior Executive Director of Research and Policy Analysis

University System of Georgia

270 Washington Street SW

Atlanta, GA 30334

Jennifer Blum

Senior Vice President, External Relations & Public Policy

Laureate Education, Incorporated

1500 K Street NW, Suite 250

Washington, DC 20005

Kathy Booth

Senior Research Associate

WestEd

300 Lakeside Drive, 25th Floor

Oakland, CA 94612


Patrick Crane

Project Manager

West Virginia Community & Technical College System

1018 Kanawha Boulevard East, Suite 700

Charleston, WV 25301

Mark DeFusco

Senior Research Associate

University of Southern California

Rossier School of Education

Waite Phillips Hall WPH 701B

Los Angeles, CA 90089-4037


Christine Fuglestad

Director of Government Affairs

Capella University

225 South 6th Street, 9th Floor

5019 Wentworth Avenue

Minneapolis, MN 55419

Alfred Gottschalck

Chief, Labor Force Statistics Branch

US Census Bureau

4600 Silver Hill Road

Washington, DC 20233


KC Greaney

Director, Office of Institutional Research

Santa Rosa Junior College

680 Sonoma Mountain Parkway

Richard Call Bldg., Annex

Petaluma, CA 94954

Stephen Haworth

Senior Manager, Reporting & Policy Research

DeVry Education Group

3005 Highland Parkway

Downers Grove, IL 60515


G. Scott Jenkins

Vice Provost for Academic Affairs and Undergraduate Programs

North Carolina A&T State University

1601 East Market Street

Dowdy Building 318

Greensboro, NC 27411


Anthony Jones

Graduate Faculty

Appalachian State University

151 College Street, Suite 217-B

Boone, NC 28608

John Kolotos

Policy Analyst

U.S. Department of Education

1990 K Street NW

Washington, DC 20202

Tod Massa

Director, Policy Research and Data Warehousing

State Council of Higher Education for Virginia

101 North 14th Street

Richmond, VA 23219

Heather McKay

Director, Education and Employment Research Center

Rutgers University

94 Rockafeller Road

Piscataway, NJ 08854-8054


Charles Nelson

Assistant Division Chief, Economic Characteristics

Census Bureau

4600 Silver Hill Road

Washington, DC 20233


Kent Phillipe (unable to attend)

Associate Vice President, Research & Student Success

American Association of Community Colleges

One Dupont Circle NW

Suite 410

Washington, DC 20036


Anne Polivka

Supervisory Research Economist, Employment Research Chief

Bureau of Labor Statistics

2 Massachusetts Avenue NE, Suite 4945

Washington, DC 20212


Casey Sacks

Manager

Colorado Community College System

9101 East Lowry Boulevard

Denver, CO 80230

Rajat Shah

Senior Vice President, Student Financial Services

Lincoln Technical Institute

200 Executive Drive, Suite 340

West Orange, NJ 07052

Christine Tracy

Director of Research

Association for Private Sector Colleges and Universities

1100 Connecticut Avenue NW

Suite 900

Washington, DC 20036


Christina Whitfield

Vice Chancellor

Kentucky Community & Technical College System

300 North Main Street

Versailles, KY 40383

Paul Umbach

Professor, Higher Education and Educational Evaluation and Policy Analysis

Department of Leadership, Policy, and Adult and Higher Education

North Carolina State University

300 Poe Hall, Box 7801

Raleigh, NC 27695


National Institute of Statistical Sciences Panel Members (February 27, 2015 – Washington, DC)


Katharine Abraham

Professor of Economics and Survey Methodology

University of Maryland

1218 LeFrak Hall

College Park, MD 20742


Michael Larsen

Associate Professor

George Washington University

Rome Hall

801 22nd Street NW

Washington, DC 20052


Emilda Rivers

Program Director, Human Resources Statistics

National Center for Science and Engineering Statistics

National Science Foundation

4201 Wilson Boulevard

Arlington, VA 22230


Joy Edington

ESSIN Research Analyst/Statistician

National Institute of Statistical Sciences

1776 Eye Street NW

Washington, DC 20006


Martin Frankel

Professor of Statistics

The City University of New York

Baruch College

One Bernard Baruch Way

(55 Lexington Avenue, at 24th Street)

New York, NY 10010


Clyde Tucker

Principal Researcher, Education Program

American Institutes for Research

1000 Thomas Jefferson Street NW

Washington, DC 20007


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees with meaningful justification.


Survey respondents will be offered a $25 incentive for completing the paper or web survey during the pilot test. Once receipt of the survey is confirmed, respondents may receive the payment either by check, which typically takes up to 4 weeks to be delivered, or by PayPal, sent immediately to an email address provided in the survey. The choice of payment method will be presented at the end of the survey, where participants who choose PayPal will be asked to provide an email address.


Use of incentives for the pilot survey is recommended for several reasons. First, as overall response rates in survey research are declining, achieving the desired 60 percent response rate will be especially challenging because of the sensitive nature of the earnings questions. Second, the pilot survey has an abbreviated data collection timeframe in order to release the final survey in time to be available in the spring, when institutions will be able to begin appealing FSA's debt-to-earnings calculations. Respondents will have the option to decline the incentive if desired.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. If personally identifiable information (PII) is being collected, a Privacy Act statement should be included on the instrument. Please provide a citation for the Systems of Record Notice and the date a Privacy Impact Assessment was completed as indicated on the IC Data Form. A confidentiality statement with a legal citation that authorizes the pledge of confidentiality should be provided.[4] If the collection is subject to the Privacy Act, the Privacy Act statement is deemed sufficient with respect to confidentiality. If there is no expectation of confidentiality, simply state that the Department makes no pledge about the confidentiality of the data.


Respondents will be informed that their responses to this data collection will be used only for statistical purposes and that the results of this study will summarize findings across the sample and will not associate responses with a specific individual. The name of the respondent will not be maintained with their responses. Additionally, respondents will be informed that no identifying information will be shared outside the study team (Privacy Act of 1974, 5 U.S.C. § 552a; Family Educational Rights and Privacy Act of 1974, 20 U.S.C. § 1232g).


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. The justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The RGEES pilot test is a voluntary survey, and no persons are required to respond to it. In addition, respondents may decline to answer any question in the survey. Respondents will be informed of the voluntary nature of the survey in the cover letter that accompanies the questionnaire, as well as on the questionnaire itself. At the same time, survey items verifying personal identity and asking about earnings may be considered sensitive by some respondents. Thus, all four survey items on the RGEES may be considered sensitive.


The first question, which asks the respondent to verify his or her identity, is needed because earnings are being collected from a defined cohort of graduates negotiated between ED and the program. The mean and median earnings measures that will be submitted to ED as part of the appeals process must be based on actual earnings from the universe of cohort members responding to the survey.


The next three questions ask about sources (from an employer, from self-employment, other) and amounts of earnings during the reference year. These questions are drawn from parallel items in the March CPS, which is developed and fielded jointly by Census and BLS and used for official poverty statistics, and in the NLSY, fielded by BLS, as described in Appendix 1. These items are needed in order to calculate the mean and median of total earnings (and the number of true zeros) for the cohort during the reference period, to be submitted to ED as part of the alternative earnings appeal process.
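A minimal sketch of those cohort statistics follows, assuming illustrative field names for the three earnings sources.

```python
# Sketch of the cohort statistics described above: mean and median of total
# earnings across the three sources, keeping true zeros in the distribution
# rather than dropping them. Field names are illustrative assumptions.
from statistics import mean, median

def cohort_earnings_stats(responses: list) -> dict:
    totals = [r.get("wages", 0) + r.get("self_employment", 0)
              + r.get("other_earnings", 0) for r in responses]
    return {"n": len(totals),
            "true_zeros": sum(t == 0 for t in totals),
            "mean_earnings": mean(totals),
            "median_earnings": median(totals)}
```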


12. Provide estimates of the hour burden of the collection of information. The statement should:


  • Indicate the number of respondents by affected public type (federal government, individuals or households, private sector – businesses or other for-profit, private sector – not-for-profit institutions, farms, state, local or tribal governments), frequency of response, annual hour burden, and an explanation of how the burden was estimated, including identification of burden type: recordkeeping, reporting or third party disclosure. All narrative should be included in item 12. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in the ROCIS IC Burden Analysis Table. (The table should at minimum include Respondent types, IC activity, Respondent and Responses, Hours/Response, and Total Hours)

  • Provide estimates of annualized cost to respondents of the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.


NCES estimates a response rate of 60 percent and about 5 minutes per respondent to complete the pilot test survey; a worked check of the resulting burden figures follows the table below.


Estimated respondent burden for pilot study

Activity         Number of sampled   Number of   Minutes per   Maximum total
                 respondents         responses   respondent    burden hours
Pilot testing    3,400               2,040       5             170
Study Total      3,400               2,040                     170


13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14.)


  • The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and acquiring and maintaining record storage facilities.

  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

  • Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government or (4) as part of customary and usual business or private practices. Also, these estimates should not include the hourly costs (i.e., the monetization of the hours) captured above in Item 12


Total Annualized Capital/Startup Cost: $0

Total Annual Costs (O&M): $0

____________________

Total Annualized Costs Requested: $0


There are no recordkeeping requirements and no costs to respondents beyond the time to participate.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.


The cost to the federal government for developing the survey and conducting the cognitive and pilot testing is $555,570, which includes contractor staff time, incentives, and project materials.


15. Explain the reasons for any program changes or adjustments. Generally, adjustments in burden result from re-estimating burden and/or from economic phenomenon outside of an agency’s control (e.g., correcting a burden estimate or an organic increase in the size of the reporting universe). Program changes result from a deliberate action that materially changes a collection of information and generally are result of new statute or an agency action (e.g., changing a form, revising regulations, redefining the respondent universe, etc.). Burden changes should be disaggregated by type of change (i.e., adjustment, program change due to new statute, and/or program change due to agency discretion), type of collection (new, revision, extension, reinstatement with change, reinstatement without change) and include totals for changes in burden hours, responses and costs (if applicable).


This is a request for a new information collection, which is necessary to meet regulatory requirements in 34 CFR 668.406. The new collection is a program change due to agency discretion. ED is requesting an increase of 170 burden hours, reflecting 2,040 responses.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


The pilot study will begin in early September 2015 and end in late October 2015. All RGEES development activities to analyze the data and finalize the survey need to be completed by the middle of November 2015. If needed based on the pilot test results and public comment, the RGEES will be revised prior to posting on ED's website by December 15, 2015.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The OMB authorization number and expiration date will be displayed on the paper and electronic survey forms.


18. Explain each exception to the certification statement identified in the Certification of Paperwork Reduction Act.


There are no exceptions to the certification statement.


[1] Please limit pasted text to no more than 3 paragraphs.

[2] For the pilot study, graduates from programs in Puerto Rico will be excluded, since a Spanish translation of the survey is not yet available.

[3] Categories include all six-digit CIP code programs within the referenced four-digit CIP code.

[4] Requests for this information are in accordance with the following ED and OMB policies: Privacy Act of 1974; OMB Circular A-108, Privacy Act Implementation: Guidelines and Responsibilities; OMB Circular A-130, Appendix I: Federal Agency Responsibilities for Maintaining Records About Individuals; OMB M-03-22, OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002; OMB M-06-15, Safeguarding Personally Identifiable Information; OM:6-104, Privacy Act of 1974 (Collection, Use and Protection of Personally Identifiable Information).



