
REENTRY EMPLOYMENT OPPORTUNITIES (REO) EVALUATION

OMB Control Number 1290-0NEW

OMB Expiration Date:




SUPPORTING STATEMENT FOR

REENTRY EMPLOYMENT OPPORTUNITIES (REO) EVALUATION


OMB CONTROL NO. 1290-0NEW


This information collection request (ICR) seeks OMB clearance for a new data collection for the REO evaluation.


The Chief Evaluation Office (CEO) in the U.S. Department of Labor (DOL) is undertaking the Reentry Employment Opportunities (REO) Evaluation. The overall aim of the evaluation is to determine whether the REO programs improve employment outcomes and workforce readiness for young adults and adults with previous involvement in the criminal justice system. CEO contracted with Mathematica and its subcontractor, Social Policy Research Associates (SPR), to conduct this evaluation. The evaluation will include an implementation study and an impact study. This package requests clearance for four data collection instruments, which we include as supporting documents:


1. Grantee survey1

2. Master site visit interview protocol2

3. Participant focus group protocol

4. Employer focus group protocol


The four data collection instruments are for the implementation study, with the second through fourth to be used during implementation study site visits. The Health Media Lab Institutional Review Board has reviewed and approved these data collection instruments. On October 22, 2019, OMB approved DOL’s request for the collection of a grantee survey from up to 97 grantees as well as the collection of impact study participants’ baseline and contact information (see OMB Approval No. 1290-0026). The current OMB package requests clearance for data collection activities that will be conducted with the REO grantees3 that received grants in 2018 and 2019. These activities include collecting the grantee survey from REO grantees that received grants in 2019 and collecting semistructured interview and focus group data from staff and study participants at grantees that received grants in 2018 and 2019. These data will be used to describe program operations.


  A. JUSTIFICATION


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


This evaluation is authorized by Section 169 of the Workforce Innovation and Opportunity Act (WIOA), which authorizes research and evaluations to improve the management and effectiveness of workforce programs and activities such as the REO programs. CEO undertakes a learning agenda process each year to identify departmental priorities for program evaluations. This evaluation was prioritized as part of that process in fiscal year 2017. Division H, Title I, Section 107 of Public Law 115-31, the Consolidated Appropriations Act of 2017, authorizes the secretary of labor to reserve not more than 0.75 percent from special budget accounts for transfer to and use by CEO for departmental program evaluation. Further, 29 USC 3224(a)(1) authorizes the secretary of labor to conduct ongoing evaluation of programs and activities to improve the management and effectiveness of these programs.

Overview of the REO evaluation

Understanding the effectiveness of the DOL-funded programs serving young adults and adults with justice system involvement requires a rigorous evaluation. CEO has contracted with Mathematica and its subcontractor, SPR, to conduct two studies: an implementation study and an impact study. The implementation study will provide an overall assessment of the recent REO initiative investments and provide context for the impact study, including a retrospective view of how COVID-19 influenced grantees’ operations. The impact study will compare a group of REO program participants to a comparison group of non-REO program participants to build the evidence base about the effectiveness of employment services for young adults and adults with current or former involvement in the justice system. The data collection efforts that are part of this clearance request will involve grantees and subgrantees that received grants in 2018 and 2019.

The REO implementation study will address four key research questions:

  1. How were programs implemented, and what factors were associated with implementation?

  2. What are the variations in the model, structure, partnerships, and services of the REO grants?

  3. How did implementation vary by organization type (such as an intermediary organization that operates in more than one state or a community-based organization) and target population (young adult or adult)?

  4. How do participants experience the program, and what elements do they find most influential?

The REO impact study will focus on REO intermediary and community-based organization (CBO) grantees in selected states and will compare the outcomes of their participants to those of a comparison group of similar individuals who received Wagner-Peyser employment services but did not receive REO services. The impact study will address three key research questions:

  1. What impact does access to REO services have on participants’ employment, earnings, and recidivism outcomes?

  2. To what extent do impacts from access to REO services vary across selected subpopulations, including age group and offense history?

  3. What are the program components or services associated with positive participant outcomes?

Overview of the data collection

Understanding the implementation and effectiveness of the REO grants requires collecting data from multiple sources. For the implementation study, data collection will include key informant interviews with grantee, subgrantee, and partner staff; individual-level program data; a grantee survey; and focus groups with employers and REO program participants. The grantee survey of 2019 grantees will begin in November 2021. Interviews and focus groups will be conducted during implementation study site visits to participating grantees in September 2021. For the impact study, the study team will collect outcome data from administrative earnings records and criminal justice system records for all impact study participants. The data collection activities in this clearance request include (1) the grantee survey for the grantees that received grants in 2019 and were not previously included in the first clearance package submitted for this study (see OMB Approval No. 1290-0026); (2) a master site visit interview protocol for interviews with grant administrators, frontline staff, partner staff administrators, and intermediary grant administrators; (3) a participant focus group protocol; and (4) an employer focus group protocol.

Grantee survey. As part of the implementation study, the study team will field an electronic survey to up to 45 grantees to obtain information about the 2019 REO grantees’ approach to project management, recruitment and outreach, and service delivery. This data collection request is in addition to the grantee survey data collection that was approved under OMB Approval No. 1290-0026. The survey instrument, which is the same as that approved in the prior OMB submission, includes a set of common questions designed to yield insights about variations across grantees and grant programs and to contextualize data from the implementation and impact studies.

Master site visit interview protocol. As part of the implementation study, which will involve about 28 sites,4 the study team will conduct semistructured interviews with grant administrators, intermediary grant administrators, frontline staff, and partner staff administrators to understand how the program has been developed, managed, and delivered. These site visits will occur in 2021.

Participant focus group protocol. As part of the implementation study, the study team will conduct a participant focus group at each of the 28 sites to gather information from program participants. These focus groups will ask about participants’ reasons for enrolling, impressions of the program, and the extent to which the program has helped them prepare for employment.

Employer focus group protocol. As part of the implementation study, the study team will conduct an employer focus group at each of the 28 sites to gather information from employers who partnered with sites. These focus groups will explore how the grantees are meeting the needs of employers.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


DOL will use the data collected through the activities summarized in this request to assess participant outcomes related to employment, earnings, and criminal justice involvement, as well as to systematically describe the REO grantees, including their organization, administration and management, services, delivery structures, and processes; the participants served; and common implementation challenges. These data and the study team’s analyses will be an important source of information for DOL and other policymakers to guide management decisions and plan for future grant programs. Without the data collection activities described in this request, a comprehensive evaluation of the REO grant programs cannot occur.

The instruments with which we will collect data for the evaluation and their proposed uses are summarized in Table A.1.

Table A.1. Data collection instruments included in the request

1. Grantee survey. This survey will serve to gather common information about organizational settings and intervention characteristics for REO grantees that received grant awards during 2019; if this request receives clearance, the collected data will supplement the data collected under the evaluation’s prior clearance (see OMB Approval No. 1290-0026).

2. Master site visit interview protocol. The implementation study site visit interview protocol will serve to collect information about the implementation of REO grantee programs, which help people with justice system involvement reenter the community and connect to education, training, and work. The information will also help contextualize impact study findings.

3. Participant focus group protocol. The participant focus group protocol will serve to learn more about participants’ reasons for enrolling, their impressions of the program, and the extent to which the program has helped them prepare for employment. Although these focus groups might not be representative of all program participants, the discussions will contribute to answering research questions about how participants experience the program and which elements they find most influential. The information will also help contextualize impact study findings.

4. Employer focus group protocol. The employer focus group protocol will serve to gather information from employers about how the grantees are meeting their needs. These focus groups will include all employer partners and will allow for an exploration of variations in partnerships and services. The information will also help contextualize impact study findings.

The final implementation study and impact study reports will address all of the evaluation’s research questions and synthesize data across sources.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.


Whenever possible, the data collection efforts will use advanced technology to reduce burden on study participants and on staff at participating agencies. For example, the grantee survey will be available electronically so respondents can complete the questionnaire on their own schedule, in multiple sittings, and without needing to return any forms by mail. The semistructured interviews and focus groups will not require the use of technology, but—when respondents allow—they will be digitally recorded. This will minimize the time spent with respondents and enable study team members to take shorthand notes with the audio recording as backup.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item A.2 above.


The data being collected for the REO evaluation are not otherwise available from existing sources. Before administering the grantee survey or visiting sites, the study team will gather pertinent data from the grantee applications and publicly available sources. The study team will only request new information during the implementation study site visit interviews. Respondents will not be asked for the same information more than once.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


Employer partners will participate in a 60-minute focus group as part of the data collection effort for the implementation study. At each of the 28 sites, we expect an average of five employer partners to participate. Some of these employers might be from small businesses. To minimize burden on any small businesses that participate in the focus group, decisions about the timing and locations of the focus groups will consider employers’ schedules. As with all data collection activities, we will remind participants that their participation is completely voluntary.


6. Describe the consequence to federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


The evaluation represents an important opportunity for CEO to add to the growing body of knowledge about what works to serve participants with prior justice system involvement. Not collecting data on grantees through the grantee survey, implementation study site visit interviews, and focus groups will limit the study team’s ability to fully understand the context of the REO programs and place the evaluation’s findings into that context.


Without the grantee survey data, the study team would be less able to identify key similarities and differences in grantees’ organizational settings and characteristics.


Interviewing each of the different interview respondents (grant administrators, frontline staff, partner staff administrators, intermediary grant administrators, employer partners, and program participants) will yield information about each respondent’s distinctive experiences with the REO program. Without the implementation study data from the interview and focus group protocols, the study team would be less able to identify strategies that grantees use to provide services to people with justice system involvement. In addition, the study team would be less able to identify potential reasons for differences across grantees in impact estimates, if such differences emerge.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


  • requiring respondents to report information to the agency more often than quarterly;


  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;


  • requiring respondents to submit more than an original and two copies of any document;


  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;


  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;


  • requiring the use of statistical data classification that has not been reviewed and approved by OMB;


  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or


  • requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


No special circumstances apply to this data collection.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years -- even if the collection-of-information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


1. Federal Register announcement

The 60-day notice soliciting public comments was published in the Federal Register on March 6, 2019 (84 FR 8117). One non-substantive comment was submitted.

2. Consultation outside of the agency

The study team is coordinating consultation on the research design and data needs, which involves discussions with experts. Table A.2 lists the people we consulted in preparing this submission to OMB.

Table A.2. People consulted for the REO evaluation

Mathematica
P.O. Box 2393
Princeton, NJ 08543-2393
(609) 799-3535

  Ms. Jeanne Bellotti, Director, Employment Research, (609) 275-2243
  Dr. Jillian Berk, Deputy director, (202) 264-3449
  Dr. Kevin Booker, Senior researcher, (202) 484-4838
  Dr. Karen Needels, Senior researcher, (609) 750-4043
  Dr. Ankita Patnaik, Researcher, (202) 838-3576
  Dr. Jillian Stein, Survey researcher, (609) 716-4395

Social Policy Research Associates
1330 Broadway, Suite 1426
Oakland, CA 94612
(510) 763-1499

  Dr. Andrew Wiegand, President, CEO, and principal, (510) 763-1499, ext. 636
  Mr. Christian Geckeler, Senior associate, (510) 788-2461


9. Explain any decision to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.


No payments will be made to people completing the grantee survey, people who participate in semistructured interviews, or employers who participate in focus groups during the implementation study site visits. Program participants will receive a $20 gift card as an incentive to participate in the focus groups that will take place during the implementation study site visits.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Personally identifiable information collected will be kept private to the extent permitted by law. The study team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, the study is being conducted in accordance with all relevant regulations and requirements. The study team secures personally identifiable information and other sensitive project information and strictly controls access on a need-to-know basis. During data collection, the study team will take steps to protect data and ensure that respondents understand the extent to which information can be kept private.


Implementation study site visit interviews and focus groups will be conducted in private areas, such as offices or conference rooms. At the start of each interview and focus group, the study team will read a statement to assure respondents of privacy and ask for their verbal consent to participate in the interview. This statement is available at the top of the master site visit protocol and the focus group protocols. REO participants will also be asked to sign a physical consent form, which the study team will read aloud, to ensure participants understand the form. Additionally, any data elements used for recruitment of focus group participants, such as name and telephone number, will be destroyed after completion of the focus groups. Interview transcripts will be destroyed at the end of the contract, and resultant reports from qualitative coding of the data will not identify respondents by name.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The study team will not ask sensitive questions directly of study participants but will collect administrative criminal justice data. As described previously, all sample members will receive assurances of privacy before they complete study enrollment forms.


12. Provide estimates of the hour burden of the collection of information. The statement should:


  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.


  • If this request for approval covers more than one form, provide separate hour burden estimates for each form.


  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.


Table A.3 presents, for the implementation study grantee survey, the site visit semistructured interviews, and the focus groups, the expected annual number of respondents, the average number of responses per respondent, the average burden hours per response, the estimated annual burden hours, the hourly wage rate assumed for respondents, and the total annual monetized burden hours. All the activities covered by this request are annualized over three years. A brief computational sketch following the table footnotes illustrates how the annualized figures are derived from these assumptions. Here, we summarize the burden estimates, rounded to the nearest whole number, for each of the data collection activities:


  1. Grantee survey. As part of the implementation study, the study team will administer the grantee survey to about 45 grantees. We estimate each grantee respondent will spend about 20 minutes on the survey.

  2. Master site visit interview protocol. As part of the implementation study, which will be conducted in about 28 sites, the study team will conduct semistructured interviews with grant administrators, intermediary grant administrators, frontline staff, and partners to understand how the programs have been developed, managed, and delivered.

  3. Participant focus group protocol. As part of the implementation study, the study team will conduct a focus group at each of the 28 sites to gather information from participants. These focus groups will ask about participants’ reasons for enrolling, impressions of the program, and the extent to which the program has helped them prepare for employment. About 225 program participants will participate in the focus groups.

  4. Employer focus group protocol. As part of the implementation study, the study team will conduct a focus group at each of the 28 sites to gather information from employers. These focus groups will explore how the grantees are meeting the needs of employers. About 140 employer partners will participate in the focus groups.


Table A.3. Estimated Annualized Respondent Cost and Hour Burden


Activity | No. of Respondents (a) | No. of Responses per Respondent | Total Responses | Average Burden (Hours) | Total Burden (Hours) | Hourly Wage Rate (b) | Annual Monetized Burden Hours
Grantee survey | 15 | 1 | 15 | 0.33 | 5 | $36.13 | $181
Semistructured interview: Grant administrators (c) | 19 | 1 | 19 | 3 | 57 | $36.13 | $2,059
Semistructured interview: Frontline staff (c) | 261 | 1 | 261 | 1.5 | 392 | $21.82 | $8,553
Semistructured interview: Partner staff administrators (c) | 47 | 1 | 47 | 1 | 47 | $36.13 | $1,698
Semistructured interview: Intermediary grant administrators (c) | 6 | 1 | 6 | 1.5 | 9 | $36.13 | $325
Participant focus groups (d) | 75 | 1 | 75 | 1 | 75 | $7.25 | $544
Employer focus groups (e) | 47 | 1 | 47 | 1 | 47 | $60.81 | $2,858
Total | 470 | -- | -- | -- | 632 | -- | $16,218

Note: Numbers are rounded to the nearest whole number for all columns other than the average burden and hourly wage rate columns.

(a) All annual totals reflect a three-year clearance and study data collection period. Estimates are rounded to the nearest whole number.

(b) The average hourly wages were obtained from the U.S. Bureau of Labor Statistics, National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates, May 2020 (accessed at https://www.bls.gov/oes/current/oes_nat.htm on June 14, 2021). Estimates of administrators’ and managers’ wages are based on the average wages for “social and community service managers” ($36.13). Estimates of wages for frontline staff are based on the average wages for “miscellaneous community and social service specialists” ($21.82). Employer wages are estimated based on the average wages for “management occupations” ($60.81). Monetized estimates for participants were assumed to be the federal minimum wage of $7.25. The frontline staff row includes both REO program staff and partner staff.

(c) Assumes each visit will, on average, involve individual or group interviews with about 2 grant administrators, 5 partner staff administrators, and 28 frontline staff across grantees and partners. The team anticipates completing about 28 visits in total. The average burden time per response for the grant administrator interviews will be 3 hours. The average burden time per response for the frontline staff interviews will be 1.5 hours. The average burden time per response for the partner staff administrator interviews will be 1 hour. Additionally, the team anticipates conducting interviews with about 3 intermediary grant administrators in about 6 of the 28 visits. The intermediary grant administrator interviews will be about 1.5 hours, on average. For all types of staff, some meetings will be shorter, and some will be longer than the averages.

(d) Assumes each participant focus group will, on average, involve 8 people and will be conducted in about 28 sites. The average burden time per response will be 1 hour.

(e) Assumes each employer focus group will, on average, involve 5 people and will be conducted in about 28 sites. The average burden time per response will be 1 hour.
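
For readers who want to trace the arithmetic, the short sketch below reproduces the annualized figures in Table A.3 from the per-visit assumptions in footnotes (a) through (e) and the wage rates in footnote (b). It is an illustrative reconstruction only, not part of the evaluation's statistical methodology; in particular, the order of rounding (respondents first, then burden hours, then dollars) is an assumption we make because it matches the published totals.

```python
# Minimal sketch reproducing the annualized burden estimates in Table A.3
# from the assumptions in footnotes (a)-(e). Illustrative only.

VISITS = 28   # implementation study site visits (footnote c)
YEARS = 3     # clearance and data collection period (footnote a)

# activity: (respondents over three years, hours per response, hourly wage)
activities = {
    "Grantee survey":                                  (45,          0.33, 36.13),
    "Interview: Grant administrators":                 (2 * VISITS,  3.0,  36.13),
    "Interview: Frontline staff":                      (28 * VISITS, 1.5,  21.82),
    "Interview: Partner staff administrators":         (5 * VISITS,  1.0,  36.13),
    "Interview: Intermediary grant administrators":    (3 * 6,       1.5,  36.13),
    "Participant focus groups":                        (8 * VISITS,  1.0,  7.25),
    "Employer focus groups":                           (5 * VISITS,  1.0,  60.81),
}

total_respondents = total_hours = total_cost = 0
for name, (respondents, hours_per_response, wage) in activities.items():
    annual_respondents = round(respondents / YEARS)           # e.g., 784 / 3 -> 261
    annual_hours = round(annual_respondents * hours_per_response)
    annual_cost = round(annual_hours * wage)
    total_respondents += annual_respondents
    total_hours += annual_hours
    total_cost += annual_cost
    print(f"{name}: {annual_respondents} respondents, "
          f"{annual_hours} burden hours, ${annual_cost:,}")

print(f"Total: {total_respondents} respondents, {total_hours} hours, ${total_cost:,}")
```

Running the sketch reproduces the annual totals shown in the table: 470 respondents, 632 burden hours, and $16,218 in monetized burden.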



13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).


  • The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of service component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.


  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.


  • Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There will be no direct costs to respondents for the REO evaluation other than their time.


14. Provide estimates of the annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 into a single table.

The total annualized cost to the federal government is $1,620,462. Costs result from the following categories:

  1. The estimated cost to the federal government for the contractor to carry out this study is $4,724,978 for all data collection, administrative records collection, and analysis and reporting. Annualized over three years of data collection, this comes to $1,574,993 per year.

  2. The annual cost DOL will bear for federal technical staff to oversee the contract is estimated to be $45,469. We expect the annual level of effort to perform these duties will require 200 hours for one Washington, DC-based Federal GS-14, Step 2 employee earning $65.29 per hour, and 200 hours for one Washington, DC-based Federal GS-15, Step 2 employee earning $76.80 per hour. (See Office of Personnel Management 2020 Annual Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2020/DCB.pdf.) To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6. Thus, [(200 hours x $65.29) + (200 hours x $76.80)] x 1.6 = $45,469, as shown in the sketch below.
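
The sketch below is a minimal restatement of this arithmetic, assuming only the figures given above (the contractor total, the three-year annualization period, the GS-14 and GS-15 hourly rates, 200 oversight hours each, and the 1.6 overhead multiplier); it is illustrative rather than a separate cost methodology.

```python
# Sketch reproducing the annualized cost to the federal government (item 14).
# All inputs come from the text above; rounding is to the nearest dollar.

CONTRACT_TOTAL = 4_724_978        # contractor cost over the full study period
YEARS = 3

contract_annual = round(CONTRACT_TOTAL / YEARS)               # $1,574,993

gs14_hourly, gs15_hourly = 65.29, 76.80                       # 2020 DC locality rates
oversight_hours = 200                                         # hours per staff member per year
overhead = 1.6                                                # fringe benefits and overhead factor

federal_staff_annual = round(
    (oversight_hours * gs14_hourly + oversight_hours * gs15_hourly) * overhead
)                                                             # $45,469

print(f"Contractor (annualized): ${contract_annual:,}")
print(f"Federal staff oversight: ${federal_staff_annual:,}")
print(f"Total annualized cost:   ${contract_annual + federal_staff_annual:,}")  # $1,620,462
```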



15. Explain the reasons for any program changes or adjustments.


This is a new information collection.


16. For collections of information whose results will be published, outline plans for tabulations, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


The evaluation plan includes an implementation study report and an impact study final report, both in the last year of the project.


Implementation study report. The study team will complete a report describing the findings from the implementation study. This report will document how sites were selected for the evaluation, as well as the characteristics of sites that participated. The report will also discuss the characteristics of the program participants, the flow of participants through the programs, the delivery of services, participation rates, and any challenges to serving participants.


Impact study final report. The study team also will complete the final report documenting impacts of access to REO grant services on participants’ outcomes. Likely outcomes will include employment and earnings, and criminal justice involvement. This report will also examine the effects for key subgroups and present an analysis of the association between program components and positive participant outcomes.


The timing of the data collection and the analysis and reporting is in flux due to the pandemic. However, we estimate that data collection will occur in 2021 and that analysis and reporting will occur in 2022 and 2023.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The OMB approval number and expiration date will be displayed or cited on all forms completed as part of the data collection.


18. Explain each exception to the certification statement.


No exceptions are necessary for this information collection.


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS.


1 DOL received clearance to collect the grantee survey from 97 grantees through an initial request to OMB. This package requests approval to collect the grantee survey from additional grantees.

2 Semistructured interviews will be conducted with grant administrators, frontline staff, and partner staff administrators.

3 A subset of these REO grants was awarded to intermediary organizations that funded subgrantees for delivering REO services to participants, while other grants were awarded to community-based organizations that are providing REO services directly to participants. We use the term “site” to refer to either a direct community-based organization grantee or a subgrantee of an intermediary organization for a particular grant award.

4 See footnote 3.


