Evaluation of the Department of Housing and Urban Development's Office of University Partnerships' University Programs

OMB: 2528-0279


Part B: Collection of Information Employing Statistical Methods


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The target populations for both the OUP web survey and the telephone interviews are current and past recipients of grants under the four OUP grant programs described in Part A. These recipients are all two- or four-year institutions of higher learning that received OUP grants between 2005 and 2008. There are 102 institutional campuses that received at least one OUP grant during that period, representing 91 institutions across the country. Abt proposes to select a sample of 67 grantee campuses from this pool of eligible campuses. The sample will be drawn using equal probability systematic sampling within program-type strata (see question 2).


Exhibit B-1 presents this information for all four programs as a whole, as well as by OUP program type.


Exhibit B-1. OUP Grantees

2005–2008                      All    ANNHIAC    HBCU    HSIAC    TCUP
Total grants                   133         17      53       41      22
Total unique campuses          102         13      35       37      17
Total unique institutions       91          8      35       31      17


Based on the contractor’s experience recruiting grantees to participate in studies of this type, Abt anticipates making initial mail and telephone contact with 75 campuses in order to obtain 67 participants. The expectation is that roughly 90 percent of the grantees contacted will agree to participate, given that the time commitment for the OUP web survey and telephone interview is relatively low. The main challenge Abt anticipates is finding one or more respondents at each grantee institution who are familiar with the grant, since staff turnover is likely; Abt may encounter grantee sites at which no one with knowledge of the grants received between 2005 and 2008 still works at the institution. For grantees that are not able to participate in the study, Abt will select a substitute grantee from the same program type, which is also the same sampling stratum.


This is a new collection.


2. Describe the procedures for the collection, including:

* Statistical methodology for stratification and sample selection;

* Estimation procedure;

* Degree of accuracy needed for the purpose described in the justification;

* Unusual problems requiring specialized sampling procedures; and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


As stated earlier, the evaluation sampling frame will be the 102 institutional campuses that received at least one OUP grant between 2005 and 2008. These 102 campuses represent 91 institutions across the country. Abt will conduct its analysis at the campus level rather than the institution level because the campuses typically function independently. For example, although the University of Alaska–Fairbanks, Interior-Aleutians Campus and the University of Alaska–Fairbanks, Kuskokwim Campus are both part of the University of Alaska system, the grants each campus received were unrelated and completely independent. In total, the 102 campuses received 133 grants during the target time period. Exhibit B-1 above presents this information for all four programs as a whole, as well as by OUP program type.


From the sample, Abt will collect information on outputs and outcomes related to two randomly selected activities for each sampled grantee. Collecting data on all of a grantee’s activities would be time-consuming and would impose a substantial response burden. To simplify data collection, activities will be randomly selected for each grantee from the activities listed in the OUP database (the database contains a record of the activities each grantee proposed in its initial grant application). This approach fits the study’s exploratory nature and its multiple goals of documenting outcomes achieved, documenting the outcome data available, and understanding challenges faced in implementing grant activities. To counterbalance the narrowing of the study’s focus, Abt will seek to understand how the randomly selected activities relate to previously funded activities. If activities are a continuation of previously funded OUP activities, Abt will record all available outcome data and report on any long-term trends associated with larger projects that encompass activities funded by OUP between 2005 and 2008. For example, if four housing units built with a 2006 HBCU grant are part of a larger affordable housing initiative begun in 2002, Abt will capture the outcomes for the larger initiative in addition to the outcomes achieved under the specific 2006 OUP grant.


Additionally, if either or both of the selected activities is one of many activities included within a larger multi-purpose center or university facility, the telephone interview will capture data on all activities related to the center or facility. The evaluation team decided to take this approach because the pre-test interviews showed that grantees perceive their “center” or “facility” efforts as a single, complete project. Asking about one activity and not the rest of a center’s activities could be frustrating or confusing to the grantee. A more comprehensive approach also gives the evaluators a more complete understanding of the interconnectedness of grant activities.


To accurately capture independent activities (i.e., those not included as part of a multi-purpose center or campus facility) as well as co-located activities, we have developed two versions of the telephone instrument, Format A and Format B. Format A will be used to gather information about the independent activities. Format B will be used when one of a grantee’s randomly selected activities was part of a multi-purpose center or campus facility. Format B is designed to provide a coherent way for grantees to describe all of the interconnected activities they implemented as part of a multi-purpose center or campus facility. Abt will use grantees’ responses to web survey questions 2.1–2.3 (Appendix 4) to determine which question format will be used for each activity.


Statistical Methodology for Stratification and Sample Selection

Selection of Grantees

Given available resources, Abt plans to conduct telephone interviews with 67 campuses that received OUP funds between 2005 and 2008. Exhibit B-2 illustrates the expected distribution of grantees among OUP programs.


Exhibit B-2. Sampling Design

2005–2008                                      All    ANNHIAC    HBCU    HSIAC    TCUP
Total unique campuses                          102         13      35       37      17
Sample: web survey and telephone interview      67          9      23       24      11



Abt will select a stratified simple random sample of 67 campuses from the population of 102 campuses. Specifically, Abt will stratify by program type (ANNHIAC, HBCU, HSIAC, and TCUP) and allocate the sample to each stratum in proportion to the number of grantees in that stratum. An equal probability systematic sample will then be selected within each stratum from the list of grantees, after sorting the list by number of planned activities and earliest grant year (Exhibit B-3). This sampling approach will help ensure that grantee characteristics expected to be key to the analysis are represented.


Exhibit B-3. Selection Criteria for the Telephone Interviews

Program                        OUP’s four university programs: ANNHIAC, HBCU, HSIAC, and TCUP.
Number of planned activities   Ensures that grants of varying complexity (defined by the number of planned grant activities) are represented.
Earliest grant year            Ensures that campuses with grants from 2005 are included, so that the evaluation can examine whether projects achieve long-term outcomes.
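For illustration, the following minimal Python sketch shows one way the proportional allocation and within-stratum systematic selection could be carried out. The campus records and field names are hypothetical placeholders for the OUP grants database, and largest-remainder rounding is an assumption used here to resolve fractional allocations; under that assumption the allocation reproduces the sample sizes shown in Exhibit B-2.

```python
import math
import random

def allocate_proportionally(stratum_sizes, total_sample):
    """Allocate total_sample across strata in proportion to stratum size,
    using largest-remainder rounding (an assumption) to resolve fractions."""
    frame_total = sum(stratum_sizes.values())
    raw = {s: total_sample * n / frame_total for s, n in stratum_sizes.items()}
    alloc = {s: math.floor(x) for s, x in raw.items()}
    leftover = total_sample - sum(alloc.values())
    for s in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:leftover]:
        alloc[s] += 1
    return alloc

def systematic_sample(campuses, n, rng=random):
    """Equal probability systematic sample of n campuses from one stratum,
    sorted by number of planned activities and earliest grant year
    (field names are hypothetical)."""
    ordered = sorted(campuses,
                     key=lambda c: (c["planned_activities"], c["earliest_grant_year"]))
    interval = len(ordered) / n
    start = rng.random() * interval
    return [ordered[min(int(start + i * interval), len(ordered) - 1)]
            for i in range(n)]

# Stratum sizes from Exhibit B-2; the result matches the sample row (9, 23, 24, 11).
print(allocate_proportionally({"ANNHIAC": 13, "HBCU": 35, "HSIAC": 37, "TCUP": 17}, 67))
```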



Abt anticipates drawing the telephone interview sample in July or August of 2011.


Selection of Activities

The evaluation’s Activity Sampling Plan is to sample two non-trivial activities from each grantee, with one activity each from two different activity types or categories.1 Because OUP grantees implement a large number of activities, many of which are related in nature, the activities will be grouped into six broad categories: Housing (new construction and rehabilitation), Public Facilities (new construction and rehabilitation), Training and Education, Planning and TA, Housing Assistance, and Business Development. The number of activities implemented by each grantee also varies widely: some grantees may have implemented activities in all six categories, while others may have implemented activities in only one. The objective of the sampling plan is to ensure that the total sample of activities across the 67 grantees is representative of the full set of activities those grantees undertook.


In order to ensure a representative random sample of activities, the Abt team will implement the following steps when selecting the sample:


  1. Identify all grantees with only one or two types of activities. Select these activity types with certainty. From grantees with only one activity type, select two activities from that activity type. For grantees with two activity types, select one activity from each activity type.

  2. Look at the distribution of grantees with more than two activity types by the number of activity types. If we assume that there are “n” grantees with more than two activity types and we want a sample of two activity types from each grantee, then the evaluation will need a total sample of 2 x n activity types from these grantees.

  3. Allocate the total sample of activities to each type in proportion to the total number of grantees under that activity type. This means there will be a larger sample of an activity type associated with a larger number of grantees and a smaller sample of an activity type associated with a smaller number of grantees.

  4. Start with the activity type with the smallest sample size and select that number of grantees with this activity type at random. This identifies grantees from which this activity type will be selected. Now select grantees at random who have the activity type with the second smallest sample size. At the end of the second round of selection, identify those grantees that have two activity types selected. These grantees will not be considered for further sample selection as they already have two types of activities selected. Continue selecting grantees for the third type of activity from the remaining grantees and so on. At the end of each selection, grantees in the sample with two types of activities selected will not be considered for further selection of activity types. The selection will be done from the remaining grantees.


At the end of the selection process the evaluation will have a sample with proportional representation for each type of activity, with two activity types selected for each grantee.
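The following minimal Python sketch illustrates this selection procedure. The mapping from grantees to activity types is a hypothetical stand-in for the OUP database, largest-remainder rounding of the proportional allocation is an assumption, and the final top-up step (not spelled out above) is included only so that every grantee in the sketch ends with two selected types.

```python
import math
import random
from collections import Counter

def select_activity_types(grantee_types, seed=1):
    """grantee_types: hypothetical mapping {grantee_id: set of activity-type names}.
    Returns {grantee_id: list of two selected types}; a type listed twice means
    two activities will be drawn from that type."""
    rng = random.Random(seed)
    selected = {}

    # Step 1: grantees with one or two activity types are taken with certainty.
    rest = {}
    for g, types in grantee_types.items():
        if len(types) == 1:
            selected[g] = sorted(types) * 2      # two activities from the single type
        elif len(types) == 2:
            selected[g] = sorted(types)          # one activity from each type
        else:
            rest[g] = set(types)
            selected[g] = []
    if not rest:
        return selected

    # Steps 2-3: two type-slots per remaining grantee, allocated to activity types
    # in proportion to the number of these grantees having each type
    # (largest-remainder rounding is an assumption).
    counts = Counter(t for types in rest.values() for t in types)
    total_slots = 2 * len(rest)
    raw = {t: total_slots * c / sum(counts.values()) for t, c in counts.items()}
    alloc = {t: math.floor(x) for t, x in raw.items()}
    for t in sorted(raw, key=lambda t: raw[t] - alloc[t],
                    reverse=True)[: total_slots - sum(alloc.values())]:
        alloc[t] += 1

    # Step 4: work from the smallest allocation upward; grantees drop out of
    # later rounds once they have two types selected.
    for t in sorted(alloc, key=lambda t: alloc[t]):
        eligible = [g for g in rest if t in rest[g] and len(selected[g]) < 2]
        for g in rng.sample(eligible, min(alloc[t], len(eligible))):
            selected[g].append(t)

    # Assumed top-up: any grantee still short of two types receives a random
    # remaining type, so every grantee ends with two selections.
    for g in rest:
        while len(selected[g]) < 2:
            selected[g].append(rng.choice([t for t in rest[g] if t not in selected[g]]))
    return selected
```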


The final consideration in sample selection will be the amount of OUP grant resources dedicated to the randomly selected activities. To ensure that the evaluation focuses on non-trivial activities, the Abt team will review each grantee’s responses to web survey question 2.3, “total activity cost” and “amount of OUP funds used for this activity.” This review will be completed before sending the “Follow-up Email Prior to Telephone Interview” (Appendix 3.3), which alerts the grantee to the specific activities that will be the focus of the telephone interview. If less than 20 percent of an OUP grant was used to implement one of the randomly selected activities, that activity will be considered trivial and the Abt team will randomly select a different activity for the grantee. If possible, the evaluators will select a substitute from the same activity type; if no other activities were implemented within that type, an activity will be selected at random from all of the grantee’s remaining activities. This final substitution step will slightly diminish the randomness of the sample, but the loss is offset by the assurance that the evaluation focuses on activities in which sizable OUP investments actually occurred.
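A brief sketch of the triviality screen and substitution rule appears below. The field names ("oup_funds", "type") are assumptions about how the web survey question 2.3 responses might be stored, not part of the instrument.

```python
import random

TRIVIAL_THRESHOLD = 0.20  # footnote 1: non-trivial means at least 20 percent of OUP funds

def is_trivial(activity, total_oup_grant):
    """True if the activity used less than 20 percent of the grantee's OUP grant.
    'oup_funds' is an assumed field name drawn from web survey question 2.3."""
    return activity["oup_funds"] < TRIVIAL_THRESHOLD * total_oup_grant

def substitute_activity(trivial_one, all_activities, already_selected, rng=random):
    """Replace a trivial activity, preferring another activity of the same type;
    otherwise draw from all remaining activities the grantee implemented."""
    remaining = [a for a in all_activities
                 if a is not trivial_one and a not in already_selected]
    same_type = [a for a in remaining if a["type"] == trivial_one["type"]]
    pool = same_type or remaining
    return rng.choice(pool) if pool else None
```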


As a precautionary step, the Abt team will monitor the congruency between the activity information from the OUP database that will be used to select activities for the phone interviews and the actual activities grantees report undertaking on their web surveys. If the team finds that grantees are reporting activities on the web survey that were not in the OUP database, sampling procedures will be adjusted to ensure that the activity sample remains representative.


3. Describe the methods used to maximize response rates and to deal with non-response. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield “reliable” data that can be generalized to the universe studied.


Grantees sampled for the OUP web survey and telephone interviews will be notified by OUP via a mailing that they have been selected to be interviewed about activities funded through OUP grants (see Appendix 2). This letter will explain the benefits and purposes of the study, describe what will be required of the grantee, and encourage the grantee to participate. A key part of recruiting grantees will be providing a clear explanation of what staff will be asked to do after receiving the letter.


Abt will then begin contacting the grantees by telephone to explain the study in more detail, answer questions, and secure their participation or contact information for an individual better positioned to participate in the study. Grantees will be encouraged to recommend the individual(s) most knowledgeable about the outcomes accomplished and the challenges faced during grant implementation.


Depending on the structure of each institution, the survey and interview respondents may differ from institution to institution; however, the participants will likely be those who work directly on grant implementation or performance measurement at the grantee institution. Once Abt staff have identified the correct individual or individuals, Abt will set up one group interview or a series of individual interviews, with each subsequent interview focusing only on the questions that previous interviewees were unable to answer. If a single point of contact is identified but that individual is unable to respond to all interview questions, the telephone interviewer will request contact information for other individuals who might be able to provide the missing information, and Abt staff will contact those individuals after the interview to collect the missing data.


The main response rate challenge Abt anticipates is finding one or more respondents at each grantee institution who are familiar with the grant, since staff turnover is likely. Abt may encounter grantee sites at which no one with knowledge of the grants received between 2005 and 2008 still works at the institution. For grantees without a knowledgeable contact person, Abt will select a substitute grantee from the same program type and therefore the same sampling stratum.


Because the objective of the survey is mainly to provide descriptive analysis of the projects rather than to produce estimates, the greatest concern is ensuring that the sample, both of grantees and of their activities, is sufficiently diverse to cover all activity types. What is essential, therefore, is that the sample represent the important subgroups in the population. The stratified sampling approach outlined above provides the needed representation.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The OUP data collection process was pre-tested with one grantee from each of the four programs. These pre-tests allowed Abt to test the questions on grantees as well as to assess the extent to which respondents would be able to report on measurable outcomes. The testing also helped Abt clarify question wording and refine the timing estimates used in determining respondent burden.


5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The individuals in Exhibit B-4 assisted HUD with the statistical design of this data collection on OUP Grant Outcomes. HUD has contracted with Abt Associates Inc. to conduct the data collection and analysis.


Exhibit B-4. Individuals Consulted on the Study Design

Name                 Telephone Number    Affiliation             Role in Study
Judson James         202-402-5707        HUD                     HUD GTR
Ndeye Jackson        202-402-5737        HUD                     HUD GTM
Barbara Haley        202-402-5708        HUD                     HUD GTM
David Chase          202-402-5733        HUD                     Technical Reviewer
Amy Minzner          617-349-2314        Abt Associates Inc.     Project Director
K.P. Srinath         301-634-1836        Abt Associates Inc.     Sampling Statistician
Jill Khadduri        301-634-1745        Abt Associates Inc.     Technical Advisor
Jennifer Turnham     410-382-4837        Abt Associates Inc.     Technical Advisor
Priscilla Prunella   240-333-0243        Econometrica Inc.       Technical Advisor
Armand Carriere      774-262-5331        Consultant              Technical Advisor
George Galster       313-577-9084        Consultant              Technical Advisor
Dan VanOtten         503-835-8023        ACKCO                   Technical Advisor
Stephen Whitlow      301-347-5503        Abt Associates Inc.     Task Leader of Data Collection



Inquiries regarding the statistical aspects of the study’s planned analysis should be directed to:


Amy Minzner
Project Director
Abt Associates Inc.
Telephone: 617-349-2314


1 A non-trivial activity is one that received at least 20 percent of a grantee’s OUP funds in a given grant cycle.
