Revised Supporting Statement

Developmental Disabilities Program Independent Evaluation Project

OMB: 0970-0372

THE SUPPORTING STATEMENT



Specific Instructions

Please do not remove or alter the headings below


A. Justification

  1. Circumstances Making the Collection of Information Necessary

The Administration on Developmental Disabilities (ADD) oversees the implementation of the Developmental Disabilities Assistance and Bill of Rights Act of 2000 (DD Act) (P.L. 106-402) (42 USC 15062). The purpose of the DD Act is to assure that individuals with developmental disabilities and their families participate in the design of and have access to needed community services, individualized supports, and other forms of assistance that promote self-determination, independence, productivity, and integration and inclusion in all facets of community life.


As defined in the DD Act, the term “developmental disabilities” means a severe, chronic disability of an individual that is attributable to a mental or physical impairment or combination of mental and physical impairments that is manifested before the individual attains age 22 and is likely to continue indefinitely. Developmental disabilities result in substantial limitations in three or more of the following functional areas: self-care, receptive and expressive language, learning, mobility, self-direction, capacity for independent living, and capacity for economic self-sufficiency. It is estimated that four million people in America have developmental disabilities.


The DD Act authorizes appropriations for three programs in the States to achieve the purposes of the Act:


  • State Developmental Disabilities Councils;

  • State Protection and Advocacy Systems to Protect the Rights of Individuals with Developmental Disabilities; and

  • The National Network of University Centers for Excellence in Developmental Disabilities, Education, Research, and Service.


The current information collection, the Developmental Disabilities Programs Independent Evaluation, will examine the impact of these programs.


Several legal and administrative requirements together necessitate this collection. These include initiatives over the past several Administrations to promote the accountability of federally funded programs.


The DD Act requires a system of accountability for the DD Act programs. Specifically, Section 105 requires that a report describing the goals and outcomes of the programs be submitted to the President, Congress, and the National Council on Disability.


The Government Performance and Results Act of 1993 (GPRA) provides another basis for conducting the information collection. Among the purposes of GPRA are to: improve the confidence of the American people in the capability of the Federal Government by systematically holding Federal agencies accountable for achieving program results; improve Federal program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction; help Federal managers improve service delivery by requiring that they plan for meeting program objectives and by providing them with information about program results and service quality; and improve the internal management of the Federal Government.


Executive Order 13450, Improving Government Program Performance, calls for improving the effectiveness and efficiency of the Federal Government and promoting greater accountability of that Government to the American people.


The current Administration has outlined its priorities in the document "Building a High Performing Government," which calls for improving results and outcomes for Federal Government programs while reducing waste and inefficiency, and for conducting program evaluations.


  2. Purpose and Use of the Information Collection

The purpose of the Developmental Disabilities Program Independent Evaluation (DDPIE) Project is to examine, through rigorous and comprehensive performance-based research procedures, the targeted impact of grantee activities funded under the Developmental Disabilities Assistance and Bill of Rights Act of 2000 (DD Act). The DDPIE is divided into two phases. The first phase, carried out from October 2005 through September 2008, involved the development of valid and reliable measurement matrices for determining program impact and the implementation of a pilot study. The second phase will include two stages: (1) obtaining OMB approval for the evaluation tools (e.g., data collection instruments) developed during Phase I; and (2) full implementation of the evaluation using the measurement matrices developed during Phase I and finalization of the performance standards. ADD is seeking to fund Phase II of DDPIE as a follow-on to Phase I of the project.


It is not the purpose of the independent evaluation to analyze ADD’s current measurement system that is used by grantees to report on their activities. Instead, the purpose is to have an objective, outside contractor use a measurement system and related evaluation tools designed specifically to determine the impact of the national DD Network programs on individuals with developmental disabilities, on state service systems, and on the capacity of service providers and a wide range of professionals to reach, treat, or assist individuals with developmental disabilities to become more independent and to participate in and contribute to community life alongside other members of the community.


The information collected through DDPIE will be used to provide in-depth performance information to several stakeholders:


  • Members of Congress

  • OMB and the Administration

  • Grantees

  • Individuals with developmental disabilities

  • Family members

  • Advocates

  • Other federal agencies


The Administration on Developmental Disabilities will be able to use the information to make program improvements.


  3. Use of Improved Information Technology and Burden Reduction

The components of the proposed information collection will use technology to reduce burden. This includes the use of electronic surveys and teleconferences.

  4. Efforts to Identify Duplication and Use of Similar Information

ADD has a number of mechanisms for monitoring DD Network programs and ensuring that they are complying with the Developmental Disabilities Assistance and Bill of Rights Act of 2000 (DD Act). Moreover, there are a number of strategic planning and accountability requirements in the DD Act itself that programs must comply with through reports to ADD. The result is that DD Network programs are already required to provide ADD with a considerable amount of data.


A pilot study was conducted that examined whether existing data could be used for the evaluation. All recent reports submitted to ADD by the pilot study programs were reviewed to determine whether data from these reports could answer the questions in the pilot study questionnaires. Through a crosswalk, evaluators located the approximate position of potentially relevant data in several reports to ADD and incorporated that existing data into the questionnaire binders. For each question for which existing data had been located, evaluators then determined whether the existing data answered the question.
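The crosswalk can be thought of as a simple mapping from each questionnaire item to the places in existing ADD reports where related data might be found, together with a judgment about whether that data actually answers the item. The sketch below illustrates the idea only; the question IDs, report sections, and field names are hypothetical and are not taken from the actual crosswalk.

```python
# Minimal sketch of the crosswalk idea: map each questionnaire item to the
# report sections where relevant data might be found, then record whether the
# located data can actually answer the item. All identifiers and sample
# records below are hypothetical illustrations, not ADD data.

from dataclasses import dataclass, field

@dataclass
class CrosswalkEntry:
    question_id: str                                     # questionnaire item (tied to an indicator)
    report_sources: list = field(default_factory=list)   # e.g., "Annual PPR, Goal 1 narrative"
    data_found: bool = False                              # was any related data located?
    answers_question: bool = False                         # does that data actually answer the item?

# Hypothetical example entries
crosswalk = [
    CrosswalkEntry("DDC-01", ["Annual PPR, Goal 1 narrative"], data_found=True, answers_question=False),
    CrosswalkEntry("DDC-02", [], data_found=False),
]

# Summarize how much of the questionnaire existing reports could cover
located = sum(e.data_found for e in crosswalk)
usable = sum(e.answers_question for e in crosswalk)
print(f"{located}/{len(crosswalk)} items had related data; {usable} could be answered from existing reports")
```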


It was found that the existing data was incomplete, out of date, not related to the dates of interest or not specifically related to the indicator and question. It was also noted that there was considerable inconsistency in definitions used by each program (e.g., in the NIRS data), and differences in formatting and contents of each report. Although much of the data was useful as background, none was able to specifically answer the questions in the questionnaires, which were based directly on the benchmarks and indicators that had been developed. Moreover, because of the inconsistency in definitions and data collection methodology, data would not be considered reliable enough to meet OMB requirements and it would not be possible to combine program data for roll-up to the national level.


  5. Impact on Small Businesses or Other Small Entities

Some of the programs funded by ADD, described under item 1, are small in comparison to others. For example, under the DD Council program, some Councils are considered ‘minimum allotment’. These Councils receive the smallest amount of funding. Because the funding formula for the DD Councils is based partly on population, ‘minimum allotment’ DD Councils are typically those in States and territories with small populations.

The full-scale independent evaluation, by necessity, will impose a certain amount of burden on all of the programs included in the study sample. Programs will be asked to assemble individuals to be interviewed and to collect specific materials to send to the evaluator. The questionnaires have been reduced considerably, so interviews will be shorter and programs will be asked to collect fewer materials.

  6. Consequences of Collecting the Information Less Frequently

This is a one-time project.



  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances.

  8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

Please see Attachment A for information about the first Federal Register Notice and the responses to comments received from that notice.

ADD made significant efforts to consult outside the Agency. These efforts are summarized below. A full listing of individuals who provided feedback for each of the activities described below is provided in Attachment B.

    • Independent contractor: ADD engaged an outside contractor, Westat, to conduct the independent evaluation. Westat spent three years developing the information collection instruments for the independent evaluation.

    • Advisory Panel: As part of its work, Westat established an Advisory Panel that included people with developmental disabilities, family members, other consumers, advocates, researchers, representatives from the DD Network programs, and policy specialists.

    • Working groups: Westat organized and conducted P&A, DD Council, UCEDD, and Collaboration Working Group meetings in person and by telephone and webcast throughout the spring, summer, and fall of 2006 to consult with experts in developing the information collection instruments.

    • Feedback from Programs: Westat provided ADD programs in each state with opportunities to provide feedback and comments on the evaluation and draft documents in 2007.

    • Pilot Study: Westat conducted a Pilot Study in 2008. The objectives of the pilot study were to: (1) inform the revision or elimination of some of the benchmarks and indicators; (2) test data collection instruments for measuring the indicators; (3) inform the further development of performance standards; (4) determine the usefulness of existing data in reports to ADD; and (5) test the logistics for a full-scale independent evaluation. OMB clearance was not required for the pilot study because information was collected from fewer than 9 people.

    • Validation Panels: Westat convened Validation Panels in July 2008. Advisory Panel members recommended candidates for the Validation Panels in the following categories: people with developmental disabilities or family members; advocates; and individuals familiar with research and policy. In addition, panel members needed to have an understanding of consumer needs; have an understanding of the purpose of the programs; have an appreciation for outcomes; be at least somewhat involved in the DD Network system; and have a proven track record of self-advocacy (e.g., DD Council members; self-advocates outside the programs). Westat also obtained a mix of urban and rural representation (with some thought to geographic representation) and a mix of senior and junior program staff. Each person reviewed the instruments and provided feedback.

  9. Explanation of Any Payment or Gift to Respondents

Respondents will be compensated for any direct costs incurred to participate in interviews (e.g., travel, child care). They will not otherwise be paid for participation.

  10. Assurance of Confidentiality Provided to Respondents

The proposed information collection instruments have received IRB approval.


The contractor implementing this information collection activity, Westat, is firmly committed to the principle that the confidentiality of individual data obtained through Westat interviews must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at the time of the interview (or self-response), and whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements, which are to be adhered to strictly.



Below are the procedures Westat follows for maintaining confidentiality:


1. All those with access to confidential data collected through interview (e.g., transcribers) shall sign an assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.

2. Transcribers (and others with access to confidential data) shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during the transcribing process. Transcribers shall exercise reasonable caution to prevent access by others to interview data in their possession.

3. Unless specifically instructed otherwise for a particular project, a transcriber, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.

Before an interview starts, respondents will be asked to sign a consent form. It will also be conveyed that information provided during the interview will not be shared with ADD or anyone else. Respondents will also be told that they do not have to answer any questions they do not want to answer and that they may leave the interview at any time.


  11. Justification for Sensitive Questions

No questions of a sensitive nature will be asked.

  12. Estimates of Annualized Burden Hours and Costs

The table below shows the total annualized burden hours and costs estimated for this information collection activity.

Estimate of Total Annualized Burden Hours and Costs for Proposed Information Collection Activities

Total Number of Respondents | Number of Responses per Respondent | Total Average Burden Hours per Response | Total Burden Hours | Total Cost
1,380 | 1 | 148.75 | 4,075 | $122,250



These estimates are based on two types of activities for the proposed information collection instruments: (1) participation in interviews and completion of a self-administered questionnaire; and (2) preparation activities. The estimates for these two types of activities are described in the tables below. The first table provides estimates for participation in interviews and completion of self-administered questionnaires. The second and third tables provide estimates for the burden hours and costs associated with participating in activities that will support implementation of the proposed information collection instruments. This includes time to collect, organize, and submit advance materials and materials collected on site; identify key informants; obtain consent; prepare agendas; schedule interviews; make logistical arrangements; and participate in an exit interview. These activities are included as part of implementing the self-administered form for the DDCs, P&As, and UCEDDs; however, a breakdown of this particular aspect of the information collection activity is provided for clarity in how the estimates were derived for the self-administered form. Finally, given the variability in the hourly rates for the different types of participants, an average hourly rate of $30 was used for all the cost estimates.
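For clarity, the arithmetic behind the tables is simply respondents × responses per respondent × average burden hours per response, with the result multiplied by the flat $30 average hourly rate. The short sketch below reproduces two rows of Table 1 as a worked illustration; it is a minimal example, not part of the information collection itself.

```python
# Minimal sketch of how the burden-hour and cost figures in Table 1 are derived:
# total burden hours = respondents x responses per respondent x average hours per
# response, and total cost = total burden hours x the flat $30 average hourly rate.
# Only two illustrative rows from Table 1 are shown.

HOURLY_RATE = 30  # average hourly rate used for all cost estimates

instruments = [
    # (instrument, respondents, responses_per_respondent, avg_hours_per_response)
    ("DD Council: Executive Director Interview", 20, 1, 4.0),
    ("DD Council: Self-administered Form", 20, 1, 41.5),
]

for name, n, k, hours in instruments:
    total_hours = n * k * hours
    total_cost = total_hours * HOURLY_RATE
    print(f"{name}: {total_hours:g} hours, ${total_cost:,.0f}")
# DD Council: Executive Director Interview: 80 hours, $2,400
# DD Council: Self-administered Form: 830 hours, $24,900
```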

Table 1: Estimate of Annualized Burden Hours and Costs for the Proposed Information Collection Instruments

Instrument | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Total Cost
DD Council: Executive Director Interview | 20 | 1 | 4 | 80 | $2,400
DD Council: Interview with Council Chair/Council Members | 60 | 1 | .75 | 45 | $1,350
DD Council: Group Interview with Policymakers, Collaborators, and Grantees | 160 | 1 | 2 | 320 | $9,600
DD Council: Group Interview with Recipients of Self-Advocacy and Leadership Education and Training | 100 | 1 | .75 | 75 | $2,250
DD Council: Group Interview with Recipients of Education and Training to Improve Community Capacity | 100 | 1 | .75 | 75 | $2,250
DD Council: Self-administered Form | 20 | 1 | 41.5 | 830 | $24,900
P&A: Executive Director Interview | 20 | 1 | 4 | 80 | $2,400
P&A: Staff Interview | 60 | 1 | .75 | 45 | $1,350
P&A: Board of Directors (Commissioners) - Chair and Members | 60 | 1 | .75 | 45 | $1,350
P&A: Group Interview with Policymakers and Collaborators | 160 | 1 | 2 | 320 | $9,600
P&A: Interview with Recipient of Community Education | 100 | 1 | .75 | 75 | $2,250
P&A: Interview with Clients | 100 | 1 | .75 | 75 | $2,250
P&A: Self-administered Form | 20 | 1 | 41.5 | 830 | $24,900
UCEDD: Interview with Director | 20 | 1 | 4 | 80 | $2,400
UCEDD: Telephone Interview with Current and Graduated Students | 100 | 1 | .75 | 75 | $2,250
UCEDD: Interview with the Consumer Advisory Committee | 60 | 1 | .75 | 45 | $1,350
UCEDD: Interview with Peer Researchers and Colleagues | 100 | 1 | .75 | 75 | $2,250
UCEDD: Interview with Recipients of Community Services or Members of Organizations/Agencies that are Trained to Provide Community Services | 100 | 1 | .75 | 75 | $2,250
UCEDD: Self-administered Form | 20 | 1 | 41.5 | 830 | $24,900
TOTAL | 1,380 | | 148.75 | 4,075 | $122,250



Table 2. Summary: Estimate of Total Burden Hours and Costs for Activities to Support Administration of Proposed Information Collection Instruments

Program Type | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response* | Total Burden Hours | Total Cost
P&A | 20 | 1 | 33.5 | 670 | $20,100
DD Council | 20 | 1 | 33.5 | 670 | $20,100
UCEDD | 20 | 1 | 33.5 | 670 | $20,100

*includes time to collect, organize, and submit advance materials and materials collected on site; identify key informants, obtain consent; prepare agenda; schedule interviews; make logistical arrangements; and participate in an exit interview


The following table shows the breakdown of the estimates in Table 2.


Table 2a. Breakdown of estimate of additional burden for each task in the DDPIE Phase 2—full-scale evaluation, by program type


Task—DD Council | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours
Prepare agenda (including emails and phone calls with contractor staff to schedule a two-day visit, understand selection criteria for interviewees, and identify a topic for the group interview) | 20 | 1 | 5 | 100
Track down documents on checklist of materials (including compiling, photocopying, and sending requested documents to contractor) | 20 | 1 | 10 | 200
Select interviewees and make arrangements for their participation | 20 | 1 | 10 | 200
Review questionnaires prior to visit | 20 | 1 | 5 | 100
Set up video-conference or phone conferences | 20 | 1 | 3.5 | 70
Subtotal | | | 33.5 | 670

Task—P&A | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours
Prepare agenda (including emails and phone calls with contractor staff to schedule a two-day visit, understand selection criteria for interviewees, and identify a topic for the group interview) | 20 | 1 | 5 | 100
Track down documents on checklist of materials (including compiling, photocopying, and sending requested documents to contractor) | 20 | 1 | 10 | 200
Select interviewees and make arrangements for their participation | 20 | 1 | 10 | 200
Review questionnaires prior to visit | 20 | 1 | 5 | 100
Set up video-conference or phone conferences | 20 | 1 | 3.5 | 70
Subtotal | | | 33.5 | 670

Task—UCEDD | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours
Prepare agenda (including emails and phone calls with contractor staff to schedule a two-day visit, understand selection criteria for interviewees, and identify a topic for the group interview) | 20 | 1 | 5 | 100
Track down documents on checklist of materials (including compiling, photocopying, and sending requested documents to contractor) | 20 | 1 | 10 | 200
Select interviewees and make arrangements for their participation | 20 | 1 | 10 | 200
Review questionnaires prior to visit | 20 | 1 | 5 | 100
Set up video-conference or phone conferences | 20 | 1 | 3.5 | 70
Subtotal | | | 33.5 | 670

Total of additional estimated program burden for all three programs | | | 100.5 | 2,010



  13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

There will be no cost to the respondents. The cost will be incurred by the Federal government.

  14. Annualized Cost to the Federal Government

The total cost is $1,500,000 for a two-year period, or $750,000 per year. For each program, the cost is $500,000 total for the two-year period, or $250,000 per year.

  15. Explanation for Program Changes or Adjustments

This is a new project.

  16. Plans for Tabulation and Publication and Project Time Schedule

Task 2: Implement Independent Evaluation Study

Subtask | Deliverable | Schedule
Subtask 2.1: Develop a Plan for Implementing the Study | 1 electronic copy | Within 2 months of start date
Subtask 2.2: Identify Study Sample and Contact Participants | 1 electronic copy | Within 3 months of start date
Subtask 2.3: Use Evaluation Tools to Collect Data | | Within 6 months of start date
Subtask 2.4: Train Research Staff on the Use of Evaluation Tools | | Within 4 months of start date

Task 3: Finalize Performance Standards for the National DD Network Programs and Include in Final Package to ADD

Subtask | Deliverable | Schedule
Subtask 3.1: Further Develop Performance Standards by Building Upon Work Conducted in Phase I | 1 electronic copy | 24 months
Subtask 3.2: Finalize the Performance Standards for the Measurement Matrices | 1 electronic copy | 24 months
Subtask 3.3: Organize Performance Standards into Measurement Matrices | 1 electronic copy | 24 months
Subtask 3.4: Incorporate Performance Standards into the Measurement Matrices and Submit as One Package for Use by ADD | 2 hard copies, 1 electronic copy | 24 months

Task 4: Synthesize Findings and Develop Recommendations

Subtask | Deliverable | Schedule
Subtask 4.1: Synthesize Findings | 1 electronic copy | 24 months
Subtask 4.2: Develop Recommendations | 1 electronic copy | 24 months

Task 5: Progress Reports

Subtask | Deliverable | Schedule
Subtask 5.1: Periodic Progress Reports to ADD Grantees | Electronic when requested | As requested
Subtask 5.2: Quarterly Technical Progress Reports | 1 electronic copy | Quarterly
Subtask 5.3: Develop and Submit to PO Final Report with Recommendations | 2 hard copies, 1 electronic copy | 24 months



  17. Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable

  18. Exceptions to Certification for Paperwork Reduction Act Submissions

Not applicable

B. Statistical Methods (used for collection of information employing statistical methods)

  1. Respondent Universe and Sampling Methods

Programs will be selected using stratified random sampling procedures. The only stratification variable will be the four geographic regions designated by the U.S. Census: Northeast, Midwest, South, and West. Three additional jurisdictions – the District of Columbia, Puerto Rico, and Guam – will be included because they each have all three DD Network programs. The District of Columbia and Puerto Rico will be included in Region 3 (South), and Guam will be included in Region 4 (West).
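As a rough illustration of the stratified selection described above, the sketch below groups programs by Census region and draws a fixed number at random from each stratum. The program list, seed, and per-region sample size are hypothetical placeholders, not the actual sampling plan or allocation.

```python
# Minimal sketch of stratified random sampling by Census region: group programs
# into strata, then draw randomly within each stratum. Entries and sample sizes
# are illustrative placeholders only.

import random
from collections import defaultdict

programs = [
    # (state_or_jurisdiction, census_region) -- illustrative entries only
    ("New Hampshire", "Northeast"), ("Ohio", "Midwest"),
    ("Puerto Rico", "South"), ("Guam", "West"), ("Wyoming", "West"),
]

by_region = defaultdict(list)
for name, region in programs:
    by_region[region].append(name)

rng = random.Random(2009)      # fixed seed for a reproducible draw
sample_per_region = 1          # placeholder; the actual allocation would differ
sample = {
    region: rng.sample(members, min(sample_per_region, len(members)))
    for region, members in by_region.items()
}
print(sample)
```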


Once programs have been identified, the table below shows the individuals that will be interviewed.


Respondent | Potential Respondent Universe | Sample Number
DD Council: Executive Director Interview | 55 | 20
DD Council: Interview with Council Chair/Council Members | 440 | 60
DD Council: Group Interview with Policymakers, Collaborators, and Grantees | 5,500 | 160
DD Council: Group Interview with Recipients of Self-Advocacy and Leadership Education and Training | 2,750 | 100
DD Council: Group Interview with Recipients of Education and Training to Improve Community Capacity | 2,750 | 100
DD Council: Self-administered Form | 55 | 20
P&A: Executive Director Interview | 57 | 20
P&A: Staff Interview | 855 | 60
P&A: Board of Directors (Commissioners) - Chair and Members | 1,140 | 60
P&A: Group Interview with Policymakers and Collaborators | 5,700 | 160
P&A: Interview with Recipient of Community Education | 5,700 | 100
P&A: Interview with Clients | 5,700 | 100
P&A: Self-administered Form | 57 | 20
UCEDD: Interview with Director | 68 | 20
UCEDD: Telephone Interview with Current and Graduated Students | 3,400 | 100
UCEDD: Interview with the Consumer Advisory Committee | 1,360 | 60
UCEDD: Interview with Peer Researchers and Colleagues | 3,400 | 100
UCEDD: Interview with Recipients of Community Services or Members of Organizations/Agencies that are Trained to Provide Community Services | 500,000 | 100
UCEDD: Self-administered Form | 68 | 20



  2. Procedures for the Collection of Information

See response to question 1 for information about sampling procedures.


Westat recommends a design and methodology for the full-scale evaluation that is similar to the one used in the pilot study. This design involves the collection of both qualitative and quantitative data. As in the pilot study, qualitative data should be collected through key informant interviews with program staff, Council and Board members, Consumer Advisory Committee members, and others who have received services or participated in activities supported by Developmental Disabilities (DD) Network programs. Westat also recommends that quantitative data be collected with a new data collection instrument (a self-administered form for each program).


Logistically, collection of qualitative data was feasible to implement in the pilot study and enabled Westat evaluators to collect the type of information needed to measure the indicators drafted at the time of the pilot study. Because there was some difficulty in collecting some of the quantitative data during an in-person interview, Westat developed an additional form to collect such data. All forms contain a specific “reporting period” that will be designated by Westat. Those completing the form will be asked to consider only data that apply to the reporting period. In that way, data from all states and programs can be rolled up to the national level.
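As an illustration of how a designated reporting period supports a national roll-up, the sketch below aggregates only those program submissions tagged with the specified period. The field names, values, and period label are illustrative assumptions, not actual DD Network data or the actual form layout.

```python
# Minimal sketch of the "roll-up" idea: each program submits quantitative data
# tagged with the Westat-designated reporting period, and only records from that
# period are aggregated to the national level. All values are hypothetical.

REPORTING_PERIOD = "FY2010"  # placeholder for the period designated by Westat

submissions = [
    {"program": "DD Council A", "period": "FY2010", "people_trained": 120},
    {"program": "DD Council B", "period": "FY2010", "people_trained": 85},
    {"program": "DD Council C", "period": "FY2009", "people_trained": 60},  # outside the period, excluded
]

national_total = sum(
    s["people_trained"] for s in submissions if s["period"] == REPORTING_PERIOD
)
print(f"National total for {REPORTING_PERIOD}: {national_total}")  # 205
```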


Since the self-administered questionnaire was not tested in the pilot study, we recommend that Westat be given the opportunity to conduct cognitive testing and pre-testing prior to field implementation.


  3. Methods to Maximize Response Rates and Deal with Nonresponse

Westat will use multiple methods to increase response rates, including in-person interviews, electronic surveys, telephone interviews, follow-up calls, and on-site follow-up. As a result, ADD expects a very high response rate. If there are non-responses, Westat will follow up with the respondents to determine the reasons for non-response and provide assistance.

  4. Test of Procedures or Methods to be Undertaken

A pilot study was conducted for this information collection activity. The objectives of the pilot study were to: (1) inform the revision or elimination of some of the benchmarks and indicators; (2) test data collection instruments for measuring the indicators; (3) inform the further development of performance standards; (4) determine the usefulness of existing data in reports to ADD; and (5) test the logistics for a full-scale independent evaluation.


Programs were selected for the pilot study by excluding those programs that met ADD’s exclusion criteria (e.g., the Executive Director was a member of the Advisory Panel or a Working Group, or the program had no Executive Director or a new Executive Director), stratifying according to ADD’s inclusion criteria,1 and randomly selecting programs within each stratum. Selected programs were notified by a letter from the Commissioner of ADD, and Westat followed up with an email. Ten DD Network programs in seven states participated in the pilot study (Exhibit 2).


Exhibit 2. Programs that participated in the pilot study


State Developmental Disabilities Councils | Protection and Advocacy Systems | University Centers of Excellence
New Hampshire DD Council (collaboration only) | New Hampshire Protection and Advocacy Agency (Disabilities Rights Center) | University of New Hampshire Institute on Disability
Ohio Developmental Disabilities Council | Ohio Legal Rights Service | University of Iowa - Center for Disabilities and Development
Wyoming Council on Developmental Disabilities | Disability Law Center of Alaska | University of California, Los Angeles - Tarjan Center
New Mexico State Council on Developmental Disabilities | |




Preparation for the pilot study consisted of the development of all data collection instruments, obtaining IRB approval,2 and staff training. Questionnaires were developed to be consistent with the benchmarks and indicators that had been developed as of December 24, 2007. Westat had been advised by the Advisory Panel to be over- instead of under-inclusive to make sure that indicators were not eliminated prematurely. Thus, the number of indicators that were included as of December 24, 2007 was 78 for DD Councils, 97 for P&As, 118 for UCEDDs, and 18 for collaboration.


Questionnaires were developed for each key informant (interviewee) (e.g., people with developmental disabilities, family members, policy makers, collaborators, researchers/colleagues, current and former students, Consumer Advisory Committee members, members of boards of directors or commissions, and recipients of education and training on developmental disabilities). All three programs had recommended that the evaluation use the data already being submitted to ADD instead of collecting new data. Thus, a critical aspect of questionnaire development was conducting a crosswalk between indicators and information provided by ADD programs in reports submitted to ADD and determining whether data from these reports could be used in the evaluation.


Data collection, which took place between January 28, 2008 and April 11, 2008, consisted primarily of semi-structured in-person individual and group key informant interviews and individual telephone interviews. In-person interviews were conducted during a 2-day program visit. Telephone interviews were scheduled either before or after each site visit depending on the availability of persons to be interviewed.


After programs were notified by ADD about their selection for the pilot study and sent a followup email by Westat, Westat worked with the Executive Director and staff of each selected program to schedule dates and times for the visit and interviews, identify key informants, and develop an agenda for the site visit. A followup letter was sent to each program to confirm the dates of the visit, describe the visit further, and provide a copy of all questionnaires.


Questionnaires were administered by trained Westat staff to program personnel and program target audiences. Permission was requested from respondents to audio record all interviews. Recordings were transcribed after the site visit. In addition to data collection through interview, Westat also obtained a variety of materials from each of the programs participating in the pilot study.


The pilot study produced three types of findings: (1) findings related to benchmarks and indicators; (2) findings related to logistics for data collection; and (3) findings related to existing data. Findings on the questions that related to the benchmarks and indicators assisted in revising the benchmarks and indicators and in developing examples of performance standards. The revised documents were presented to the Validation Panels (described above). Logistical findings (what did and did not work well in the pilot study) were incorporated into Westat’s recommendations for data collection in the full-scale evaluation.


Findings on the use of existing data from reports to ADD led Westat to conclude that very little of what is currently included in reports to ADD can be used in the full-scale evaluation. For the most part, definitions and report formats varied among the programs; reports covered different time periods than the indicators will address; and many of the indicators, particularly outcomes, were not addressed in these reports. Although these reports were useful in providing interviewers with background on each program, they were not useful for measuring the indicators for the ADD independent evaluation. Chapter 5 of the pilot study report makes recommendations on how ADD might revise reporting requirements so that reports to ADD could be used in future national-level evaluations.



  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

ADD

Jennifer Johnson

202-690-5982

[email protected]



Westat

Lynn Elinson

412-421-8610

[email protected]


1Criteria for the UCEDD programs included: geographic diversity (e.g., rural/urban); geographic region of the United States (west, midwest, and east); participation in MTARS in 2005 – 2007; organizational structure (is and is not in a medical school); and training model (e.g., is and is not a LEND program). Criteria for the DDC programs included: geographic diversity (e.g., rural/urban); geographic region of the United States (west, midwest, and east); participation in MTARS in 2005 – 2007; organizational structure (is and is not own DSA); and allotment size (e.g., minimum, non-minimum). Criteria for the P&A programs included: geographic diversity (e.g., rural/urban); geographic region of the United States (west, midwest, and east); participation in MTARS in 2005 – 2007; organizational structure (is and is not in State agency); P&A model (legal versus advocacy approach), and allotment size (e.g., minimum, non-minimum).


2 The Westat IRB is a specially constituted committee established to protect the rights and welfare of human subjects who participate in Westat projects. The IRB operates under procedures set forth in the regulations of the U.S. Department of Health and Human Services and in the Federalwide Assurance (FWA) granted to Westat by the Office for Human Research Protections (OHRP). IRB approval is required before research may begin, continue, or be changed by the research team.





