AmeriCorps Community Engagement Survey

OMB: 3045-0161









SUPPORTING STATEMENT



Part A



Corporation for National and Community Service AmeriCorps
Community Engagement Study

Revised: September 24, 2014





Corporation for National and Community Service

1201 New York Ave., NW

Washington, DC 20525

Telephone: (202) 606-5000

[email protected]





A. Justification

A.1. Circumstances that make information collection necessary

The Corporation for National and Community Service's (CNCS) AmeriCorps State & National program supports more than 80,000 members annually who provide a wide range of services, including mentoring, education, neighborhood renewal, and health screenings. Previous evaluations have confirmed CNCS's impact on members and service recipients, such as increased education, skills, and civic participation. For the current project, CNCS will collect information on service activities and scope from AmeriCorps grantees to assess the degree to which grantees engage the communities they serve, as part of a longer-term research agenda to evaluate AmeriCorps' impact on those communities. CNCS also will collect information from grantee partners, which are integral to engaging and serving client communities. The current effort will be the first step in evaluating how well CNCS is building overall community capacity and how increased community capacity affects change in the communities served. As such, this research has two broad objectives. The first is to assess the feasibility and usefulness of collecting data on grantee community engagement in this manner, which includes developing a valid and reliable instrument. The second objective, discussed in further detail below, is to understand whether and how current grantees are engaging their communities and working to increase community capacity. This research is intended to create a preliminary baseline, providing useful information for further research but not necessarily for decision-makers, unless the results are found to be adequately valid and reliable. Should that be the case, results will inform CNCS of the range of member, grantee, and community partner efforts and activities, and potentially inform policy and program improvement efforts.

A.2 Purposes and Uses of the Collected Information

Survey results have the potential to provide CNCS with both a useful method for assessing, and information on, how its grantees use resources to build community capacity and the role community partners play in engaging communities as part of that work. This information will inform decisions about CNCS resource allocation and priorities. It also will make a significant empirical contribution to the broader field of civic infrastructure and capacity, community engagement, and social capital.1 Studies have found that community engagement mitigates the effects of social disadvantage. Engagement is linked to economic and social capital assets and is a principal component of community strength.2 As another example, Pope (2011) summarized research indicating that "General community connection appears to be better at reducing fear than local networks specifically designed to combat crime."



All studies emphasized that community engagement involves soliciting input from the community about its members' goals and priorities. The goal is to increase a sense of empowerment and an understanding of how to achieve change. Effective engagement requires understanding the community, gaining members' trust and commitment to the work, and working with a broad network of community partners. Mosman Council (2010) and others state that essential engagement strategies are to: 1) inform the community about the organization and the issues it targets, 2) consult to learn about community needs, priorities, and culture, and 3) involve the community in organizational efforts. Community involvement typically takes the form of volunteer work to address community needs. Volunteer work may be short- or long-term. It may involve developing informative communication to increase community awareness and engagement, service to constituents in need, negotiation with policy makers, or soliciting resources.

Smart Growth America (2013) identifies three phases of engagement: preparation, learning, and activation. Preparation involves identifying and engaging partners, developing a plan to collaborate, identifying common interests, and sharing information. Learning involves conducting background research on the community, its needs, its resources, and its culture. Activation involves building public and political support for organizational goals and pursuing those goals.

An initial step in engagement, or "preparation," is to define target outcomes and to identify key stakeholder groups, including constituencies to be served, other community-based organizations that are potential partners, businesses and institutions with an interest in the targeted outcome and/or community, and government organizations with authority over policies related to the targeted outcomes and planned organizational activities. It is especially important to identify stakeholders whose interests may conflict with the organization's goals and to consider how to address potential conflicts (Smart Growth America, 2013). Hess & Adams (2006) state that local community members' knowledge regarding community norms, geography, politics, and other aspects of community life is crucial for community engagement. IDEA, Urban Forum & NAVCA (2009) emphasize the importance of developing a comprehensive community engagement strategy that provides a framework for soliciting community input and developing partnerships. The strategy should define organizational values and principles, partner structure, targeted outcomes, priorities, resources, and planned activities. Several authors mentioned that achieving change requires recognition and cooperation from relevant local government organizations. Hess & Adams (2006) state, "…it is collective organization and governance structures that sit behind the effective claim over use and distribution of assets and skills."

The next broad step, "learning," is to conduct a needs assessment. Needs assessments establish a baseline for targeted outcomes (CDC, 2010). They determine community members' levels of awareness of and investment in the targeted outcomes, community assets, the resources needed to achieve targeted outcomes, and cultural factors that can affect communication, trust, and engagement. This step provides a foundation for planning outreach, communications, and interventions that are responsive to community needs and that community members are likely to invest in (Herefordshire Council, 2014; Smart Growth America, 2013; Scottish Community Development Centre for Learning Connections, 2007; Kellogg Foundation, 2011). All authors agreed that engagement includes soliciting community members' input on defining goals and on developing and conducting activities.

Pope (2011) and several other authors note the importance of inclusiveness to community engagement. If key community groups are excluded, the organization will not be credible or influential; if the group is inclusive, it develops a sense of solidarity and efficacy. Understanding of governance, policy, and mechanisms for influencing governance and policy is necessary for participation in shared decision making. When lower socioeconomic status community members lack this understanding, they are likely to be omitted from the process of making decisions that affect their health, opportunities, built environment, safety, and quality of life.

Finally, in "activation," organizations and their partners implement approaches to community change. The first step is to develop a plan for achieving goals. This includes proposing activities, milestones, and a schedule; assigning roles and responsibilities; and identifying potential barriers and approaches for overcoming them. The plan should include strategies for reaching and communicating with diverse constituencies within the targeted community, and for recruiting, training, and managing volunteers. Next, the organization implements this plan and conducts culturally appropriate outreach, communication, and action to change community policies and structures and increase resources in order to meet community needs (Herefordshire Council, 2014; Smart Growth America, 2013; Kellogg Foundation, 2011; Rehnborg et al., 2009).



This research builds on the existing research, and applies these theories to the processes and objectives of AmeriCorps State & National. Our logic model for how AmeriCorps State & National leads to increased community civic capacity is shown in Figure 1.

Figure 1: Logic Model for AmeriCorps Impact on Community Civic Infrastructure and Capacity




The theory of change is that AmeriCorps members, placed in grantee organizations, will engage in three primary activities: resident engagement, development of community partnerships, and increasing the grantee's organizational capacity. These activities may be in addition to other direct service activities members provide. These activities will lead to concomitant outputs: more empowered residents, stronger service networks and capacity among organizations, and better service to the community by the grantee. Finally, these outputs will lead to increased civic infrastructure and community capacity. The focus of this research is on the processes in the above logic model, because we have no systematic knowledge about the extent to which AmeriCorps members actually engage in these activities. Assessing outputs and outcomes is not meaningful or even feasible without understanding the degree of implementation. A further refinement is that we focus solely on resident engagement and community partnerships: we had initially considered measuring organizational capacity, but due to the additional burden on respondents and the complexity of that construct, we have excluded it from the current research approach.

Assessing the processes, outputs, and outcomes presents significant measurement issues. These are not simple constructs to measure, and there is no agreement among researchers and practitioners on their specific definitions. Our literature review identified at least 19 different instruments for measuring different levels of community engagement, developed by academic researchers, non-profit organizations, and federal agencies. For our study, we operationalized resident engagement and the development of community partnerships (what we broadly define as "community engagement") based on a thorough review of the extant research and practitioner literature. The questionnaires used in this study seek to measure various aspects of resident engagement and community partnership development.

Understanding these mechanisms for community engagement is a step in evaluating AmeriCorps' effects on community capacity, which is one of the ways the program aims to improve education, housing, environmental, health, and other outcomes for vulnerable communities. Documenting service locations' activities, perspectives, priorities, and volunteer resources provides a description of: a) the resources available to increase community capacity; b) how those resources are being used; c) how they vary across grantees and their service locations; and d) what factors are correlated with that variation.

While collecting the initial data set, researchers will document processes and assess feasibility of ongoing data collection and analysis for continual assessment of impact on community capacity building.

A.2.1 Research Questions

The survey and analysis plan are designed to answer the following questions:

  1. How feasible, useful, and effective is this data collection methodology for informing AmeriCorps about its grantees' community engagement activities?

    1. What is the validity and reliability of the instrument?

  2. To what extent do AmeriCorps grantees focus their activities on engaging communities and building community capacity to solve problems? In particular, to what extent are grantees focused on:

    1. Engaging residents?

    2. Increasing community partnerships?

  3. How do grantee partners and stakeholders perceive the community engagement efforts of grantees?

  4. What factors are related to grantee focus on community engagement? Specifically:

    1. Grantee-level factors: service focus area, grantee size (grant $ amount/MSY, total organizational budget), grantee history (years CNCS grantee; years in operation).

    2. Community-level factors: non-profit density, ethnic/racial makeup, education level, income levels, population.


A.2.2 Instrument Development

Most of the items in the instruments used in this study were created based on our project objectives and the existing research. Although we did include items used in other research, we did not find any validated items or scales; as outlined in the first research question above, validation is one of the purposes of the current study. In developing the items, apart from reviewing the literature, we consulted with AmeriCorps program officers who manage the portfolio of capacity building grantees. In addition, we held semi-structured interviews with nine grantees. The grantees were identified by program officers as being actively engaged in their communities, and our expectation was that they would be in the best position to provide information on how AmeriCorps grantees engage their communities. These interviews addressed topics including the different ways the grantees engage communities, the role of volunteers, the role of partnership building, and how AmeriCorps members interact with volunteers, partners, and community members in general. We asked about response options for various survey items, and also asked about the most effective ways to administer the survey to elicit the highest possible response rate and the most reliable responses. The instrument was subsequently reviewed by research staff in the Office of Research & Evaluation and by AmeriCorps program officers, and finally was piloted, as described in section B4.

A.3 Use of Information Technology and Burden Reduction

The survey will be administered online. Eligible respondents will participate with informed consent. All responses will be confidential. Additionally, the technology to be employed can be configured to allow participants to complete as much of the questionnaire as desired in one sitting or to continue the questionnaire at another time. The technology also minimizes the possibility of participant error by electronically skipping questions that are not applicable to a particular participant, thus minimizing participant burden.

A.4 Efforts to Identify Duplication

CNCS has conducted no prior research on the overall impact of volunteer service on community engagement or capacity building for AmeriCorps State & National. Additionally, a literature review conducted in the early stages of this project found that no other comparable assessment of the AmeriCorps program contributions to community engagement exists, so we are assured that this assessment is not a duplicative effort.

A.5 Methods to Minimize Burden on Small Entities

This collection request does not involve burden to small businesses or other small entities.

A.6 Consequences of Not Collecting Data

Without this information, the agency will not have comparable cross-site data on whether and how its resources are used to engage communities and support community capacity building. This information is essential for monitoring use of agency resources and assessing their impact.

A.7 Special Circumstances

This request is consistent with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply.

A.8. Federal Register Notice and Outside Consultations

A.8.a. Federal Register Notice

As required by 5 CFR 1320.8(d), a notice was published in the Federal Register on March 4, 2014, at page 12187, for a 60-day comment period. There were no comments during this review period. However, upon consultation with our stakeholders regarding the focus of this research, the title of the study was changed from "AmeriCorps State and National Community Impact Survey" to "The Corporation for National and Community Service AmeriCorps Community Engagement Study." This change is reflected in the introduction and instructions for the survey instruments as well.

A.8.b. Outside Consultations

CNCS contractor AFYA, Inc. conducted:

  • Nine interviews with AmeriCorps grantees to obtain preliminary information about their partnership approaches and the value of partnerships. This information was used to help inform the survey instrument and the recruitment strategy.

  • A literature review on existing community engagement measures and research, which informed development of survey items.

  • A pilot test of the survey to ensure clarity and minimal burden.

Additionally, a number of AmeriCorps Program Officers were engaged for input on the assessment strategy, the survey approach, and the design of the survey instruments (see Attachment E for a list of those consulted both within and outside the Agency thus far).

A.9 Explanation of any Payment or Gifts to Respondents

No gift, incentive or payment will be offered or given to the respondents.

A.10 Assurance of Confidentiality Provided to Respondents

Respondents will be notified of privacy and non-disclosure regulations governing the collection and use of these data, and the purpose for the collection. All respondents will be informed that participation is voluntary. All survey instruments used in this study will include the following disclosure:

PRIVACY ACT NOTICE: The Privacy Act of 1974 (5 U.S.C. § 552a) requires that the following notice be provided to you: The information requested in the AmeriCorps Community Engagement Survey Form is collected pursuant to 42 U.S.C. 12592 and 12615 of the National and Community Service Act of 1990, as amended, 42 U.S.C. 4953 of the Domestic Volunteer Service Act of 1973, as amended, and 42 U.S.C. 12639. Purposes and Uses - The information requested is collected for the purpose of assessing the degree to which grantees engage the communities they serve, as part of a longer-term research agenda to evaluate AmeriCorps' impact on the communities it serves. CNCS also will collect information from grantee partners, which are integral to engaging and serving client communities. Routine Uses - Routine uses may include disclosure of the information to federal, state, or local agencies pursuant to lawfully authorized requests. The information may also be provided to appropriate federal agencies and Department contractors that have a need to know the information for the purpose of assisting the Department's efforts to respond to a suspected or confirmed breach of the security or confidentiality of information maintained in this system of records, and the information disclosed is relevant and necessary for the assistance. The information will not otherwise be disclosed to entities outside of the Corporation for National and Community Service without prior written permission. Effects of Nondisclosure - Providing the requested information is not mandatory.

A.11 Justification of Sensitive Questions

This project includes no questions of a sensitive nature. The evaluation instruments do not contain any questions concerning sexual behavior and attitudes, religious beliefs, personal income, or proprietary business information.

A.12 Estimates of Hour Burden Including Annualized Hourly Costs

This study includes three surveys. The grantee survey will take approximately 15 minutes (0.25 hours) to complete, and approximately 304 respondents will participate, resulting in 76 burden hours. The service location survey will take approximately 30 minutes (0.50 hours) to complete, and approximately 394 respondents will take it, resulting in 197 total burden hours. The partner survey will take approximately 10 minutes (0.17 hours) to complete, and approximately 1,970 respondents will take it, resulting in 328 burden hours. This time will be considered part of respondents' work and will not involve monetized cost.



Exhibit 1:  Estimated annualized burden hours

Form Name | Number of Respondents | Number of Responses per Respondent | Hours per Response | Total Burden Hours
Grantee Survey (time point #1) | 304 | 1 | 15/60 | 76
Grantee Service Location Survey (time point #2) | 394 | 1 | 30/60 | 197
Partner Survey (~5/service location) (time point #3) | 1,970 | 1 | 10/60 | 328
Total | 2,564** | NA | NA | 456

**Estimated total number of unique respondents.



Exhibit 2: Estimated annualized cost burden

Form Name | Number of Respondents | Total Burden Hours | Average Hourly Wage Rate* | Total Cost Burden
Grantee Survey (time point #1) | 304 | 76 | $36.30 | $2,758.30
Grantee Service Location Survey (time point #2) | 394 | 197 | $36.30 | $7,042.20
Partner Survey (~5/service location) (time point #3) | 1,970 | 328 | $36.30 | $11,906.40
Total | 2,564 | 456 | NA | $21,706.90

* Average hourly wage based on the weighted average (equal weights) of wages for the respondent occupational categories expected to participate in this survey: Community and Social Service Occupations (21-0000, $19.86), Management Occupations (11-0000, $47.83), Business and Financial Operations Occupations (13-0000, $29.97), Administrative Services Managers (11-3011, $37.61), and Education, Training, and Library Occupations (25-0000, $46.23). Data source: National Occupational Employment and Wage Estimates in the United States, May 2012, U.S. Department of Labor, Bureau of Labor Statistics (available at http://www.bls.gov/oes/current/naics4_621400.htm).



A.13 Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

The only cost to respondents will be the time required to respond to the information collection, as shown in Exhibit 2.

There are no other costs to respondents and no respondent recordkeeping requirements.

A.14 Estimates of Annualized Costs to the Federal Government

The contract to conduct the survey and assess the feasibility of ongoing surveys was issued to AFYA, Inc. under Contract No. GS-10F-0309L. The cost associated with administering the survey and assessing feasibility for an ongoing survey is $107,959. There are no other costs to the Federal Government.

A.15 Explanation for Program Changes or Adjustments

No change in burden is requested. This submission to OMB is an initial request for approval.


A.16 Plans for Tabulation and Publication and Project Time Schedule

A.16.1 Analysis Plans

The analysis of the community engagement survey is based on the research questions. All of the survey items are categorical (nominal or ordinal). Contingency tables, with χ2 statistics, will be used to estimate univariate and bivariate statistics and to compare response patterns across groups of grantees. For ordinal variables, linear regressions or ordinal logit models (after testing for proportional odds) may be used to compare groups of grantees. All statistical analyses will use survey weights, with appropriate variance estimation methods (e.g., Taylor-series linearization).
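
As an illustration of the bivariate comparisons described above, the following is a minimal sketch using hypothetical data; the variable names and response categories are placeholders, and the production analysis would additionally incorporate survey weights and design-based variance estimation.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical respondent-level data: grantee focus area crossed with a
# nominal survey item on whether the grantee reports engaging residents.
df = pd.DataFrame({
    "focus_area": ["Education", "Education", "Health", "Health", "Environment", "Environment"] * 20,
    "engages_residents": ["Yes", "No", "Yes", "Yes", "No", "Yes"] * 20,
})

# Cross-tabulate responses by grantee group and test for independence.
table = pd.crosstab(df["focus_area"], df["engages_residents"])
chi2, p_value, dof, expected = chi2_contingency(table)

print(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```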

One limitation of the analysis is the large number of statistical tests we plan to conduct. It is well known that researchers run the risk of inflated Type I error when conducting multiple tests, and appropriate adjustments must be made to reduce this risk. If this were confirmatory research, we would employ appropriate adjustments to the p values, such as Bonferroni adjustments or control of the false discovery rate.3 This research is exploratory, however, and following Bender and Lange (2001),4 we are more concerned that adjusting the p values would be overly conservative and risk not uncovering useful information for further research to explore.
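
Should a later, confirmatory phase warrant adjustment, a minimal sketch of Benjamini-Hochberg false discovery rate control applied to a hypothetical set of p values might look as follows.

```python
from statsmodels.stats.multitest import multipletests

# Hypothetical p values from a family of chi-square tests across survey items.
p_values = [0.001, 0.012, 0.034, 0.047, 0.210, 0.650]

# Benjamini-Hochberg false discovery rate control at alpha = 0.05.
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")

for raw, adj, significant in zip(p_values, p_adjusted, reject):
    print(f"raw p = {raw:.3f}  adjusted p = {adj:.3f}  reject H0: {significant}")
```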



Question 1: How feasible, useful, and effective is this data collection methodology for informing AmeriCorps about its grantees' community engagement activities?

    1. What is the validity and reliability of the instrument?


To address this question, we will consider the following factors in making recommendations for future data collection efforts:

  • Response rates to each questionnaire, as well as the representativeness of the final sample to the population.

  • The level of effort required to gather the information and conduct follow-up.

  • The variability in response patterns on key items, and how much response patterns correlate with grantee characteristics. Sufficient variance in responses is necessary to ensure that we can identify substantive differences among grantees and provide useful information. Correlation with grantee characteristics is important because it will allow us to make systematic comparisons among grantee types.

  • The validity and reliability of the instrument. In this data collection effort, the construct validity of survey items and constructs will be assessed to ensure that we are able to adequately triangulate complex concepts. Construct validity will be measured using confirmatory factor analysis, with the items identified in Table 1 below as indicators. We will use common goodness-of-fit indices (such as χ2 statistics, the Comparative Fit Index, and the Root Mean Square Error of Approximation). We will also assess measurement invariance, to determine whether the constructs have the same measurement qualities across various subgroups, following the procedures outlined by Meredith.5 Assuming we are able to establish sufficient construct validity, later efforts may address predictive validity through other types of data collection (which may include qualitative data collection). Measurement reliability (i.e., internal consistency) of the constructs will be assessed using appropriate and widely used metrics, such as Cronbach's α or McDonald's ω,6 as illustrated in the sketch following this list.

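A minimal sketch of the internal-consistency calculation, using Cronbach's α on hypothetical item responses (the items and values are placeholders):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) array of scored responses."""
    n_items = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses to a four-item engagement construct (1-5 scale).
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```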

Questions 2 and 3:

Question 2: To what extent do AmeriCorps grantees focus their activities on engaging communities and building community capacity to solve problems? In particular, to what extent are grantees focused on:

    1. Engaging residents?

    2. Increasing community partnerships?


Question 3: How do grantee partners and stakeholders perceive the community engagement efforts of grantees?


To address these questions, the survey uses the measures highlighted in Table 1. Some of these items are intended to triangulate measurement of underlying constructs (which will be tested and validated); these are indicated in the table with an asterisk (*). While our research team developed many of the measures based on theory and practice within our programs, some of the items are adapted from scales created by other researchers, and these are indicated in the table. Many of these measures are only comparable across grantees after conditioning on other variables, particularly population size and the number of potential partners in the geographical area of the grantee service locations. We will use population at the Census tract and county level from the American Community Survey, as well as the number of non-profits from the National Center for Charitable Statistics.

Table 1. Conceptual Dimensions, Survey Items, and Analytical Methods

Component of Engagement

Dimension

Grantee Survey Item

Service location Survey Item

Partner Survey Item

Analysis Method

Resident engagement

Targeting opportunities towards local residents*


1, 2, 5, 6 (as AmeriCorps members)

4d, 13 (as volunteers)

4b,f (in general)


Contingency tables with χ2 statistics, modes, and medians.


Confirmatory Factor Analysis to test construct

Resident engagement

Aspects of volunteer use, recruitment, management, and training


12, 14 (number and freq of vols)

15 (skill level of vols)

16, 17 (training)

18, 19, 20 (role)


Contingency tables with χ2 statistics, modes, and medians.


Partnership development

Aspects of partnership development


21, 22 (types of partners)

23 (number of partners)

24,25,26,27 (role of partners)

29, 30, 31 (role of AC members in partnerships)


Contingency tables with χ2 statistics, modes, and medians.


Partnership development

Stages of partnership/collaboration a*


26, 27

2 (partnership level)

6, 7, 8


Contingency tables with χ2 statistics, modes, and medians.


Confirmatory Factor Analysis to test construct

Overall

Stages of engagement b*

1,

3 (goals)

7, 8, 9, 10 (planning and input)

2

Contingency tables with χ2 statistics, modes, and medians.


Confirmatory Factor Analysis to test construct

Overall

Degree of Engagement*


1, 2 (AC members from community)

10 (community input)

12, 13, 18 (role/importance of community volunteers)

23, 25, 26, 27 (partnerships)

2 (partnership level)

3-5 (AC awareness and involvement)

7, 8, 10, 12

Contingency tables with χ2 statistics, modes, and medians.


Confirmatory Factor Analysis to test construct

Overall

Role of AmeriCorps


3, 4 (in general)

28-31 (partnerships)

3-5, 9, 10, 12 (AC awareness and involvement)

Contingency tables with χ2 statistics, modes, and medians.

Overall

Planning

1 (planning)

5, 6 (communication with and homogeneity of services among locations)

7-10


Contingency tables with χ2 statistics, modes, and medians.

*Items in this section represent scales or constructs to be assessed for validity (via factor analysis) and reliability.

a These items are based on research by Woodland, R. H., & Hutton, M. S. (2012). Evaluating Organizational Collaborations Suggested Entry Points and Strategies. American Journal of Evaluation, 33(3), 366–383; Thomson, A.M., Perry, J.L. & Miller, T.K. (2007). Conceptualizing and measuring collaboration. Journal of Public Administration Research and Theory, 19, 23-56; Granner, M.L. & Sharpe, P.A. (2004). Evaluating community coalition characteristics and functioning: A summary of measurement tools. Health Education Research and Practice, 19, 514-532.

b These items are based on research by Dugdale, A. et al. (2012). Developing and testing a tool for measuring capacity building. ACIAR Impact Assessment Series, No. 79, Canberra: Australian Centre for International Agricultural Research; Hawe, P. et al. (2000). Indicators to help with capacity building in health promotion. New South Wales Health Department; Hess, M. & Adams, D.J. (2006). New research instruments for government: Measuring community engagement. In Duke, C., Doyle, L. & Wilson, B. (Eds.) Making Knowledge Work: Sustaining Learning Communities and Regions. Leicester, UK: National Institute of Adult Continuing Education.


If these measures are found to adequately reflect the constructs, we will assess both the constructs and the measures; otherwise, we will assess only the measures. Similar questions are present on the questionnaires for grantees, service locations, and community partners, which were included to triangulate the measurement. It is not expected that all three respondent groups will give the same answers; each has a unique perspective that is limited by its own familiarity with, and visibility over, the work AmeriCorps members are doing. However, we do expect some measure of positive correlation, and the extent to which this correlation differs systematically by type of grantee may indicate important differences in community engagement practices. We recognize that there will be confounding factors, and we will assess the reliability of the measures across respondent populations.


Descriptive statistics will be calculated for the measures and constructs related to resident engagement and partnership development, including measures of central tendency (means, medians, modes), variance, and skew, and we will plot histograms to examine their distributions. These results will be used to describe the extent to which grantees are focusing on different community engagement methods.

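A minimal sketch of these descriptive summaries, applied to a hypothetical construct score (the column name and data are placeholders):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)

# Hypothetical grantee-level construct scores (e.g., a resident engagement scale).
scores = pd.Series(rng.normal(3.0, 0.8, 300), name="resident_engagement")

# Central tendency, spread, and skew.
print(f"mean   = {scores.mean():.2f}")
print(f"median = {scores.median():.2f}")
print(f"mode   = {scores.round(1).mode().iloc[0]:.1f}")  # mode of rounded scores
print(f"var    = {scores.var():.2f}")
print(f"skew   = {scores.skew():.2f}")

# Histogram to examine the distribution.
scores.hist(bins=20)
plt.xlabel("Resident engagement score")
plt.ylabel("Number of grantees")
plt.savefig("resident_engagement_hist.png")
```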

Question 4: What factors are related to grantee focus on community engagement? Specifically:

  1. Grantee-level factors: service focus area, grantee size (grant $ amount/MSY, total organizational budget), grantee history (years CNCS grantee; years in operation).

  2. Community-level factors: non-profit density, ethnic/racial makeup, education level, income levels, population.



To address this question, we will use contingency tables as well as regression models. Given the large number of survey items, we expect to focus the analysis for question 4 on the key survey items and constructs in each dimension, and explore further items if substantive differences are found.



For bivariate relationships between an item and a categorical variable, such as service focus area, we will use contingency tables with χ2 statistics to test for statistical significance. For continuous variables, and for models with multiple variables, we will use standard regression models, with the specific model depending on the type of dependent variable.

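A minimal sketch of this model choice, using a hypothetical analysis file (variable names are placeholders, and the production models would incorporate survey weights):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical grantee-level analysis file merging survey responses with
# administrative and community covariates.
df = pd.DataFrame({
    "engagement_score": rng.normal(3.0, 1.0, 200),   # continuous construct score
    "engages_residents": rng.integers(0, 2, 200),    # binary survey item (0/1)
    "grant_size": rng.normal(5.0, 1.0, 200),         # grant amount, in $100,000s
    "nonprofit_density": rng.normal(5.0, 2.0, 200),  # non-profits per 10,000 residents
})

# Continuous dependent variable: ordinary least squares.
ols_fit = smf.ols("engagement_score ~ grant_size + nonprofit_density", data=df).fit()

# Binary dependent variable: logistic regression.
logit_fit = smf.logit("engages_residents ~ grant_size + nonprofit_density", data=df).fit()

print(ols_fit.summary())
print(logit_fit.summary())
```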


In addition to the survey data, we have collected administrative data on grantees, and we are also collecting data from other government sources on the characteristics of grantees as well as the communities where members are serving. These additional data sources are listed in Table 2.



Table 2. Additional Data Sources

Data Source | Example Variables
CNCS Data Warehouse | Grantee characteristics, e.g., focus area, geographical location, grant size, number of members
US Census/American Community Survey | Demographics of communities served, e.g., population, racial/ethnic diversity, median income, education levels
National Center for Charitable Statistics (IRS 990 database) | Grantee tax filings, including revenue and income
Community Health Status (CDC) | Health patterns in counties, e.g., life expectancy, doctor shortages
Common Core of Data (Dept. of Education) | School statistics, e.g., free/reduced-price lunch, Title I status, receipt of School Improvement Grant/Persistently Low Achieving status
BLS | Unemployment rates
Integrated Postsecondary Education Data System (Dept. of Education) | College statistics, e.g., enrollment rates


We expect the results from the analysis for Question 4 to identify key factors correlating with different levels and types of community engagement. Reporting of results will emphasize that these are not causal findings, but descriptive and exploratory. Should the measures be found to be valid and useful, and findings in this question substantive and informative, we expect this step to be useful in two areas. First, the results will assist our program offices in better understanding their portfolio and the types of grantees that systematically conduct more community engagement. Second, it can provide direction for further research into community engagement.



A.16.2 Publication Plans

The study will result in an internal report documenting the feasibility, utility, and effectiveness of this type of study, and the validity and reliability of the instruments. It will also report on the results of the data collection to the extent that they are found to be reliable and valid. If the study has sufficient validity, reliability, and utility, results will be disseminated through peer-reviewed publications and presentations at professional conferences. While this study does not have the potential to identify causal impacts of AmeriCorps programs on community engagement, or to validate that community engagement leads to the expected outputs and outcomes, the research does have utility for the broader field of research in community engagement, social capital, and civic infrastructure and capacity. As stated above, there are numerous instruments to measure community engagement; most have not been validated, nor have they been applied on the scale that we propose for the instruments in this study. If our instruments are found to be valid and reliable, and if we are able to use them to adequately present a picture of the broad ways AmeriCorps grantees engage their communities and to identify organizational characteristics related to different levels and types of engagement, this would provide a valuable contribution. Manuscripts and presentations will clearly state the limitations of the study findings, including the limited generalizability of the specific results given the research methods.

A.16.3 Project Timeline

The project timeline is shown in Exhibit 3 below.

Exhibit 3: Project Timeline

Data Collection and Analysis | Timeframes
Administer Survey |
  Grantee Survey | September 2014
  Service Location Survey | October 2014
  Partner Survey | November 2014
Data Analysis |
  Data cleaning and preparation of data analysis file | December 2014
  Data analysis | December 2014 through January 2015
  Preparation of manuscript for publication | January 2015 through March 2015


A.17 Display of OMB Expiration Date

CNCS does not seek this exemption.

The OMB expiration date will be displayed on the introductory page of the survey instrument.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

CNCS is not requesting an exception to the certification requirements.

List of Attachments

Attachment A: Grantee Survey

Attachment B: Service Location Survey

Attachment C: Partner Survey

Attachment D: Federal Register Notices

Attachment E: Technical Consultants

Attachment F: Survey Communication Materials

1 See e.g. Pope, J. (2011). Indicators of community strength in Victoria: Framework and evidence. Department of Planning and Community Development. Melbourne, AU; Liberato, S.C. et al. (2011). Measuring capacity building in communities: A review of the literature. BMC Public Health, 11: 850; Mayer, S.E. (2002). Building community capacity: How different groups contribute. Effective Communities Project; Putnam, R.D. (2001). Social capital: Measurement and consequences. Isuma: Canadian Journal of Policy Research, 2, 41-51.

2Hess, M. & Adams, D.J. (2006). New research instruments for government: Measuring community engagement. In Duke, C., Doyle, L. & Wilson, B. (Eds.) Making Knowledge Work: Sustaining Learning Communities and Regions. Leicester, UK: National Institute of Adult Continuing Education.


3Benjamini, Y., & Hochberg, Y. (1995). Controlling the False Discovery Rate: A Practical and Powerful Approach to Multiple Testing. Journal of the Royal Statistical Society. Series B (Methodological), 57(1), 289–300.

4Bender, R., & Lange, S. (2001). Adjusting for multiple testing—when and how? Journal of Clinical Epidemiology, 54(4), 343–349.

5Meredith, W. (1993). Measurement invariance, factor analysis and factorial invariance. Psychometrika, 58(4), 525–543.

6McDonald, R. P. (1999). Test Theory: A Unified Treatment. New York: Psychology Press.


