OMB Control No.: 3137-0078

Institute of Museum and Library Services

The Study of Free Access to Computers and the Internet in Public Libraries

Supporting Statement for PRA Submission

ICR Reference No. 200901-3137-001

A. Justification

A1. Need and Legal Basis

The advent of the Internet and computer technology has radically changed the way people live. Public libraries have been at the forefront of championing digital inclusion through partnerships with the Bill & Melinda Gates Foundation, the IMLS, other national organizations, and their own communities. As a result, virtually every library in the U.S. provides public access computing to its community. This includes not only digital resources, databases, and networked and virtual services, but also the training, technical assistance, and technology-knowledgeable staff that support users. These technologies “range from basic services (computers, Internet, and online catalogues) to sophisticated, interconnected technologies that bring digital resources and virtual services to patrons” (IMLS, 2007).

While these services are widely available, past research has produced little evidence of a relationship between public access computers (PAC) and community benefits. Past decision-making regarding public access computer services has been based on such measures as the number of users/sessions, the length of time computers are in use, the number of unfilled requests, and the results of satisfaction surveys (e.g., Jaeger, Bertot, & McClure, 2007). In many ways, these measures are reminiscent of those lamented by Zweizig and Dervin (1977) as inadequate for understanding library services in communities. To recognize and communicate the value that free access to public access computers in libraries brings to individuals, their families, and their communities, better methods are needed: methods that ask different questions and yield robust data that can empower library decision-makers, especially in efforts to guide social policy.

The Study of Free Access to Computers and the Internet in Public Libraries will seek to fill the gaps in previous research in this area by answering the following questions:

  1. What are the demographics of people who use computers, the Internet, and related services in public libraries?

  2. What information and resources provided by free access to computers, the Internet, and related services in public libraries are people using, across the spectrum of on-site and off-site use?

  3. How do individuals, families, and communities benefit (with a focus on social, economic, personal, and professional well-being) from free access to computers, the Internet, and related services at public libraries?

  4. What reliable indicators can measure the social, economic, personal, and/or professional wellbeing of individuals, families, and communities that result from access to computers, the Internet, and related services at public libraries?

  5. What correlations can be made between the benefits obtained through access to computers and the Internet and a range of demographic variables?

  6. What computer and Internet services and resources are lacking at public libraries that, if provided, could bring about greater benefit?

  7. Where free access to computers and the Internet is weak or absent, what indicators of a negative effect on users’ social, economic, personal, and/or professional quality of life can be identified?

Since little previous work has been done on the use of PAC, this study is meant to provide a starting point for exploring these questions and to develop a framework for outcomes-oriented performance measurement based on broadly applicable indicators of how the public benefits from PAC resources in public libraries. A recent report issued by the Urban Institute (Lampkin et al., 2006) serves as a model for our approach to developing and validating these indicators. Figure 1, taken from this report, shows that developing solid indicators for areas of interest to an organization’s work is a multi-stage process, and we feel that this study is at the very beginning of that process. Until we have data showing that particular indicators are worthwhile pointers to contribution in specific policy domains, there is little that can be done to move beyond the anecdotal approach that is currently the best available for discussing the use of PAC.


Figure 1: Outcome sequence chart—creating a common framework for measuring performance


To develop and validate specific indicators to be used for performance evaluation, we will use a mixed methods approach to gather and analyze data related to PAC outcomes. The collection will take place in two concurrent phases: a nationwide representative telephone survey; and case studies in four US public libraries. This approach will generate generalizable quantitative data on the extent and distribution of PAC use and contextual data that will provide an enriched understanding of how users use public access computing and the role it fills in their everyday information environment, as well as information on how library policies and available resources affect the extent of outcomes. Table 1 presents the five working hypotheses that informed the development of these research methods and corresponding survey instruments.

Table 1: Hypotheses Crosswalk

Hypothesis 1: There are demographic characteristics that define users of computers, the Internet, and related services in public libraries. These characteristics may be used to better understand how computers, the Internet, and related services can inform the development of library services to meet the needs of user populations.
Survey questions: Demographic questions Q13 through Q26

Hypothesis 2: The availability of free access to computers, the Internet, and related services at public libraries can provide social and economic benefits to individuals, families, and communities which enhance personal and professional quality of life.
Survey questions: Domain questions M1 through D6; case study interviews

Hypothesis 3: The social, economic, personal, and/or professional benefits related to providing free access to computers, the Internet, and related services at public libraries can be measured at the individual, family, and community levels.
Survey questions: Domain questions M1 through D6

Hypothesis 4: Correlations can be made between the use of computers and the Internet and a range of demographic variables. Correlations can also be made between the type, level, or volume of related services and measured use.
Survey questions: Demographic questions Q13 through Q26; domain questions M1 through D6

Hypothesis 5: Computer and Internet services and resources are lacking at public libraries that, if available, could bring about greater benefit. Indicators of a negative influence on users can be identified where free access to computers and the Internet is weak or absent.
Survey questions: Case study interviews; analysis of correlations between demographic questions Q13 through Q26 and domain questions M1 through D6



For hypotheses 2 through 5, six indicator domains were identified as potentially valuable because of their applicability to the research questions guiding this study and their interest to policymakers (Table 2). The indicators themselves are representative of the kinds of activities that have been shown to contribute to changes in the areas identified within the domain. We do not expect to show causality, but would hope to show contribution, as described in Van Den Berg’s recent exploration of this area: “…in the case of results at the level of society, the public debate should move from the concept of linear causality to the concepts of conditionalities (necessary but not sufficient conditions for change to occur). Furthermore, it should be made clear that these necessary but not sufficient conditions contribute to rather than cause the change to take place” (2005).


Table 2: Indicator domains and definitions


Civic engagement: Individual and collective actions using public access computers, designed to identify and address issues of public concern, including efforts to work with others in a community to solve a problem or interact with the institutions of representative democracy.

eCommerce/eBusiness: The act of buying and selling goods and/or services using the Internet through a public access computer; engaging in activities through PAC that help consumers gain better knowledge of products and services.

Education: The use of public access computers to help gain knowledge and interact with services related to early childhood education, K-12, colleges or universities, graduate schools, adult education, and continuing education.

eGovernment: The use of public access computers to make use of online government services and gain information and assistance for legal and regulatory questions.

Health: Seeking information or transacting business related to individual or family health care and associated issues through public access computers.

Employment: Using the Internet and public access computers to seek work, gain job-related skills or information through training or research, and produce documents such as resumes or other job-seeking materials.

Social inclusion: The use of public access computers to pursue personal or socially meaningful ends, including the use of websites which connect users to others directly and synchronously, such as IM, chatrooms, and chat-enabled games; directly and asynchronously, such as Facebook, MySpace, message boards, and personal email; or indirectly, such as through user-posted content on sites such as YouTube or personal blogs.






The telephone survey and case study interview guides were developed in consultation with a committee of experts from library science, as well as from the educational, health, nonprofit, governmental, and business sectors. The advisory committee was convened early in the development of our instruments, and we used its direct feedback to develop and refine the policy domains selected for inclusion in the surveys, as well as to review the individual survey questions over a six-month period. Members of the expert committee include:

  • Rick Ashton, Chief Operating Officer, Urban Libraries Council

  • Michael Barndt, Data Center Analyst, Nonprofit Center of Milwaukee

  • Susan Benton, Strategic Partners Executive, International City/County Management Association (ICMA)

  • John Carlo Bertot, Professor and Associate Director, Information Use Management & Policy Institute, Florida State University

  • Cathy Burroughs, Associate Director, National Network of Libraries of Medicine

  • Sarah Earl, Acting Director, International Development Research Center Evaluation Unit

  • Wilma Goldstein, Senior Advisor for Women’s Issues, Small Business Administration

  • Jaime Greene, Program Officer, Bill & Melinda Gates Foundation

  • Carla Hayden, Executive Director, Enoch Pratt Free Library

  • Peggy Rudd, Director and Librarian, Texas State Library and Archives Commission

  • Ross Todd, Associate Professor and Director, Center for International Scholarship in School Libraries, Rutgers University

  • Bernard Vavrek, Director, Center for the Study of Rural Librarianship, Clarion University of Pennsylvania

In addition, we used the expertise of Glen and Leslie Holt, both well-respected practitioners and researchers in the library community, as consultants to review our work to make sure it was meaningful within the library context. Several already-established indicator sets were used as a starting point for identifying specific questions related to the research domains (Table 3).


Table 3: Initial Indicator Sets Used to Develop Survey Instruments


  • Urban Institute, National Neighborhood Indicators Project (1996). Topics: education; economic development; health literacy; immigrant acculturation. URL: http://www2.urban.org/nnip/

  • World Bank: Social Capital Initiative (1997) and World Values Survey (2005). Topics: economic development; social connectedness. URLs: http://web.worldbank.org/WBSITE/EXTERNAL/TOPICS/EXTSOCIALDEVELOPMENT/EXTTSOCIALCAPITAL/0,,contentMDK:20193049~menuPK:994384~pagePK:148956~piPK:216618~theSitePK:401015,00.html and http://www.worldvaluessurvey.org/

  • Search Institute (2007). Topics: education; social connectedness. URL: http://www.search-institute.org/assets/

  • Childstats.gov (2007). Topics: health; economic development; literacy. URL: www.childstats.gov

  • City of Seattle, Information Technology Indicators Project (2004). Topics: education; economic development; social connectedness. URL: http://www.seattle.gov/tech/indicators/




The questions were further developed to specifically gather metrics about how users are interacting with PAC, and what sorts of activities they are actually accomplishing within the research domains. Our questions go beyond asking whether an activity took place to asking whether a concrete result occurred to which that activity may have contributed, moving from an output measure to an outcome measure.

To link these indicators closely to the work conducted to date on PAC in libraries, we have also designed our instruments to allow cross-correlation with data collected in such work as the Pew Internet & American Life Project studies (Estabrook, Witt & Rainie, 2007), NCES data (Chute & Kroe, 2007), and the work conducted at Florida State University (Bertot, McClure, Jaeger & Ryan, 2006). Where possible, we have used previously validated questions from US Census studies. We believe that this integration will provide the missing link in the current picture, and allow more direct exploration of the true usage of PAC in libraries and the communities and populations they serve.

The following example, drawn from the area of economic development, demonstrates how the project will develop and validate indicators using the mixed methods approach. Table 4 presents a list of indicators related to the concept of economic development derived from projects conducted by the Urban Institute, the World Bank, Gates Foundation CAT programs, Casey Family Programs, and the City of Seattle.


Table 4: Indicators related to economic development


  • Opportunities to upgrade job skills

  • Learn software programs

  • Employment and finding work

  • Learn how to write a resume

  • Career planning skills

  • Work life skills

  • New business creation

  • Marketing opportunities for small business

  • Increased bandwidth and infrastructure for low-income business

  • Government services available to employers/employees

  • Money management skills




Table 5 shows how two of those indicators can be expressed as variables on multiple instruments. This is the process used to move from the indicator sets to the instruments that will be used in the two data collection methods of the study.


Table 5: Examples of potential variables for economic development indicators


Telephone survey

  Opportunities to upgrade job skills (example questions):
    Have you ever attended a class or workshop at the library to learn computer skills for work-related activities? YES / NO
    [If YES] Did you learn any of the following skills?
      • How to create a resume
      • How to create a spreadsheet
      • How to create a webpage
      • Other:
  Potential indicators: percent of users using PAC for job-related training; percent of users gaining job-related skills through PAC.

  Employment and finding work (example questions):
    Have you ever used a computer at the library to search for job openings?
    Have you ever filled out a job application online at the library?
    Have you ever used a computer at the library to create a resume?
  Potential indicators: percent of users searching for jobs through PAC; percent of users filing job applications through PAC; percent of users creating resumes through PAC.

User interviews and focus groups

  Opportunities to upgrade job skills (example questions):
    Tell me an example of something you have learned about computers or the Internet from being here at the library.
    How did this help you at work? (Or, how do you think it will help?)
  Potential indicator: users gaining job-related skills through PAC.

  Employment and finding work (example questions):
    Have any of you used computers in the library to help you find a job?
    Tell us about how you used the computer during that job search.
    How did it help you? What else could have helped?
  Potential indicator: users searching for jobs through PAC.

Public library staff interviews and focus groups

  Opportunities to upgrade job skills (example questions):
    What types of computer training or instruction to assist in development of work-related skills do you offer?
    How do you publicize these opportunities?
  (Provides insight into missing indicators and examples of practices supporting high-value indicators.)

  Employment and finding work (example questions):
    How have you assisted people who were looking for work?
    What barriers do people encounter when using PAC to find jobs?
  (Provides insight into missing indicators and examples of practices supporting high-value indicators.)

Local agency and government staff interviews

  Opportunities to upgrade job skills (example questions):
    Do you refer people to the library to learn computer skills to enhance their job skills? Under what circumstances? How often? How does this affect your workload? What kind of feedback do you receive from the people you have referred?
  (Provides insight into missing indicators and examples of practices supporting high-value indicators.)

  Employment and finding work (example questions):
    Do you refer people to the library to use the computers or get access to the Internet when they are looking for work? How often? What has been their experience?
    Do you offer services through the library that help people to find work?
  (Provides insight into missing indicators and examples of practices supporting high-value indicators.)



Validating the indicators generated through this data collection effort requires contextualizing the complex relationship between public access computing and measurements of change in individuals, families and communities that may be influenced by experiences with PAC. Our approach to validation is based on the use of a situated logic model, which places the indicators related to public access computing in the larger policy context surrounding the activity (Naumer, 2009). The situated logic model provides a framework which allows us to relate relatively simple indicators to relevant policy efforts, through bridging between the outputs or outcomes related to PAC being measured and the broader changes measured in policy work.

An example of the situated logic model approach is presented in Figure 2. This diagram shows how public access computing is situated within the context of workforce development. This approach recognizes that public access computing outputs, such as providing access to technology, are the basis for activities within a workforce development program. In this example, online job training is an activity that workforce development programs use to train workers. This activity is enabled by the access and technology skills provided through public access computing. Therefore, an output of public access computing is to provide the access and skills necessary to benefit from online job training. The result of this activity may be that participants in the program qualify for a new set of jobs. The workforce development output may then be considered an outcome for public access computing centers. The workforce development program outcome of placing a participant in a new job may be considered an indicator for the use of public access computing. Additionally, the aggregated results of placing more qualified workers in the workforce to create an improved workforce may, by proxy, be considered a result of public access computing.


Figure 2: Situated logic model example





The mixed method approach employed in this study is particularly suited to validating outcome indicators, as well as identifying additional areas for further research. The results from the telephone survey will be examined in light of our qualitative case study data, not necessarily from the “staff” perspective, but from the “user” perspective and the “funder” or “stakeholder” perspectives. One of the strengths we gain by including these qualitative explorations of users’ and stakeholders’ own estimations of the influence on themselves and their communities is a better grasp of the conditionalities and contextual influences that might be important to consider when the survey results are interpreted.

An example of how the two approaches will help us to understand different areas of PAC and the ways that the results will inform the development of indicators related to intermediate and end outcomes is shown in Figure 3.


Figure 3: Logic model connected with research methods






As seen in Figure 3, interviews conducted with policymakers and library staff in conjunction with the case studies will provide a link between inputs, such as funding and the number of PAC terminals, and outputs or activities, such as the number of user sessions. Although statistical data are collected with regard to some PAC inputs and outputs, the context which informs resource allocation policy decisions is largely absent from the literature. This information is important for understanding how the indicators can be used to more effectively accomplish program outcomes while meeting policy objectives.

The telephone survey links user activities to intermediate outcomes. Used in this way, the telephone survey will mostly quantify user activities (at a level at which, given our sample size, we can expect to see statistical significance). To a somewhat lesser extent, it will provide statistical evidence of intermediate outcomes, especially those related to more common activities, such as the link between using email and keeping in touch with friends or family. We do not expect that the telephone survey will be able to reliably find indicators of end outcomes such as social inclusion. The telephone survey will also not be able to link library inputs to activities or intermediate outcomes.

The case study interviews and focus groups link user activities to intermediate and end outcomes. The case studies are particularly useful for identifying evidence of end outcomes which are subjective (e.g. quality of life). The case studies also aid in the confirmation and comprehension of quantitative associations, capturing levels of subjective importance, and revealing other questions that should be asked in subsequent evaluations. More detail on the selection of cases and interview subjects is available in Part B of this submission.

Clearly, additional work will be needed to validate these insights, but without this initial investigation, there will be no place to start.

A2. Purpose and Use of Collected Information

As explained in Section A1, the purpose of this data collection and analysis is: (a) to develop robust and broadly applicable social, economic, educational, and health indicators for the use of free access to computers, the Internet and related services in public libraries on individuals, families and communities; and (b) to apply those indicators to validate their robustness and document positive and/or negative results from the presence or absence of key public access computer resources and services in public libraries.

The IMLS will use this study to fulfill its statutory mandate to analyze trends and share best practices to improve library services. The study results and performance indicators will be disseminated to grantees and potential applicants in order to improve future grant applications and as tools to enhance program development at the local level. A recent analysis of the agency's Library Grants to States Program shows that libraries throughout the nation use federal funds to develop technology infrastructure to support a range of activities that strengthen communities such as providing literacy programming for adults and children, offering homework help, and purchasing access to databases and indexing services that cover a wide range of substantive areas. The instruments developed through this research project could be used by state and local libraries to assess user need and make better resource allocation decisions, thereby making better use of public investments.

The information will also help the agency develop tailored technical assistance modules on research and program evaluation. IMLS convenes an annual technical assistance training conference for state-level administrators of the Library Grants to States program and for state data coordinators, and develops asynchronous web training on program evaluation for the library community. These convenings provide library administrators and program development staff with tools for program evaluation and monitoring in their home states. The results of this study, which focuses more directly on the user experience with PAC in libraries, will be used to highlight the importance of user-centered outcome metrics in the evaluation of library service, rather than administrative input and output metrics such as circulation counts, the number of PAC user sessions, etc.

The Institute plans to promote study findings widely among library professionals and researchers in other fields and to encourage future studies through its National Leadership Grant program. Findings regarding areas of use will provide information needed to develop more strategic partnerships with other national organizations. For example, use of library PAC resources for employment and health information services would be noteworthy to other government agencies such as the Department of Labor, which sponsors One-Stop Employment Centers, and the Centers for Disease Control and Prevention, which invests in health information campaigns across the country.

The information collected through the course of this study will also be relevant to schools offering programs in library and information science, where many library professionals obtain their education. This study has implications for segments of the national public who have limited access to the Internet and other information resources. Overall, the information collected for this study will help libraries and public access computing centers provide better service and information resources to the public.

A3. Use of Technology in Collection of Information

Data collection will take place in two stages: (1) a national telephone survey; and (2) case studies of 4 US public libraries.

The telephone survey will be conducted by Telephone Contact, Inc. (TCI). The survey will employ a dual-frame probability sample of households that combines a list-assisted random digit dialing (RDD) sample with a cell phone exchange sample. The objective of the dual-frame design is to increase the overall coverage of US households in the survey, because cell-phone-only households represented roughly 1 in 6 US households in 2007 [1] (Blumberg & Luke, 2008). The overall goal is to complete 1,130 interviews with users of library public access computing, with approximately:

  • 890 of these respondents coming from the RDD sample frame,

  • 160 respondents from the cell phone sample, and

  • 80 from a non-response follow-up sample.

The RDD sample will also include an oversampling of telephone exchanges from low income areas in order to increase the number of interviews conducted with low income respondents.

Telephone interviewers will record survey responses via a Computer Assisted Telephone Interviewing (CATI) system that not only facilitates the administration of the screening and survey questions but streamlines the data collection process as well. The survey will require human interaction between respondents and interviewers.

The telephone survey will be an invaluable data source for developing estimates of usage rates among various important subpopulations (e.g., socio-demographic groups such as race, ethnicity, age groups, income levels, education levels, etc.). Generating such a portrait would be a major contribution to the library community and to policymakers. The primary difficulties with relying solely on the telephone survey for information about public access computer users are the low prevalence of users of public access computers in libraries (Glander & Dam, 2006) [2] and the difficulty in gauging the public access computer resources available to respondents.

For the four case studies, interviews and focus groups will be conducted with library users, public library staff, staff of peer agencies that might refer users to library PAC, and staff of community agencies or local businesses that also provide public access computing resources. Data will be recorded by the researchers using digital audio recorders and software on portable laptop computers.

A4. Duplication of Other Information

The information that will be collected for this study is unique, and therefore does not duplicate other efforts. Over the course of an extensive literature search, the researchers could not find previously collected information that explored the use of public access computing in libraries in a systematic way.

The October 2002 Current Population Survey (CPS; Glander & Dam, 2006), while including two questions regarding PAC use in public libraries, did not ask respondents any specific questions about the purposes of their use, nor about the outcomes of that use. There has been no subsequent CPS survey that specifically addressed public library computer and Internet use.

The IMLS-funded 2007 Pew Internet & American Life Project study (Estabrook et al., 2007) looked at how people use the Internet, libraries, and government agencies to solve 10 common problems, all of which had a potential connection to the government or government-provided information. While this research did not specifically look at how people use PAC in public libraries for the purposes of solving problems, several of its key findings helped inform our research design and survey, including:

  1. For help with a variety of common problems, more people turn to the Internet than consult experts or family members to provide information and resources.

  2. Persons 18-30 years old are as likely as older adults to use libraries for help in solving problems and are also most likely to say they will use libraries in the future when they encounter problems.

  3. There was some variance in where people turned for help depending on the type of problem that people confronted.

  4. People with low access to the Internet are poorer, older, and less well-educated than those with broadband access at home or work.

The first finding identified the importance of Internet access for solving problems and determining why people might use public access computing resources for their needs. We developed our questions regarding PAC to explore this area more thoroughly across the multiple domains, and to probe into some of the specific areas where people might be using public access computing services to seek help.

The second finding contributed to our decision to specifically seek out young people for our study and to ensure that we were also able to capture those aged 14-18. The Pew study did not include persons under the age of 18, although research on minors shows distinctions between age segments within that group; for example, early, mid, and late teens differ from one another (Meyers, Fisher & Marcoux, 2007). To ensure that we gather insights into the 14-17 year old population, we will be relying heavily on the case studies.

The variance in where people turn for help, depending on the type of problem with which they are confronted, helped to inform the structure of our surveys and interview guides, and specifically led us to ask about lay information mediary behavior (LIMB). LIMB occurs when a friend or family member finds information for another person, and it is important for understanding the boundaries of use beyond the direct use of public access computing (Abrahamson, Fisher, A. Turner, Durrance & T. Turner, 2008); questions about this type of behavior are asked in several places in the telephone survey and will also be a focus of case study interviews.

The fourth finding is one driver behind the inclusion of demographic questions which will allow us to correlate population characteristics with PAC, and also informed our inclusion of questions related to other access means available to individuals using public access services in libraries.

The March 2008 InterConnections study on the use of libraries, museums, and the Internet surveyed public library users with regard to three types of use: (1) remote online visits to public library services; (2) in-person online visits during which a library-provided workstation was used to access the Internet, online resources, or services; and (3) other in-person visits, excluding in-person online visits, but including all other in-library services and off-line use of library workstations (Griffiths & King, 2008). Although this study has some areas of overlap with the study proposed here, the relevant sections were largely focused on the services used during in-person online visits and did not inquire as to specific purposes of use that could be related to specific types of social, economic, personal, or professional changes in a user’s life, as is conceptualized in the current study. Further, the purpose of the InterConnections study was largely descriptive, while the current study aims to create a framework for outcome-based performance evaluation.

Demographic questions on race and ethnicity are in compliance with OMB standards. Other demographic questions were modeled on US Census instruments, including questions on employment, housing, and language.

In sum, no studies have been previously undertaken that seek to develop indicators that can measure the social, economic, personal, and/or professional wellbeing of individuals, families, and communities that result from access to computers, the Internet, and related services at public libraries (Crandall & Fisher, 2007). Most tangentially related indicators examine broader changes in behavior that may or may not be a result of technology interventions; few studies address the relationship of technology to changes in these indicators.

A5. How Collection Impacts Small Entities

Case studies will involve analyzing the stakeholder landscape at 4 public libraries. The Chief Officers of State Library Agencies (COSLA) will provide guidance in selecting case study sites that represent different geographic regions, demographic concentrations, levels of financial support for libraries, and best practices in public access computer administration; they will also help identify libraries that might have difficulty accommodating the research team during case study visits. Participating libraries will receive $200 to support their staff’s time for participation in interviews.

A6. Impact of No Collection of Information

Once the model has been formulated, we will have a reliable basis for selecting the most appropriate indicators to use for measuring changes in the desired outcomes of public access computing at the individual, community and societal level. The final outcome will be a consistent and logical framework that can be used for both future research efforts in this area and for direct application to the desired policy outcomes of this study—providing data that can be used to support advocacy and funding for public access computers in public libraries.

The results from this study will have implications for public policy governing the provision of public access computer resources, especially for those who otherwise have limited or no access to electronic resources. Delaying this collection, or not collecting the information at all, would prevent researchers from understanding the ways in which public access computers are used by citizens and from developing policies that are responsive to the needs of communities. The collection of this information is also warranted because the information this study proposes to collect is unique and will address gaps in the scholarly literature.

A7. Special Circumstances

There are no special circumstances that apply to this data collection.

A8. Federal Register Notice

A 60-day notice was published in the Federal Register on December 2, 2007 (vol. 72, no. 232, page 68199), and a 30-day notice was published in the Federal Register on August 6, 2008 (vol. 73, no. 152, page 45794), to solicit comments on the study to assess the impact of free access to computers, the Internet, and related services at public libraries on individuals, families, and communities, prior to submission of this OMB clearance request.

A9. Payment/Gift to Respondents

Two types of incentives are proposed for this study: (1) $20 cash to PAC users who participate in case study interviews or focus groups; and (2) $200 to libraries participating as case study sites. Each incentive type is discussed below.

Case study libraries

Libraries agreeing to participate as case study sites will need to exert a fairly high level of effort to accommodate the researchers over a five-day period. They will assume the burden of releasing librarians and library administrators to participate in interviews and focus groups, provide background materials, notify library patrons of the research prior to the site visits, and answer staff and patron questions about the research. In consideration of this unusual burden, we feel the proposed incentive is warranted.

Case study interviews

The Information School of the University of Washington has past experience using payments to compensate participants for their time and insight in qualitative studies. Recent examples include the 2005 National Science Foundation-funded “Talking with you” study, which paid stay-at-home mothers $30 for taking part in 2-3 interviews (Fisher & Turner, 2005), and the 2007 Community Technology Centers evaluation study, which paid users $10 for a 20-minute interview (Crandall & Fisher, 2007). In both cases, payments were handed out equitably to all study participants, and, per human subjects guidelines, the amounts were small and viewed as non-coercive to people deciding whether or not to participate. The current Study of Free Access to Computers and the Internet in Public Libraries follows the University of Washington’s established practice of offering small, non-coercive compensation to participants for a 30-minute interview or 60-minute focus group session.

A10. Assurance of Confidentiality

Respondents to the telephone survey and participants in focus groups and interviews will be advised that the reports prepared for this study will summarize findings and will not associate responses with a specific individual, and that identifiable information will not be provided to anyone outside the research team, except as required by law. They will be provided with information about the benefits of participation at the beginning of the survey or interview. The research procedures have been reviewed by the University of Washington Institutional Review Board and comply with federal regulations regarding the protection of human subjects participating in academic research. Subjects will be at minimal or no risk of suffering stress, embarrassment, or discomfort from this study. Children under the age of 14 are not competent to give legal assent and are therefore ineligible to participate.

A11. Justification of Sensitive Questions

While the goal of the national telephone survey and the on-site interviews and focus groups is to identify, in very general terms, the type of information people access while using public access computers, there is a small subset of questions that ask respondents to report socio-demographic characteristics. Although this information may potentially be interpreted as sensitive, it is important for the agency to gather this data to determine whether and how these social characteristics are correlated with certain types of public access computer use. Confidentiality assurances will be given to all respondents and data will be secured in accordance with accepted social science practice (see section A10).

A12. Hour Burden for Collection of Information

The information requested in the telephone survey and the case study data collection efforts has been held to an absolute minimum required to answer the research questions and minimize the burden on the respondents and cooperating libraries. All respondents will be asked to provide demographic data to control for factors such as income and education level and assess the degree of age, gender, and ethnic diversity in the sample.

The estimated burden for the telephone survey is about 1,450 hours and $35,134 [3]. This is based on an average 15-minute survey completion time for each of the 1,130 public library computer users and an average of 2 minutes to screen 38,563 households (based on 30,637 from the RDD frame, 6,426 from the cell phone frame, and 4,500 from the nonresponse follow-up) to find these individual respondents. In addition, the estimated burden for telephone survey pretesting is about 60 hours and $1,454. This is based on an average 60-minute cognitive interview with 9 participants, a 15-minute survey completion time for 40 public library computer users, and an average of 2 minutes to screen 1,377 households to find these individual respondents.

38,563 HH x 0.03 hours = 1,157 hours

1,130 participants x 0.25 hours = 283 hours

1157 + 283 = 1,450 hours x $24.23 = $35,134

9 pretest participants x 1.0 hours = 9 hours

1,377 HH x 0.03 hours = 41 hours

40 participants x 0.25 hours = 10 hours

9 + 41 + 10 = 60 hours x $24.23 = $1,454
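
For illustration only, the following short Python sketch (not part of the submission; the variable names are ours) reproduces the hourly-rate conversion described in footnote [3] and the pretest burden arithmetic above.

    # Sketch of the burden arithmetic for the telephone survey pretest.
    # Hourly rate per footnote [3]: 2006 median household income spread over
    # 40 hours/week for 50 weeks/year.

    median_household_income = 48_451                      # 2006 median household income
    hourly_rate = median_household_income / (40 * 50)     # about $24.23

    pretest_hours = (
        9 * 1.0         # 9 cognitive interviews at 60 minutes each
        + 1_377 * 0.03  # screening 1,377 households at about 2 minutes each
        + 40 * 0.25     # 40 pretest surveys at 15 minutes each
    )

    pretest_cost = round(pretest_hours) * hourly_rate

    print(f"hourly rate: ${hourly_rate:.2f}")       # $24.23
    print(f"pretest hours: {pretest_hours:.1f}")    # about 60 hours
    print(f"pretest cost: ${pretest_cost:,.0f}")    # about $1,454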


The respondent burden estimate for the case study interviews and focus groups is based on an expected respondent pool of approximately 160 library users, 40 staff of public libraries, and 40 additional stakeholders from local agencies that have referred the public to libraries for public access computer use, businesses, and other community providers of public access Internet resources. We anticipate approximately 8 focus groups of 5 users each, with the remainder of the user, staff, and agency interviews being one-on-one. The estimated total hour burden for the case studies is 140 hours and $3,300, based on 30-minute one-on-one interviews and 60-minute focus groups. The estimated cost burden is $1,454 for individual library user interviews, $969 for focus groups, $491 [4] for public library staff, and $386 [5] for additional stakeholders.

120 users (one-on-one interviews) x 0.5 hours = 60 hours

60 hours x $24.23 = $1,454


40 library staff x 0.5 hours = 20 hours

20 hours x $24.53 = $491


40 users (focus group participants) x 1.0 hours = 40 hours

40 hours x $24.23 = $969


40 stakeholders x 0.5 hours = 20 hours

20 hours x $19.30 = $386
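
As a cross-check, the same arithmetic applied to the case study components, using the category-specific rates in footnotes [3] through [5], reproduces the totals above. This is an illustrative Python sketch, not part of the collection.

    # Illustrative check of the case study burden totals.
    # Hourly rates come from footnotes [3]-[5]; rounding follows the text above.

    burden = [
        # (respondents, hours each, hourly rate)
        (120, 0.5, 24.23),   # one-on-one user interviews
        (40,  0.5, 24.53),   # public library staff
        (40,  1.0, 24.23),   # focus group participants (8 groups of 5)
        (40,  0.5, 19.30),   # local agency and other stakeholders
    ]

    total_hours = sum(n * h for n, h, _ in burden)
    total_cost = sum(round(n * h * rate) for n, h, rate in burden)

    print(total_hours)          # 140.0 hours
    print(f"${total_cost:,}")   # $3,300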

A13. Total Annual Cost Burden for Collection of Information

There are no capital and start-up costs, or annual operation or maintenance or purchase of services costs to respondents.

A14. Annualized Cost to Federal Government

The estimated annualized cost to the federal government of this data collection effort is $449,056.

A15. Program or Burden Changes

This is a new data collection.

A16. Plans for Tabulation and Publication of Collected Information

This data collection effort will commence immediately after OMB approval. We request that we receive approval for a 5-month data collection period from OMB. The timetable for key activities, demonstrating this need, is shown in Table 6.

Table 6: Project timeline


Telephone survey

  Pretest 1 (weeks 1-3): select pretest respondents; conduct pretest; transcribe interviews; analyze results; revise survey.

  Pretest 2 (weeks 3-6): field the survey under field conditions; review tapes and code; analyze results; revise survey.

  Field telephone survey (weeks 6-16): prepare survey system with approved questionnaires; select sample and partition into replicates; release sample in replicates; track cell-phone and RDD samples separately; conduct non-response follow-up study.

  Process data files and develop frequencies (weeks 16-22): process survey data (ongoing throughout data collection); conduct nonresponse analyses (for weighting); develop analytic weights; run series of cross-tabulations for analysis.

Case studies

  Conduct case studies (weeks 1-12): schedule and conduct site visits and focus groups.

  Process and analyze qualitative data (weeks 12-20): transcribe interviews; code text; run code frequencies for analysis.

Final report

  Develop and disseminate final report (weeks 22-32): draft outline for final report; draft final report; submit final report and briefing materials; present findings to IMLS.



As discussed earlier in Section A.1, considerable work has already been done in developing the survey instruments through review with our advisory committee and independent consultants. Prior to fielding the instrument, however, we plan to conduct both cognitive testing in weeks 1-3 (pretest 1) and field testing in weeks 4-6 (pretest 2). The procedures for these tests are described more fully in Part B of this Supporting Statement. The results of the pretests, as well as modified instruments based on those results, will be summarized and provided to the OMB prior to initiation of the telephone survey in the field.

We anticipate the telephone survey will begin in late March, 2009, and take about 10 weeks to complete. The case studies will be conducted concurrently. Analysis of the telephone survey data will occur in June, 2009; analysis of the case study data will begin after the conclusion of the first field visit and continue through May, 2009.

The report of the survey findings will include the survey methodology, an assessment of data quality, a description of the sampling procedures, a discussion of problems encountered in administering the survey, and a calculation of standard errors and design effects for the key survey variables for all analytic domains. Principal quality and design parameters, such as screening and eligibility rates for the telephone survey and cooperation and overall response rates for all surveys, will be reported; the raw data counts used to calculate these parameters will be reported as well.

It is anticipated that a final report will be submitted to the IMLS by September, 2009. Copies of the report will be provided on the IMLS and University of Washington Information School websites after final approval and release by IMLS.

A16.1 Analysis plan

In order to identify key areas of public access computing used by individuals, families, and communities, a mixed method analytic approach, involving both quantitative and qualitative analyses, is warranted. Telephone survey data will be used to produce statistically generalizable findings, principally in the form of tabular analyses for the overall sample as well as for specific substantively important domains such as age groups, race/ethnicity groups, sex, household income groups, and geographic areas. The site visit interview and focus group data will complement the statistical survey findings and provide rich contextual information to help interpret analytic findings and stimulate policy-relevant insights. Administrative and program data, such as publicly available data on program support for public access computers by different government agencies and private foundations, will be obtained during site visits. This information will be used to provide context and clarification of the data collected.

The two data collection methods will yield a large amount of different types of evidence:

  • demographic data and responses to fixed-answer multiple choice questions that are easy to quantify,

  • short responses to open-ended questions that are easy to code, and

  • longer responses to in-depth interview questions and extensive field notes that will require iterative reading and multi-level coding.

Data analysis will comprise two phases: quantitative reporting of survey responses (in tabular form), and identification of themes in the qualitative analysis. The telephone survey data will provide a representative picture of the prevalence of different types of people using public access computers and how it benefits them. The case study data will provide a richer understanding of how users use public access computing and the role it fills in their everyday information environment, as well as a better understanding of how policy makers and funders use this information to make decisions about public access computing.

Stratified random sampling will increase the likelihood of achieving a representative sample. The survey data will also be weighted to match key census demographic control totals (by using the collection of screened households regardless of eligibility to participate in the PAC user survey). After taking into consideration analytic weighting and survey design effects, we expect that the sample size for the telephone survey (n = 38,563 screened households and 1,130 users) will result in a margin of error of about +/-3.6% with a confidence level of 95% [6].
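
The footnoted calculation can be reproduced as follows; this is an illustrative Python sketch using the assumptions stated in footnote [6] (a proportion near 50%, a design effect of 1.5, and n = 1,130).

    # Design-effect-adjusted margin of error for the PAC user survey
    # (reproduces the calculation in footnote [6]).
    import math

    z = 1.96       # 95% confidence level
    p = 0.5        # conservative proportion (maximizes the variance)
    deff = 1.5     # assumed design effect due to differential weighting
    n = 1_130      # nominal sample size of PAC user interviews

    moe = z * math.sqrt(deff * p * (1 - p) / n)
    print(f"margin of error: +/-{moe:.1%}")   # about +/-3.6%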

For the telephone survey, we will conduct data screening to check for data-entry errors and inconsistencies and to identify missing cases and any systematic bias. The final samples will be tested for internal reliability, collinearity, and intra-class correlation to assess the reliability of the operational variables and the validity of the findings. Initial analysis will consist of running descriptive statistics for all variables to identify the center and distribution within the population, and bivariate statistics (correlation and cross-tabulation) will be used to test for associations between variables. For variables where the investigators identify a possible causal relationship based on the qualitative evidence, we will conduct path analysis (multiple regression) to determine the proportion of variance that can be explained by the relationship.
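
As an illustration of the planned descriptive and bivariate steps, the following Python sketch shows a weighted cross-tabulation and a chi-square test of association. The file name and variable names (income_group, used_pac_for_jobs, weight) are hypothetical placeholders, not actual survey item names.

    # Illustrative sketch of descriptive and bivariate analyses for the survey data.
    import pandas as pd
    from scipy.stats import chi2_contingency

    df = pd.read_csv("telephone_survey.csv")   # hypothetical analysis file

    # Descriptive statistics for all variables.
    print(df.describe(include="all"))

    # Weighted cross-tabulation of PAC job-seeking use by income group.
    weighted_xtab = pd.crosstab(df["income_group"], df["used_pac_for_jobs"],
                                values=df["weight"], aggfunc="sum")
    print(weighted_xtab)

    # Unweighted chi-square test of association between the two variables.
    counts = pd.crosstab(df["income_group"], df["used_pac_for_jobs"])
    chi2, p_value, dof, _ = chi2_contingency(counts)
    print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.3f}")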

For the qualitative data, we will analyze data as they are collected, following an approach that will aid in identifying a range of responses for each indicator (J. Lofland & L. Lofland, 1995; Miles & Huberman, 1994). The coding schemes will reflect the data’s emergent themes and will be guided by the study’s logic model. A code book will be used to assign terms to all segments in the data that reflect particular concepts. After the final schemes are developed, tests of intercoder reliability will be conducted with independent coders and final adjustments will be made to the codes (cf. Krippendorff, 1980).
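
As an illustration of an intercoder reliability check, the Python sketch below computes percent agreement and Cohen's kappa for two hypothetical coders. The cited approach (Krippendorff, 1980) supports related coefficients such as Krippendorff's alpha; this simpler two-coder statistic is shown only as a stand-in, with made-up codes.

    # Illustrative two-coder agreement check for the qualitative code book.
    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        """Return (percent agreement, Cohen's kappa) for two equal-length code lists."""
        assert len(coder_a) == len(coder_b)
        n = len(coder_a)
        observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        expected = sum(freq_a[c] * freq_b[c]
                       for c in set(coder_a) | set(coder_b)) / (n * n)
        return observed, (observed - expected) / (1 - expected)

    # Hypothetical codes assigned to ten interview segments by two coders.
    coder_1 = ["employment", "health", "health", "egov", "education",
               "employment", "social", "egov", "health", "education"]
    coder_2 = ["employment", "health", "egov", "egov", "education",
               "employment", "social", "egov", "health", "social"]

    agreement, kappa = cohens_kappa(coder_1, coder_2)
    print(f"percent agreement: {agreement:.0%}, Cohen's kappa: {kappa:.2f}")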

To ensure trustworthiness (reliability and validity) of the qualitative data, we will use several measures (cf. Chatman, 1992; Lincoln & Guba, 1985). Reliability will be ensured through: (1) consistent note-taking, (2) exposure to multiple and different situations using triangulated methods, (3) comparing emerging themes with findings from related studies, (4) employing intracoder and intercoder checks, and (5) analyzing the data for incidents of observer effect. Validity will be assessed as follows:

  • Face validity: ask whether observations fit an expected or plausible frame of reference;

  • Criterion/internal validity (credibility): based on pre-testing instruments, rigorous note-taking methods, peer debriefing, and member checks or participant verification;

  • External validity: provide “thick description” and a comprehensive account of our methods so others can determine whether our findings can be compared with theirs;

  • Construct validity: examine data with respect to the public access computing outcome literature, models of public library use, and principles of information behavior.

Data from the case studies and telephone survey will also be analyzed in a mixed method framework to seek out areas of convergence, corroboration, and correspondence, as well as the discovery of divergent themes (cf. Greene, Caracelli, & Graham, 1989; Jick, 1979). The mixed method analysis will also help elaborate and enhance the quantitative data with contextual information important for identifying policy and resource variables that influence PAC user outcomes. This methodology will also help account for the particular strengths and biases of the quantitative and qualitative methods employed in the study and lead to greater construct validity.

As discussed more extensively in Section A.1, this study is exploratory in nature and is not intended to identify causality. It is a first step toward defining important indicators related to public access computing, and will be the foundation for later, more targeted work in this area. The mix of methods we are employing in this study is designed to provide a substantive quantitative picture of PAC usage in particular domains by specific populations, along with a rich set of qualitative insights into that use from both an agency and a user perspective. We believe that this will generate specific research questions that others can explore using the data collected in this study as a baseline reference.

A16.2 Data presentation

The preliminary report will include an executive summary, literature review, statement of methodology, analysis, report on findings, and recommendations (including descriptions of how the results can be used by practitioners to measure their public access computer services, improve the services offered, and understand how the complex array of decisions they make about public access computing work together to serve individual users and the community). The final report will include complete technical information: instruments, full data summaries, and detailed descriptions of the methodologies used. The synthesis will summarize the background and findings in a national context to be informative to a broad policy and planning audience. The results will also be disseminated through professional and academic conferences and journals.

A17. Expiration Date

The OMB approval number and expiration date will be displayed on all survey instruments and discussion guides.

A18. Certification Statement

There are no exceptions to the certification.

References

Abrahamson, J. A., Fisher, K. E., Turner, A. G., Durrance, J. C., & Turner, T. (2008) Lay information mediary behavior (LIMB) uncovered: How non-professionals seek health information for themselves and others online. Journal of the Medical Library Association, 96:4, 310-323.

Bertot, J., McClure, C. R., Jaeger, P. T., & Ryan, J. (2006). Public libraries and the Internet 2006: Results and Findings. Information Use Management and Policy Institute, College of Information, Florida State University.

Blumberg, S. J., & Luke, J. V. (2008). Wireless substitution: Early release of estimates from the National Health Interview Survey, July-December 2007. Washington, DC: National Center for Health Statistics, Centers for Disease Control and Prevention.

Chatman, E. A. (1992). The information world of retired women. New York: Greenwood Press.

Chute, A., & Kroe, P.E. (2007). Public Libraries in the United States: Fiscal Year 2005 (NCES 2008-301). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Crandall, M. D. & Fisher, K. E. (2007). Baseline Data Collection and Analysis for CT Support (work in progress). Funded by Bill & Melinda Gates Foundation under Grant # 42646.

Estabrook, L. S., Witt, G. E., & Rainie, H. (2007). Information searches that solve problems: How people use the internet, libraries, and government agencies when they need help. Washington, DC: Pew Internet & American Life Project. Available at http://www.pewinternet.org/pdfs/Pew_UI_LibrariesReport.pdf.

Fisher, K. E., & Turner, A. M. (2005). Stay-at-home mothers. Connecting Research and Practice: Special Populations. 5th Annual Research Symposium of the Special Interest Group on Information Needs, Seeking, and Use (SIG USE) of the American Society for Information Science and Technology. October 29, Charlotte, NC

Glander, M., & Dam, T. (2006). Households’ Use of Public and Other Types of Libraries: 2002 (NCES 2007-327). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubsearch.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11:3, 255-274.

Griffiths, J., & King, D. (2008). Public library survey results. InterConnections: The IMLS national study on the use of libraries, museums and the Internet. Washington, DC: Institute of Museum and Library Services. Retrieved from http://interconnectionsreport.org/reports/library_report_03_17.pdf.

IMLS. (2007). Program solicitation for a cooperative agreement to assess the impact of free access to computers and the internet at public libraries (Funding Opportunity No. GF-CA-07). Available at http://www.imls.gov/pdf/ComputerAccessStudyRFP.pdf

Jaeger, P. T., Bertot, J. C., & McClure, C. (2007). Public libraries and the Internet 2006: Issues, findings, and challenges. Public Libraries, 46 (5), 71-78.

Jick, T. (1979). Mixing qualitative and quantitative methods: triangulation in action. Administrative Science Quarterly, 24:4, 602-611.

Krippendorff, K. (1980). Content analysis: An introduction to its methodology. Newbury Park, CA: Sage

Lampkin, L. , Winkler, M., Kerlin, J., Hatry, H., Natenshon, D., Saul, J., et al. (2006). Building a common outcome framework to measure nonprofit performance. Washington, D.C.: Urban Institute. Available at http://www.urban.org/publications/411404.html.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Lofland, J., & Lofland, L. H. (1995). Analyzing social settings: A guide to qualitative observation and analysis. Belmont, CA: Wadsworth.

Meyers, E., Fisher, K. E., & Marcoux, E. (2007). Studying the everyday information behavior of tweens: Notes from the field. Library & Information Science Research, 29:3, 310-331.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage.

Naumer, C. (2009, forthcoming). Situated logic model: Using the model in the real world. In M. Crandall & K. E. Fisher (Eds.), Information and community technology: Identifying local and global impact. Medford, NJ: Information Today.

Van Den Berg, R. (2005). Results evaluation and impact assessment in development co-operation. Evaluation, 11, 27-36.

Zweizig, D., & Dervin, B. (1977). Public library use, users, uses: Advances in knowledge of the characteristics and needs of the adult clientele of American public libraries. Advances in Librarianship, 7, 231‑255.



[1] In late 2007, an estimated 15.8% of U.S. households were reported to have cell-phone-only access.

[2] In 2002, 8.9 percent of households were reported to have used a computer or the Internet in a public library in the past month.

[3] Based on the 2006 median household income ($48,451) from http://www.census.gov, converted to an hourly rate assuming 40 hours/week, 50 weeks/year.

[4] $49,060 is the 2006 median annual earnings of librarians from http://www.bls.gov/oco/ocos068.htm.

[5] $38,590 is the 2007 mean annual wage of community and social services specialists from http://www.bls.gov/news.release/pdf/ocwage.pdf.

[6] This represents the half-width of a 95% confidence interval for an estimated percentage near 50%, assuming one person per eligible household is selected, a weighting effect (i.e., design effect due to differential weighting) of 1.5, and a nominal sample size of n = 1,130; approximately, 3.6% = 1.96 x sqrt[(1.5 x 0.25)/1,130].

