OMB Passback questions #2

OMB: 3137-0078

Institute of Museum and Library Services

The Impact of Free Access to Computers and the Internet in Public Libraries

Response 2 to Questions from OMB

ICR Reference No. 200808-3137-001


The following is in response to OMB’s concerns outlined in the November 28, 2008 email. We have maintained the order and numbering of the questions.


Web survey

In consideration of OMB’s concerns with the proposed web survey and the constraints on our budget and time line, we have withdrawn the web survey from the study.


The web survey was to have provided us with an increased sample size to boost the response rate in conjunction with the telephone survey, as well as to add responses from hard-to-reach populations such as youth and the homeless. As a result of dropping the web survey, we have adjusted the sample size of the telephone survey to compensate for the loss of those additional responses. The new telephone survey sample size is discussed in Section 1, below.


1. Telephone survey & interview guides

Telephone survey sample size

We have increased our goal for the number of completed telephone survey interviews from 760 to 1,130, which includes 850 from the household sampling frame and 160 from the cell phone exchange sampling frame, as well as 80 interviews from the nonresponse follow-up study described in section B2.1 of the original OMB clearance package. This increased sample size is expected to reduce our margin of error from +/- 4.5% to approximately +/- 3.5%, corresponding to a 95% confidence interval around a mean of 0.5. We estimated the starting sample of households assuming that 8.9% of respondents would say yes to the primary screening question and also taking into consideration survey nonresponse. This sample size is consistent with other library use studies, including the recent IMLS Interconnections study (Griffiths & King, 2008).
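The arithmetic behind these figures can be sketched as follows. This is the standard simple-random-sampling approximation for a proportion; the study's quoted margins are somewhat more conservative than the textbook formula yields, consistent with an allowance for design effects. The response rate used in the starting-sample illustration is a hypothetical placeholder, not a figure from the study.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion p
    under simple random sampling (no design effect applied)."""
    return z * math.sqrt(p * (1 - p) / n)

def starting_sample(target_completes, incidence, response_rate):
    """Households to dial so that `target_completes` interviews survive
    both the eligibility screen and survey nonresponse."""
    return math.ceil(target_completes / (incidence * response_rate))

# 1,130 completes gives roughly a +/- 2.9% margin under this simple
# approximation; the study's quoted +/- 3.5% is more conservative.
print(round(margin_of_error(1130), 3))  # 0.029

# Hypothetical illustration only: 850 household completes at the 8.9%
# screening incidence and an assumed 40% response rate.
print(starting_sample(850, 0.089, 0.40))
```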


Telephone survey pre-testing

Where possible, we used questions previously fielded by the Census Bureau. However, many of the questions on this survey are new and have not yet been validated through other research. As discussed in our response to question 2 below, since this is a research study in an area that has not been investigated extensively, we have been careful to develop questions validated by review of experts, and focused on specific areas of interest related to the impact of public access computing in public libraries identified in previous work (e.g., Urban Libraries Council, 2007). We believe that one of the results of this study will be the provision of a clearer pathway for other investigators, and significant learning about what types of questions and impact areas will provide the richest results in the future.


We made several revisions to the survey based on OMB’s comments and also performed limited pre-testing with the web survey, which revealed some cognitive and procedural issues; the resulting corrections have been incorporated into the attached telephone instrument where applicable. Nonetheless, further testing will be conducted prior to the launch of the telephone survey, including the following:


  • In-person cognitive interviews with volunteer subjects (primarily graduate students and employees of the telephone survey contractor, TCI). This will include interviewer-guided discussion of areas of confusion or ambiguity, with probing for details as appropriate for each question to ensure the question is understood. For example, with the question, “Have you used the library computer or Internet connection to maintain a personal website?” we would probe the subject’s understanding of “maintaining” and “personal website”.

  • Telephone pre-testing with up to 10 qualified subjects, which will include discussion of questions about which the subject is uncertain. Interviewers during this stage will code answers and record any issues with the question, such as a request from the respondent for clarification.

  • Debriefing of interviewers. Interviewers will be monitored in real time by a TCI evaluator who will code answers and record any issues observed during the telephone testing and suggest changes. Interviewers will be debriefed by TCI supervisors.


Taken together with the previous expert reviews, these generally accepted methods should identify any remaining semantic or organizational problems with the survey (Presser & Blair, 1994). Results will be used to further refine the survey instrument prior to the launch of the first replicate.


In addition, we carried out field testing with our case study instruments and modified them to reflect our experience and to address the specific questions raised by OMB. This is discussed further in items g-m below.


Telephone survey instrument

  a. Please clarify what you anticipate respondents are comparing to in the questions that ask whether the “above activities” save you time and money or reduce frustration. What is the specific purpose of these questions? As written, we suggest deleting them, as they do not appear likely to produce meaningful, objective responses.


IMLS: The purpose of these questions is to attempt to measure some subjective outcomes of PAC use. However, we agree that the questions are ambiguous and have dropped them from the survey. Instead we have added two Likert scale questions (E3 and E4) to measure the importance of and satisfaction with PAC resources, as well as one open-ended question (E5) to elicit suggestions for PAC improvements.


  b. We are concerned about potential bias in asking for the “specific reason you didn’t use the library” in the negative. Please justify.


IMLS: Although questions worded in the negative are generally to be avoided in questionnaire design, these specific questions are asked as follow-ups to a negative response to the domain screening questions and simply reiterate the response; they do not presuppose an answer. The purpose of these questions is to provide the researchers with more information about the user’s reason for not using PAC resources, specifically whether it was because there was no need or because the library was avoided for some reason having to do with the specific domain under consideration. We considered several other approaches to this follow-up question; however, we felt that using an open-ended elaboration question was the most efficient and effective.


  c. Please clarify why you require such detail as “on-line banking,” “investment information,” and “learn about products or services” as separate questions. Will the survey have sufficient power to separately report these subcategories?


IMLS: These specific questions are about substantively different activities… The power of the survey has been increased in consideration of dropping the web survey… However, even with the increased number of completed interviews, we anticipate that some of these detail-level questions will not receive sufficient responses to allow separate reporting. When considered in conjunction with the case study interviews, we feel that even without statistical power, these questions will be valuable for framing future studies, one of the primary purposes of an exploratory study such as this one in a new area of inquiry.


  d. Please consult OMB statistical standards and revise the Hispanic Origin and Race questions accordingly (e.g., “other” is not permitted; must offer 1 or more races).


IMLS: The Hispanic origin and race questions have been revised in accordance with OMB standards.


  e. DP1 could be interpreted as a “double-barreled” question (i.e., asking 2 things in 1 question: computer use or other reasons) and should be improved. We also are unsure about the reason for this question.


IMLS: The question has been reworded and split into three questions for parents and one question for teens 14-18:


U5: [If over 18 years old] Do you have any children under 18 years old in your household?

[If YES, qualify PARENT] U5.1 Do(es) your child(ren) use the public library’s free Internet computers?

[If YES] U5.1.1 Have you allowed your child(ren) to use the library computers without your supervision?


U6: [If 14-18 years old] Have you used the computers at the library without your parents or a guardian?


This is a general use question aimed at measuring the extent to which children use library computers without parental supervision. This is an issue of major concern to librarians and library policymakers; the Federal government also has an interest in this phenomenon, as demonstrated by the library Internet filtering policies enacted with the passage of the Children’s Internet Protection Act in 2000, specifically aimed at children using library computers without adult supervision.


We recognize that this rewording could be perceived as negative or biased; however, other approaches to the question increased its ambiguity and risked not measuring the phenomenon of interest.


  f. Q11 (part 4) – What is the rationale for “might use” rather than “use” in this question?


IMLS: The question has been reworded as follows:


E1: Have you used a library computer or Internet connection for any purposes other than the ones we have asked you about? (open-ended)


User interview guide

Please note that the interview guides are designed to be used by the IMPACT project researchers and are not intended to be seen by the respondents; the questions are guides for the interviewer, to be used in combination with their prior training for the study. We have attached revised guides incorporating improvements based on our field testing and on our responses to the OMB questions below.


  g. We note multiple typos in the interview guide (e.g., Q 1, 14).


IMLS: The interview guides have been proofread and edited.


  h. The meaning of Q 15 is unclear.


IMLS: The question has been renumbered and is now Q14. We have added some additional examples to help clarify the meaning:


14. Does anything make it difficult or prevent you from using the free computers and the Internet? (e.g., MySpace, YouTube, time limits, number of Internet computers, library policies, etc.)


The purpose of this question is similar to that of the open-ended elaboration questions discussed in b., above. We are attempting to differentiate between intrinsic and extrinsic barriers to PAC use, and specifically whether certain types of common library policies have an impact on the perceived usefulness of PAC resources. For example, some libraries prohibit accessing YouTube.com on PAC terminals; at our case study pilot site, the decision to block YouTube was driven by a lack of Internet bandwidth (users viewing videos degraded system performance for other users). If PAC users frequently report this type of policy as a barrier to their use of the library’s resources, it may help libraries prioritize their resource allocations (e.g., investing in greater bandwidth) or reexamine their policies.


  i. Q 28: it is not acceptable to ask race/ethnicity as an open-ended question.


IMLS: The race/ethnicity questions have been revised in accordance with OMB standards.


Funding agency interview guide

  j. Q 1. Is IMLS looking for a formal, written vision statement? As written, the question is unclear.


IMLS: The question has been rewritten for clarity as follows:


2. Does your agency/city have a formal written vision or strategic plan for digital inclusion in your community? If yes, how does the library fit into it? (digital inclusion defined as access, content and services)


  k. Q 14. How would the agency know something is “unique,” as this would require verifying that no one else is doing it? How would IMLS validate this notion?


IMLS: The word “unique” has been removed from the question.


Library administrators instrument

  l. Q 12 and 13: why not ask for a “typical” example rather than a “favorite,” which seems likely to be atypical by definition?


IMLS: Q11, Q12, Q13, & Q14 have been reworded to ask for a “typical” story.


  m. Q 23: what is this question asking?


IMLS: The question has been rewritten as follows:


Q23. What is your personal vision for improving the free access to computers and the Internet in your library?


2. Developing indicators

While we understand how the “situated logic model” may assist in validating that the variables of interest are relevant, we do not see how the study will actually quantify the indicators.


IMLS: As discussed in our previous response, we believe there is a complicated relationship between public access computing and the commonly used measurements of change in individuals, families, and communities that may be influenced by experiences with PAC. The situated logic model provides a framework that allows us to relate relatively simple indicators to relevant policy efforts by bridging between the PAC outputs and outcomes being measured and the broader changes measured in policy work.


Since little previous work has been done on the direct impact of PAC, and this study is meant to provide a beginning point for exploration of this area, we feel that it is important to examine multiple domains that may be affected by PAC and that are of interest to the policy community. We developed the survey questions based upon the input of our expert committee and a review of prior work in this area, and believe that even though the questions are simple, they will provide good indicators of where future work might profitably be undertaken. Gathering some specific data about activities in the areas of interest identified within the policy domains will be a first step toward understanding whether PAC is contributing to changes in these areas.


The questions have been carefully chosen so that we can begin to gather metrics about how users are interacting with PAC, and what sorts of activities they are actually accomplishing within the domains of interest to policy makers. Our questions go beyond asking whether training took place to asking whether a concrete result occurred that the training may have contributed to, moving from an output measure to an outcome measure. That in turn can be related to the broader policy indicators using the principles outlined earlier in our situated logic model.


The indicators themselves are representative of the kinds of activities that have been shown to contribute to changes in the impact areas identified within the domain. We do not expect to show causality, but would hope to show contribution, as described in Van Den Berg’s recent exploration of this area: “…in the case of results at the level of society, the public debate should move from the concept of linear causality to the concepts of conditionalities (necessary but not sufficient conditions for change to occur). Furthermore, it should be made clear that these necessary but not sufficient conditions contribute to rather than cause the change to take place.” (Van Den Berg, 2005).


The results from the survey will also be examined in light of our qualitative case study data, not necessarily from the “staff” perspective, but from the “user” and “funder” or “stakeholder” perspectives. One of the strengths we gain by including these qualitative explorations of the users’ and stakeholders’ own estimations of the impact on themselves and their communities is the ability to understand areas where future investigations may be fruitful, and a better grasp of the conditionalities that might be important to consider when the survey results are interpreted. Clearly, additional work will be needed to validate these insights, but without the initial investigation, there will be no place to start.


A recent report by the Urban Institute (2006) lays out the approach we are proposing fairly well. The chart below, taken from this report, shows that the development of solid indicators for areas of interest to an organization’s work is a multi-stage process, and we feel that this study is at the very beginning of this process. Until we have data that can show that particular indicators are worthwhile pointers to contribution in specific policy domains, there is little that can be done to advance beyond the anecdotal approach that is the best we have at the present time for discussing PAC impact.


Figure 1: Outcome sequence chart: creating a common framework for measuring performance



A graphical representation of how we are using the instruments in this study is shown below (Figure 2), with an indication of how the two approaches will help us to understand different areas of PAC and the ways that the results will inform the development of indicators related to intermediate and end outcomes (or outcomes and impacts).


As seen in Figure 2, the telephone survey links User Activities to Intermediate Outcomes. Used in this way, the telephone survey will mostly quantify user activities (at a level at which we can expect to see statistical significance, based on our sample size). To a lesser extent, it will provide statistical evidence of intermediate outcomes, especially those related to more common activities, such as the link between using email and keeping in touch with friends or family. We do not expect that the telephone survey will be able to reliably find indicators of end outcomes such as social inclusion. Nor will the telephone survey be able to link library inputs to activities or intermediate outcomes.


The case study interviews and focus groups link User Activities to Intermediate and End Outcomes. The case studies are particularly useful for identifying evidence of end outcomes that are subjective (e.g., quality of life). The case studies also aid in confirming and interpreting quantitative associations, capturing levels of subjective importance, and revealing other questions that should be asked in subsequent evaluations.


3. Pew study

IMLS indicated that the 2nd finding of the cited 2007 Pew study, related to 18-30 year olds, was a major driver in its decision to survey 14-18 year olds and therefore to conduct the online survey at libraries. A closer examination of the results from the Pew study shows that the percentage of 18-30 year olds who use libraries for help in solving problems is statistically no different from that of 31-42 year olds and 43-52 year olds. The study also indicated that adults with children at home were more frequent library users, and that the library was used more often for education-related research than for other types of “problems.” Taken together with OMB’s concerns about the viability of the Internet survey, and the enormous sample size being proposed to overcome design effects, etc., we would like IMLS to reconsider the need for the web survey and to consider instead focus groups with library users in the demographic groups of interest.


IMLS: The web survey has been dropped from the research design. That said, our aim of including youth ages 14-17 in the study was predicated on research beyond the Pew study. Meyers, Fisher and Marcoux (In Press; 2007) extensively reviewed cognate literatures regarding minors and how they seek information and use library technology for everyday life situations. They concluded that this phenomenon has rarely been studied, and that the vast majority of studies focused instead on youth and their scholastic needs. Moreover, studies of youth lacked adequate conceptual and methodological frameworks. There are several reasons why minors have been studied less frequently than adults by academic researchers, a primary one being human subjects or Institutional Review Board (IRB) constraints at institutions of higher education, where special processes and training are required of all researchers engaged in studying protected populations such as minors. The Pew study itself did not include persons under the age of 18, and therefore its results cannot be generalized to younger populations. Indeed, research on minors shows that distinctions exist among different age segments: early tweens differ from late tweens, who in turn differ from mid-teenagers, and so forth. An accepted finding among youth researchers is that generalizations regarding adults cannot be summarily applied to minors. To ensure that we gather insights into the 14-17 year old population, we will be relying heavily on the case studies.


4. Goals of the study

How will IMLS policies be better informed from this study?


IMLS: The Institute will use this study to fulfill its statutory mandate to analyze trends and share best practices to improve library services.  The study results and performance indicators will be disseminated to grantees and potential applicants in order to improve future grant applications and as tools to enhance program development at the local level.   A recent analysis of the agency's Library Grants to States Program shows that libraries throughout the nation use federal funds to develop technology infrastructure to support a range of activities that strengthen communities such as providing literacy programming for adults and children, offering homework help, and purchasing access to databases and indexing services that cover a wide range of substantive areas. The instruments developed through this research project could be used by state and local libraries to assess user need and make better resource allocation decisions, thereby making better use of public investments.

The information will also help the agency develop tailored technical assistance modules on research and program evaluation. IMLS convenes annual technical assistance training conferences for state-level administrators of the Library Grants to States program and for state data coordinators, and develops asynchronous web training on program evaluation for the library community. These convenings provide library administrators and program development staff with tools for program evaluation and monitoring in their home states. The results of this study, which focuses more directly on the user experience with PAC in libraries, will be used to highlight the importance of user-centered metrics in the evaluation of library service, rather than administrative metrics such as circulation counts, number of PAC user sessions, etc.


5. Future research

Where does IMLS plan to go next with this study? Will it be used to inform and serve as a basis for future studies?


IMLS: The Institute plans to promote study findings widely among librarians and researchers in other fields and to encourage future studies through its National Leadership Grant program. Findings regarding use areas will provide information needed to develop more strategic partnerships with other national organizations. For example, use of library PAC resources for employment and health information services would be noteworthy to other government agencies such as the Department of Labor, which sponsors One-Stop Employment Centers, and the Centers for Disease Control and Prevention, which invests in health information campaigns across the country.


6. Incentives to respondents

Please specifically justify the proposed payment amounts for each set of participants in the study.


IMLS: Three types of incentives are proposed for this study: (1) a $10 gift certificate for telephone survey respondents; (2) $20 cash to PAC users who participate in case study interviews or focus groups; and (3) $5,000 to libraries participating as case study sites. Each incentive type is discussed below.


Telephone survey respondents

According to the 2002 CPS study, public access computer users are a relatively small proportion of the general public, so reaching the target sample size will require a large number of phone calls. The payment of incentives to this population will improve coverage and reduce survey costs by increasing survey completion rates.


Case study libraries

Libraries agreeing to participate as case study sites will need to exert a fairly high level of effort to accommodate the researchers over a five-day period. They will assume the burden of releasing librarians and library administrators to participate in interviews and focus groups, providing background materials, notifying library patrons of the research prior to the site visits, and answering staff and patron questions about the research. In consideration of this unusual burden, we feel the proposed incentive is warranted.

Case study interviews

The Information School of the University of Washington has past experience using payments to compensate participants for their time and insight in qualitative studies. Recent examples include the 2005 National Science Foundation-funded “Talking with you” study, which paid stay-at-home mothers $30 for taking part in 2-3 interviews, and the 2007 Community Technology Centers Evaluation study, which paid users $10 for a 20-minute interview. In both studies, payments were handed out equitably to all participants, and per human subjects guidelines, the amounts were small and viewed as non-coercive to people deciding whether or not to participate.


Similarly, the current Impact of Free Access to Computers and the Internet in Public Libraries study follows the established practice of the University of Washington of offering small, non-coercive compensation to participants for a 30-minute interview. The “extremely eager” pre-test respondents noted in OMB’s question were incentivized.


7. Who is the “center”?

IMLS: The “center” is the Social Development Research Group discussed in A8. They are an affiliate of the University of Washington’s School of Social Work and are the contractor for the web survey. Since we have excluded the web survey from OMB consideration, they will no longer be involved in this research project.


8. Justification for number of case study interviews

The twofold aim of the case studies is: (1) to provide insights into the study’s research questions that are not amenable to quantitative investigation via the telephone survey (Table 1), and (2) to provide greater context and depth for the telephone survey questions. To accomplish these aims it is necessary to conduct case studies in 5 communities, such that our interviews and data analysis reflect the fullest range of outcomes associated with free access to computers and the Internet across varied stakeholder groups around the country. To this effect, we have selected five libraries in different geographic regions of the country: Marshalltown Public Library, IA; Enoch Pratt Free Library, MD; Fayetteville Public Library, AR; Oakland Public Library, CA; and Apache County Public Library, AZ. These libraries are located in demographically diverse, low-income communities of varying sizes, and all show demonstrable need for and use of library computers by their stakeholders. Moreover, they were all recommended for study by leaders of the library community and/or have received awards for their services in providing the public with access to computers and the Internet. Due to differences in the nature of libraries, library services, and the very nature of communities themselves, we cannot assume that conducting only 2-3 case studies will meet our research needs. Related IMLS-funded research by Durrance and Pettigrew on how public libraries participate in online community networking initiatives involved study of three major library systems: North Suburban, IL; Multnomah County, OR; and Pittsburgh Public Library, PA (2002; Pettigrew, Durrance & Unruh, 2002). Durrance and Fisher’s (2005) IMLS-funded research to develop an outcomes toolkit for evaluating public libraries’ community services involved case studies of 5 library systems (King County, WA; Queens Public Library, NY; Flint Public Library, MI; Peninsula Public Library System, CA; and Austin Public Library, TX).
The Libraries Connect Communities study (ALA, 2007) of how technology is funded in libraries and how the public uses it involved site visits to 30 libraries in New York, North Carolina, and Pennsylvania, and Virginia.


Within each of our 5 case studies we anticipate interviewing a maximum of 50 library users (ages 14 and up); 5-10 library staff, board members, and volunteers; 5-10 local agency staff and policy makers/elected officials; and 5-8 local merchants/employees as part of our community walks, for the purpose of identifying ripple effects of the library’s public access computing services and stakeholder perceptions. These numbers are consistent with the numbers of subjects interviewed in the above-referenced studies, as is the range of interviewee types. For users in particular, the quota can be quickly reached if focus groups are conducted in lieu of individual interviews. At the larger systems (i.e., Baltimore, Oakland), where two branch libraries will be included for study in addition to the central building, the maximum number of subjects will be sought.


Table 1: IMLS Research Questions (as posed in the IMLS RFP), with Constraint(s) and Recommended Method(s) Identified by the UW Research Team

1) What are the demographics of people who use computers, the Internet, and related services in PLs?
   Constraint(s): Difficult to identify target population; high eligibility requirements
   Method(s): Telephone Survey

2) What information and resources provided by free access to computers, the Internet, and related services in PLs are people using, across the spectrum of on-site and off-site use?
   Constraint(s): Confounding (difficult to identify individual users from usage summaries)
   Method(s): Telephone Survey

3) How do individuals, families, and communities benefit (with a focus on social, economic, personal, and professional well-being) from free access to computers, the Internet, and related services at PLs?
   Constraint(s): Difficult to identify causal mechanism from correlated survey data; requires extended access to a broad range of stakeholder groups
   Method(s): Telephone Survey; Case Study

4) What reliable indicators can measure the social, economic, personal, and/or professional well-being of individuals, families, and communities that result from access to computers, the Internet, and related services at PLs?
   Constraint(s): Low repetition of outcome indicators across previous studies; requires development and testing of underlying logic model
   Method(s): Telephone Survey

5) What correlations can be made between the benefits obtained through access to computers and the Internet and a range of demographic variables? What correlations can be made to type, level, or volume of related services?
   Constraint(s): Requires a large, representative sample stratified by socio-economic and demographic variables
   Method(s): Telephone Survey

6) What computer and Internet services and resources are lacking at PLs that, if available, could bring about greater benefit?
   Constraint(s): Requires extended access to a broad range of stakeholder groups; requires asking open-ended questions
   Method(s): Case Study

7) What indicators of negative impact can be identified where free access to computers and the Internet is weak or absent?
   Constraint(s): Difficult to identify target population; difficult to identify root causes; requires asking open-ended questions
   Method(s): Case Study


9. Overall response rate calculation

Please clarify how you obtained the “overall response rate” calculations for the RDD survey in table 1 in Part B.


IMLS: The response (completion) rate was calculated as the number of expected completed interviews divided by the sum of expected completed interviews plus the estimated number of eligible respondents whose household status and eligibility status were unknown. The revised disposition tables below include these intermediate calculations, which were not shown in the original OMB clearance application; they also reflect the change in sample size previously discussed.
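As a brief sketch of the calculation described above: the rate divides completed interviews by completes plus the eligible portion of unknown-status numbers. All figures below are hypothetical placeholders, not values from the disposition tables, and the use of the 8.9% screening incidence as the eligibility estimate for unknown-status numbers is our illustrative assumption.

```python
def overall_response_rate(completes, unknown_status, est_eligibility):
    """Completion rate = completes / (completes + estimated eligible
    respondents among numbers whose household/eligibility status is
    unknown), per the calculation described above."""
    est_eligible_unknown = unknown_status * est_eligibility
    return completes / (completes + est_eligible_unknown)

# Hypothetical inputs: 850 completed interviews, 20,000 numbers of
# unknown status, and an assumed 8.9% eligibility rate among them.
rate = overall_response_rate(850, 20_000, 0.089)
print(round(rate, 3))  # 0.323
```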


Table 2: Expected disposition for PAC user HH phone survey




Table 3: Expected disposition for PAC user cell phone survey



10. Interviews and focus groups

Please clarify whether respondents in libraries are being asked to participate in a one-on-one interview, a focus group, or both. Why?


IMLS: Respondents will be asked to participate in either a single one-on-one interview or one focus group. Libraries differ from one another, and issues such as time or space may influence which procedure is used. For example, if time does not permit multiple one-on-one interviews, a single focus group would be used to collect data from multiple people. Likewise, if there is insufficient space for a focus group, respondents would be asked to participate in one-on-one interviews. Because the situations we may encounter in the libraries cannot be determined in advance, our methodology covers possible contingencies by including both one-on-one interviews and focus groups when visiting the various sites. We anticipate conducting one-on-one interviews with staff, and either one-on-one interviews or focus groups with adult library users and with youth.


11. Federal sponsorship


IMLS: The following statement has been added to all interview guides and consent forms.


We are asking you to be in a research study being conducted by the University of Washington’s Information School and sponsored by the U.S. government’s Institute of Museum and Library Services.


12. Pledge of confidentiality

Please alter the pledge of confidentiality in the consent forms and interviewer scripts to include “except as provided by law.”


IMLS: The phrase “except as provided by law” has been added to all interview guides and consent forms.


13. Telephone script

Is the “DIRECT USER instrument” the telephone interviewer script?  If so, is it IMLS’s intention that the interviewer will read it verbatim, including the URL?


IMLS: Yes, the “DIRECT USER instrument” is the telephone survey script and has been renamed as such. It is our intention to have the interviewer read the script verbatim, including the URL of the project website.



References

American Library Association, & Florida State University. (2007). Libraries connect communities: Public library funding & technology access study, 2006-2007. ALA research series. Chicago: American Library Association.


Durrance, J. C., & Fisher, K. E. (2005). How libraries and librarians help: A guide to identifying user-centered outcomes. Chicago: American Library Association.


Durrance, J. C., & Pettigrew, K. E. (2002). Online community information: Creating a nexus at your library. Chicago: American Library Association.


Estabrook, L., Witt, E., & Rainie, L. (2007). Information that solves problems: How people use the internet, libraries, and government agencies when they need help. Washington, DC: Pew Internet & American Life Project.


Meyers, E. M., Fisher, K. E., & Marcoux, E. (2007). Studying the everyday information behavior of tweens: Notes from the field. Library & Information Science Research, 29(3), 310-331. (Received the 2008 American Library Association Jesse H. Shera Award for Distinguished Published Research)


Meyers, E. M., Fisher, K. E., & Marcoux, E. (in press). Making sense of an information world: The everyday life information behavior of tweens. The Library Quarterly.


Pettigrew, K. E., Durrance, J. C., & Unruh, K. T. (2002). Facilitating community information-seeking using the Internet: Findings from three public library-community network systems. Journal of the American Society for Information Science & Technology, 53(11), 894-903.


Presser, S., & Blair, J. (1994). Survey pretesting: Do different methods produce different results? Sociological Methodology, 24, 73-104.


Urban Institute, & Center for What Works. (2006). Building a common outcome framework to measure nonprofit performance. Washington, D.C.: Urban Institute. http://www.urban.org/publications/411404.html

Urban Libraries Council. (2007). Making cities stronger: Public library contributions to local economic development. Evanston, Ill: Urban Libraries Council.

Van Den Berg, R. (2005). Results evaluation and impact assessment in development co-operation. Evaluation, 11(1), 27-36.


1 Q5: Have you used a computer in the public library to access the Internet in the last year?

2 n=1,049

3 890/(890+182+1991+340)=.26

Author: Samantha Becker