ERS REIS Supporting Statement - Part A (Revised)


Rural Establishment Innovation Survey (REIS) (Also Known as National Survey of Business Competitiveness)

OMB: 0536-0071



Supporting Statement

Revised 5/6/2014

U.S. Department of Agriculture

Economic Research Service

Rural Establishment Innovation Survey (REIS)

OMB Control No. 0536-XXXX


Part A - Justification


Question 1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The Economic Research Service plans to conduct the Rural Establishment Innovation Survey (REIS) as a one-time inquiry. The proposed collection will contribute to a better understanding of how international competition and the increasing knowledge intensity of economic activity in the U.S. are affecting the vitality of rural areas, and how effectively those areas are adjusting to these pressures. Data obtained from the survey will allow us to examine the prevalence of innovative activity in nonmetropolitan businesses and the establishment- and community-level characteristics associated with innovation. The results will inform the degree to which human capital endowments; access to credit; access to infrastructure; and potentially more limited interaction with suppliers, customers, and peer firms impede processes of rural innovation. By sampling metropolitan establishments for comparative purposes, the survey will provide concrete information on a topic plagued by informed speculation. The data collected through this survey will help fill serious gaps in our understanding of what rural establishments need to be competitive in the national and global economies and will provide USDA and other policymakers with sound information so they can craft more effective rural development policies. The legal authority for collecting this information is the Rural Development Act of 1972, 7 U.S.C. 2662(b) (Attachment I).


White House initiatives in recent years have stressed the critical importance of innovation to national prosperity. President Obama described “encouraging American innovation” in the 2011 State of the Union address as “[t]he first step in winning the future.” A Strategy for American Innovation: Securing Our Economic Growth and Prosperity (http://www.whitehouse.gov/innovation/strategy) outlines three critical focus areas, all of which raise questions about the potential contribution of rural America. Are rural areas falling behind in the availability of workers with 21st century skills and digital infrastructure? Are rural firms at the forefront of the clean energy revolution? How important are entrepreneurial ecosystems to rural start-ups, and do rural firms access spatially discontinuous networks to mitigate the disadvantages of remoteness? REIS will provide the first nationally generalizable information on these issues.


Question 2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.



USDA researchers will use the data generated by this survey to directly address questions about rural innovation rates and the factors associated with innovative activities needed to remain competitive in a global economic environment. Innovation is defined as a change in a product offering, service, business model or operations which meaningfully improves the experience of a large number of stakeholders, resulting in an increase in economic value. These data, unavailable from any other source, will enhance the ability of the Economic Research Service to answer questions on the advantages and disadvantages of rural areas and establishments in tradable sectors facing rapidly changing economic conditions. These data will enable the Economic Research Service to provide information to the administration; USDA; other Federal agencies with a keen interest in the topic such as the Small Business Administration, the Commerce Department, and the National Science Foundation; State Rural Development Councils; and the Congress.


To ensure adequate coverage of new and small establishments that are critical to understanding innovation, the survey will use the business establishment list of the Quarterly Census of Employment and Wages (QCEW) maintained by the Bureau of Labor Statistics (BLS) as its primary sampling frame. Because the QCEW program is a Federal-State partnership, access to the sampling frame will be limited to those states that grant approval. A proprietary business list frame will be used for states not granting approval.


Information will be collected using a multi-mode survey in which respondents will be able to complete the survey by telephone interview, by returning a questionnaire sent via mail, or electronically through a secure web link. The survey distributed to respondents will be titled “National Survey of Business Competitiveness” because including “Innovation” in the title might lead some respondents to disqualify themselves if they do not regard their business as innovative.


A one-time data collection is currently envisioned. The inherent drawback of using a one-time survey to address a fundamentally dynamic construct such as innovation will be mitigated by eventually linking the REIS dataset to the Business Employment Dynamics (BED) data at the BLS. The BED program uses administrative data produced by state employment security departments to track continuance and employment growth or decline for all covered employers in the U.S. Respondent establishments derived from the BLS business establishment list each have a unique ID that will facilitate the linking of these data. For respondent establishments derived from a proprietary list frame, identification information such as business name and address will be retained until these records can be linked with the BED data. This identification information will be removed from the final dataset, which will be used only for statistical analysis. Linking these two sources of data will allow the first nationally representative analysis of the impact of innovative activity on the survivability and growth of business establishments.


In constructing the survey instrument, efforts have been made to add value to other Federal information collection activities that are central to studying rural innovation. REIS questions on the availability of finance will provide valuable information to the Federal Reserve Board, which last administered the Survey of Small Business Finances (SSBF) in 2003. The SSBF staff was consulted to ensure that finance data collected for REIS will be comparable and have practical utility for examining the availability of borrowed funds to small businesses.


As the first representative survey of rural innovation in the U.S., the central research interests are to provide an inventory of innovative activity, compare the rate of innovative activity with urban peers, determine what establishment and community characteristics are associated with innovative activity, and examine the longer-term outcomes associated with innovative activity:


  1. What percentage of rural establishments in tradable industries introduced product, process or practice innovations in the previous 3 years?

  2. What percentage of self-reported innovative establishments also demonstrates behaviors consistent with substantive innovation?

  3. How do self-reported and ostensibly substantive innovation rates differ by urban/rural location, industry and establishment age?

  4. What establishment and community characteristics are associated with self-reported and ostensibly substantive innovation?

  5. Do ostensibly substantive innovators demonstrate faster rates of employment growth or higher survival rates than claimed innovators and non-innovators?

The two main challenges for the survey are 1) collecting information useful for differentiating nominally innovative establishments from substantively innovative establishments and 2) collecting a comprehensive set of community and establishment characteristics that may be associated with innovation.


Justification is provided for question groups and examples are provided for how individual questions inform the objective of a question group. The first group (Q1 – Q5) asks about the establishment’s vital statistics (age, unit or part of multi-unit firm, main product or service) along with factors that were important for the establishment locating in the community (Q5), if known by the respondent. Q5 is derived from a question asked in the 1996 ERS Rural Manufacturing Survey which was very productive for understanding the rationales behind rural location (see Attachment A).


The next group of questions (Q6 – Q12) focuses on human resources in the establishment. Q6-Q6b collects information on the total number of workers at the establishment, broken out by employees on the payroll and all other workers. The occupational structure of the establishment is investigated in Q10, and the minimum educational requirements for jobs by occupational group are asked in Q10a. In combination with a question on the difficulty of finding qualified applicants (Q12), this information will allow examining whether human capital endowments are impeding the competitiveness of rural businesses.


The next group of questions examines the technological and management systems that underpin data-driven decision-making in the establishment. Management systems consistent with continuous improvement processes that enable data-driven decision-making are indicated in Q13, Q24, Q25 and Q26. Technologies that enable data-driven decision-making are the basis of Q14. Difficulties in adopting digital technologies (Q17) and outside sources of information used by the establishment (Q18 and Q19) provide important context for this bank of questions. In addition to the role that data-driven decision-making plays in increasing productivity, prior research also suggests that this information will help differentiate highly innovative establishments from establishments whose innovation is more nominal.


Q27 through Q33 comprise the bank of innovation questions that are derived from the European Union’s Community Innovation Survey (CIS) questions. Including similar questions will allow for international comparisons. However, a noted weakness of this set of innovation questions is their inability to differentiate highly innovative from nominally innovative establishments. Other observable information should be useful in making this distinction. In addition to the data-driven decision-making questions discussed above, we also include a question to indicate whether innovation activities are capital constrained (Q34). The CIS question (Q28) on whether innovation activities have been abandoned or are incomplete may also provide useful information for differentiating establishments. Finally, whether an establishment possesses intellectual property worth protecting (Q37) will also be used to differentiate highly innovative from nominally innovative establishments.


Q35 through Q40 are a group of innovation-related questions pertaining to participation in the green economy, trends in innovation funding within the establishment, and establishment outcomes. The green economy question is directly relevant to many rural development initiatives in the Department of Agriculture. The trends in innovation funding questions will provide valuable information on how establishments reacted to the economic crisis, which will allow examining future impacts on business continuity and employment growth.


Q41 and Q42 ask about the local context of business, first in the form of community characteristics that are potential impediments to competitiveness and second in the form of the activity of community institutions. Q41 will provide a factual basis for assessing the various impediments to rural competitiveness that are currently based on informed speculation. Q42 in combination with other establishment responses and with eventual linking to longitudinal data will inform how these institutions are associated with establishment outcomes.


Q43 addresses user entrepreneurship—selling a good or service developed through own use—that has recently been confirmed as an important component of start-up activity in the US generally (Shah, et al. 2012). Anecdotal evidence suggests it may be even more important in rural areas where “necessity is [more likely to be] the mother of invention.” User entrepreneurs are more likely to conduct R&D and receive venture capital relative to all other entrepreneurs generally, and the study will investigate if this is also true for rural user entrepreneurs.


Q44 through Q47 address sources of finance and business assistance. Although the government finance and business assistance question does not ask explicitly about receipt of American Recovery and Reinvestment Act funds, the time frame for the study will allow examining the extent of assistance during the economic crisis and the importance that businesses attached to assistance. The study period was also characterized by very tight credit markets and Q45 will collect information on where establishments applied for credit and whether applications were denied, partly funded or fully funded.


The questionnaire concludes with questions about the respondent that will be useful in examining possible sources of bias.


Question 3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g. permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The survey research cooperator for this information collection is the Social and Economic Sciences Research Center (SESRC) at Washington State University. SESRC will use information technology appropriate for mixed survey mode implementation, including a Computer Assisted Telephone Interviewing (CATI) system for both telephone interviewing and data entry of returned mail questionnaires (see Attachments B and E). SESRC also uses web survey software for online access by respondents (see Attachment C). Both the CATI system and the web survey software allow for sophisticated programming of complex question branching logic, use of question templates, and tailored question and screen visual design. These features are important to best practices for survey administration and quality survey database construction. The CATI system and web survey software incorporate automatic numeric range checks on question responses, which reduce interviewer entry error. Careful visual design helps respondents and interviewers to more efficiently and effectively see and process each question and its supporting information on the computer screen, answer the questions, and navigate through the survey.


The CATI system allows for recording respondents’ comments and any open-ended text offered as clarification of their answers to any given question. These features capture important information about respondents’ answers, improving researchers’ ability to assess and analyze the topic of interest and to obtain unique insights regarding respondents and their survey answers.


Question 4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


Surveys to study establishment and firm innovation have been administered in the European Union as part of its Community Innovation Survey (CIS) since 1992. No national level surveys of innovation were conducted in the US until 2008 when the National Science Foundation administered its annual Business R&D and Innovation Survey (BRDIS), a redesign of its Survey of Industrial Research and Development. BRDIS adopted the CIS questions related to various forms of firm-level innovation, but the data are inadequate to examine issues related to rural innovation given that the Congressional mandate for the NSF data collection is to provide estimates of research and development spending and science and technology staffing. To provide cost-efficient estimates of these measures, BRDIS oversamples larger firms with formal research and development activities, firms that are highly unlikely to be headquartered in rural areas. Since BRDIS is a firm level survey, it is impossible to isolate the innovation activities of rural branches of these firms.


However, analyses of the CIS data from EU member countries have demonstrated that rural firms are often as likely as their urban counterparts to introduce new products, processes or marketing methods. These innovations may not result from formal R&D activities but instead represent explorations of novelty that infuse dynamic market-based economies. The Rural Establishment Innovation Survey will provide the first opportunity to examine the prevalence of this broader definition of innovation across urban and rural areas of the US.


The Microenterprise Innovation, Science and Technology (MIST) survey currently being developed at NSF will measure innovation, research and development, and science and technology staffing in firms with fewer than 5 employees, which are excluded from the BRDIS sample frame. MIST will be able to examine innovation activities of rural microenterprises, and thus will also supplement REIS, which will sample only establishments with 5 or more employees. In addition to targeting different segments of the firm size distribution, REIS will also specifically examine the constraints to innovation stemming from nonmetropolitan location. These data will not be available in MIST or any other proposed data collection that we know about.


Question 5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


About 93% of the sample in this study will be small businesses. We will employ several strategies to reduce the burden on small businesses. First, linking data from responding establishments to existing administrative data, as discussed in Question 2, will reduce the amount of data collected from small businesses. Second, the survey will be administered using three different modes (telephone, internet and mail), allowing respondents to select the mode that is least burdensome. Finally, in designing the questionnaire, a special effort has been made to request information that should be known to respondents without the need to check business records and that does not require computation.

Question 6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Innovation in tradable sectors is widely regarded as the key to long-term competitiveness of the U.S. economy. As the first nationally representative sample addressing rural innovation, the collection will provide information on the occurrence of innovation processes in rural areas, on the constraints that rural locations impose on these processes and provide a comparison with urban innovation rates. Failure to collect this information will severely limit the evidentiary basis for improving the innovative capacity of business establishments located in less favorable areas. This clearance is for a one-time collection with no current plans for a future data collection.


Question 7. Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with the guidelines in 5 CFR 1320.5.


There are no special circumstances.


Question 8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


The original 60-Day Federal Register Notice (FR Doc. 2011–21848) was published on Friday, August 26, 2011 (76(166):53398-53400). A subsequent 60-Day Federal Register Notice was published on Wednesday, February 6, 2013 (FR Doc. 2013–02607) (Attachment H). ERS received comments from the Bureau of Economic Analysis (Carol A. Robbins, Ana Aizcorbe and Wendy Li; Attachment M); Manufacturing Extension Partnership, National Institute of Standards and Technology (Chris Carbone); Small Business Administration (Antonio Doss); Prof. Andrew Reamer, George Washington University (Attachment N); Prof. David Freshwater, University of Kentucky; Prof. Maryann Feldman, University of North Carolina, Chapel Hill (Attachment L); Dr. Stuart Rosenfeld, Regional Technology Strategies, Carrboro, NC; and an anonymous comment from Jean Public (Attachment O). In addition, ERS staff developing REIS also met with representatives from the Federal Reserve Board (Tracie Mach); the National Science Foundation (Audrey Kindlon, David Croson and John Jankowski); and the Kauffman Foundation (E.J. Reedy).


Development of the REIS was also discussed at three forums: the STI Data/Indicators Roundtable at George Washington University, June 29, 2011; the Kauffman Roundtable on Establishment Surveys, Washington, DC, August 9-10, 2011; and the Interagency Forum on Entrepreneurship and Innovation Statistics, organized by the Kauffman Foundation in Washington, DC, August 9, 2011. A webinar on Designing the Rural Establishment Innovation Survey, organized by the North Central Regional Center for Rural Development at Michigan State University and held on September 26, 2011, was attended by 10 rural development research and extension specialists from around the country. ERS staff also briefed and received comments from a subset of the Science Technology and Innovation Indicators Study Panel, coordinated by the Committee on National Statistics of the National Academies, on September 26, 2011.


The comment from Jean Public that the survey represents wasteful government spending is duly noted.


With one exception, all of the substantive comments received were in regard to improvement of the survey instrument. A comment at the Kauffman Roundtable on Establishment Surveys questioned the exclusion of tourism from the target population of establishments in tradable sectors. Tourism did not meet the criteria used for selecting potentially tradable sectors, which were based on the geographic concentration of summary industries (see Jensen, et al. 2005).


Establishment Characteristic Questions


It was suggested that the three pages of NAICS codes used in the BRDIS survey be provided to respondents for self-classification. This would increase respondent burden and be redundant for those accurately classified in the BLS and proprietary sample frames. We will collect a qualitative description of the main product or service of the establishment to identify misclassifications.


Innovation Questions


The Community Innovation Survey convention of excluding aesthetic design changes as a source of innovation is relaxed by removing “changes of an aesthetic nature” from the list of excluded changes. In addition, response categories on innovation-related activities now include in-house and purchased design services.


Suggestion to add “began exporting in past 3 years” as a category to the question asking about establishment innovation was not incorporated. Exporting is an outcome of an innovation that is already asked about (new or significantly improved marketing methods), and this outcome can be gleaned from other information collected.


Separate questions on improvements in goods and improvements in services were combined into one question, as the two questions share common response categories.


Suggestion to expand the list of innovation-related activities beyond purposes of commercialization to include creation of brand equity “for the firm” was not incorporated, as it is less relevant for an establishment survey and may not be relevant for the large number of single-unit plants interviewed for the survey.


User Entrepreneurship Questions


One commenter suggested deleting the questions regarding user entrepreneurship, while another commenter argued for keeping them. A recent study using the Kauffman Firm Survey (Shah, et al. 2012) provides the first empirical evidence that user entrepreneurship is an important phenomenon for understanding innovation and financing of start-ups.


Finance Questions


The categories of types of finance used were expanded to make them more comparable to other business surveys such as the Survey of Business Owners. In addition, a category for Credit or Advance from Customer was added based on responses to a survey recently administered by one of the commenters.


Skip patterns were changed so that respondents can report how funds were to be used, even if they were unable to secure funding.


A hypothetical question asking about the most likely uses for surplus funds in order to identify capital constraints was simplified based on several comments.


On the question related to use of funds, the response category on R&D was changed to “Fund innovation projects” and a category was added related to “intangible assets such as branding, training or design.”


Question on innovation finance was removed to reduce respondent burden and avoid confusion with question asking about general financing of the business.


Data Driven Decision-Making


Simplified questions on important sources of information for the establishment. There is still a breakout of where information came from geographically, but this applies generally to all forms of information rather than being broken out in terms of business opportunities, new technology, etc.


Suggestion to replace subjective geographic categories (in community, outside community but within a reasonable drive, and beyond a reasonable drive) with one-hour drive and three-hour drive options was not incorporated. Inclusion in a community may overlap the one-hour drive threshold in remote areas, and for the same reason it is not clear that a one- to three-hour drive defines a regional geography. Alternatively, many areas within a three-hour drive may not be considered part of the region nor a reasonable distance to travel for face-to-face communication. The actual physical distance or travel time is much less important than the perceived distances respondents use to categorize interaction with other businesses.


Deleted a question, used in previous studies of data-driven decision-making, that broadly sought to characterize the availability of data for decision-making.


On the question related to difficulty in adopting information technologies, a category on “Difficulty integrating new technologies into the current way of doing business” was added.


Human resources policies


Question about compensation changed to average hourly wage for non-salaried workers.


Question asking about formal training of employees was expanded to include a number of other human resource policies including whether establishment offers health insurance option for full time employees, retirement plan, maternity leave, or has an employee ownership plan.


Community Characteristics


A skip pattern was removed and a “Don’t Know” response category added so that all respondents, rather than only newer establishments, can answer the question about factors important to locating in the community.


Suggestion to add questions related to triple bottom line outcomes (social, economic and environmental) of establishments was not incorporated due to the need to limit respondent burden and the limited connection to the overall objectives of the survey. However, a question was added asking about the level of civic leadership provided by the business to the local community, which provides information on the issue and is closely related to the rural development research objectives of the survey.


Government Programs


Added National Innovation Marketplace to the list of government programs rated on basis of importance/value by respondents.



Question 9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


SESRC will offer token monetary incentives to a subset of the sample frame in conjunction with mixed survey mode implementation as an experimental trial in the pilot study to improve overall response rates to this establishment survey (see Attachment D, fourth letter). The purpose of the incentives and experimental design is to learn more about mixed-mode survey implementation and the use of token incentives with establishment respondents in order to select the mode sequence/token incentive combination that maximizes response rates in the full study. It is important to determine whether these experimental interventions are appropriate and successful in this type of survey environment, as well as whether they influence survey results by mode and by use of cash incentives.


Several aspects of cash incentives can be varied and can affect survey outcomes, including the timing of the incentive, its combination with postage and package variation, the amount of the incentive, the novelty of the incentive, and its interaction with the number of previous contacts. SESRC will conduct carefully implemented and controlled survey methodology experiments in the pilot study. The experimental variables and interactions to be researched will be: 1) survey mode sequencing (main data collection by telephone first vs. mail first); 2) staged offering of the web link with email augmentation; 3) use of novel token monetary incentives (e.g., Jefferson $2), with timed offering of the incentive (early in the advance letter versus staged later with the 1st or 2nd mail questionnaire); and 4) postage class and packaging intervention (first class postage and two-day priority mail; the number of times priority mail is used combined with a cash incentive; and whether this happens early or late in the data collection for the mail sequence). All experimental tests will be comparative. The base control group is group 5, with the telephone mode as the main data collection first; mailings will NOT include cash incentives; all mailings use first class postage (no two-day priority mail); and the web link offering and email augmentation come late in data collection. The explicit timing of these mode sequences is provided in Table 3, Supporting Statement Part B. One-fifth of the pilot respondents will receive no incentive, two-fifths may receive a single incentive of $2 depending on when they respond, and two-fifths could receive an additional incentive of $2, for a total of $4, if they do not respond to the initial incentive. Justification for the use of repeated incentives comes from leverage-saliency theory, where the decision to participate in a survey is thought to be influenced by additive and interactive factors.
Empirical verification of this effect in a survey of physicians’ offices is instructive, as many of the same environmental factors (e.g., gatekeepers) also characterize the target population of businesses (Moore and An 2001).


The nominal value ($2) of the incentive is intended to convey a token of appreciation, with the goal of emphasizing the importance of the research and invoking social exchange and reciprocation. It is also important that the token incentive remain nominal in value so that it is not construed as an economic payment for the time respondents invest in answering the questions. Research with household survey respondents has shown that cash incentives are effective in enhancing response rates, but results are mixed on whether survey answers are significantly influenced by cash incentives and survey modes. These effects have not been well documented for business respondents and establishment surveys, and documenting them is one goal of this survey strategy and planned research design.


The mode sequence selected for the full study will be contingent on findings from the pilot, and the factors surrounding this decision will be fully elaborated in the pilot study assessment report submitted to OMB. Three general outcomes are anticipated: (1) a statistically significant higher response rate for one mode sequence over all others; (2) statistically significant higher response rates for two or more mode sequences over the remaining sequences, without identification of a clear dominant sequence; or (3) failure to discern statistically significant differences in response rates across all mode sequences. In the first and third cases, the mode sequence selected would be the one with the highest response rate or the lowest cost, respectively. In the second case, mode sequences with statistically significant lower response rates would be abandoned, and the survey would be administered by allocating an equal share of potential respondents to the remaining mode sequences.



A power analysis was conducted to determine whether the mode sequence experiment is likely to provide a definitive answer if in fact some mode sequences dominate others with respect to response rate. The lowest-cost mode sequence will be the default if no significant difference in response rate is detected. With an initial sample of 4,000, there will be 800 potential respondents assigned to each mode sequence group. We would like to be able to detect an increase in response rate of at least 5 percentage points, for example, an improvement from 55% to 60%. Unfortunately, the computed power for that effect size is only 0.525; the initial sample size would have to nearly double to produce a test with power of 0.8. However, if the effect size is increased to a 7 percentage point improvement in response rate, power increases to 0.807, and detecting a large improvement of 10 percentage points would yield power of 0.981. Thus, the proposed sample size has a high probability of detecting a medium to large improvement.


The POWER Procedure: Pearson Chi-square Test for Two Proportions

Fixed Scenario Elements
| Distribution               | Asymptotic normal    |
| Method                     | Normal approximation |
| Number of Sides            | 2                    |
| Group 1 Proportion         | 0.60                 |
| Group 2 Proportion         | 0.53                 |
| Group 1 Sample Size        | 800                  |
| Group 2 Sample Size        | 800                  |
| Null Proportion Difference | 0                    |
| Alpha                      | 0.05                 |

Computed Power
| Power | 0.807 |
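The reported power values can be reproduced with a normal-approximation calculation for a two-sided, two-sample test of proportions. This is an illustrative sketch of the standard pooled-variance z-test approximation, not the SAS POWER procedure itself, but it matches the tabulated figures to three decimals:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_power(p1, p2, n, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test of proportions
    with n potential respondents per mode sequence group."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    p_bar = (p1 + p2) / 2
    se0 = sqrt(2 * p_bar * (1 - p_bar) / n)        # SE under H0 (pooled)
    se1 = sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)  # SE under H1
    diff = abs(p1 - p2)
    # Probability of rejecting H0 in the direction of the true difference;
    # the opposite tail is negligible for these effect sizes.
    return NormalDist().cdf((diff - z * se0) / se1)

# Scenarios from the text, 800 per group:
print(round(two_proportion_power(0.60, 0.55, 800), 3))  # 5-point gain:  0.525
print(round(two_proportion_power(0.60, 0.53, 800), 3))  # 7-point gain:  0.807
print(round(two_proportion_power(0.60, 0.50, 800), 3))  # 10-point gain: 0.981
```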



Question 10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

The confidentiality of the Rural Establishment Innovation Survey data is protected under the statutes of U.S. Code Title 18, Section 1905, U.S. Code Title 7, Section 2276, and Title V of the E-Government Act, Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA), (Public Law 107-347). A full statement is provided in mail questionnaires and web surveys confirming that answers to the survey will be kept confidential, and that under no circumstances will identifying information about individuals or their organizations be released to any unauthorized individuals, agencies, or institutions. It will assure respondents that only aggregated statistics will be reported, and that providing answers to any or all questions is strictly voluntary. In addition, detailed disclosures regarding confidentiality will be provided in an advance letter to respondents (see Attachment D), and enumerators will check to ensure that respondents have received and read the letter and disclosures prior to conducting the survey. Enumerators can send a copy of the confidentiality statement to a respondent's email address in real time if requested. ERS will use established procedures for survey storage and disposal to ensure that individual identifiers are protected from disclosure. ERS will also use statistical disclosure limitation methods to ensure that individual identifying information does not appear in any public data product.

ERS and ERS contractors comply with OMB Implementation Guidance, “Implementation Guidance for Title V of the E-Government Act, Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA), (Public Law 107-347)”. In conformance with existing law and Departmental regulations, it is the policy of the ERS that respondent identifiable information collected or maintained by, or under the auspices of, the ERS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that will ensure that the information will be used only for statistical purposes and will be accessible only to authorized persons.

ERS and its contractors will comply with the computer and staff requirements associated with CIPSEA, including PIA compliance and FISMA compliance. The ERS contractors are in the process of obtaining the Authority to Operate under FISMA. Data collection efforts will not begin until ATO is in place.

All ERS contractor project staff, including research interviewers and data collection supervisors, will receive CIPSEA training. They are required to sign a confidentiality statement pledging adherence to CIPSEA rules and protocols designed to maintain the confidentiality of respondent identity and data, under the penalty of a fine of up to $250,000 or imprisonment of up to 5 years, or both.


The report forms for this information collection will give respondents the following assurance of confidentiality:


YOUR RESPONSE IS VOLUNTARY. YOUR ANSWERS TO ALL QUESTIONS ARE CONFIDENTIAL. All information that is provided by participants to the National Survey of Business Competitiveness will only be used for statistical research purposes and reported in summary form. Your name and that of the business you represent will not be connected to your answers in any way. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and other applicable Federal laws, individual responses will be kept confidential and will not be disclosed in identifiable form to anyone outside of the research team. By law, every ERS and SESRC employee is subject to a jail term of up to 5 years, a fine of up to $250,000 or both if he/she discloses any identifiable information about research participants.


Due to the use of the QCEW business registry as the list frame, all persons involved in the production of the REIS at both ERS and SESRC (including all interviewers and technical staff) are required to be designated agents of BLS, to satisfactorily complete BLS confidentiality training, and to swear to uphold the following confidentiality statement:


“I, [designated agent], fully understand my responsibilities to protect confidential information. I will comply with all security requirements and avoid all improper use or disclosure of confidential information.”


Question 11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


This information collection asks no questions of a sensitive nature.


Question 12. Provide estimates of the hour burden of the collection of information. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. This is a two part question and both parts must be addressed. A) Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I. B) Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.



The estimated numbers for the entire study includes:


1) Sample Size: The initial sample is estimated at 64,000 businesses (4,000 for the pilot study and 60,000 for the full study). For the pilot sample, which included telephone prescreening, 60% completed the screener. The proprietary sample will be prescreened for the full study, but the BLS sample will not be, since few establishments in the pilot were found to be ineligible and identifying a specific contact did not substantially improve response rates. Ineligibility for the BLS sample will be determined in the course of data collection, and this percentage will be applied to nonrespondents in calculating response rates. Based on the best contact sequence from the pilot, which will be used in the full study, we expect 8.8% of the initial sample, or 5,297 cases, to be refusals during data collection. Given final response rates from the pilot survey, we expect that an initial sample of 60,000 will result in roughly 17,000 completes.


2) Length: Interview time per respondent completing the full interview, as observed during pilot data collection, averaged 30 minutes (0.50 hours) for telephone completes and 22.2 minutes for web completes. We have no record of the time for mail completes but assume it would be similar to the time for a web complete.


3) Burden hours: The estimated response burden over the initial sample of 4,000 for the pilot study and 60,000 for the full study is 15,208 hours over all respondents, case dispositions, and survey modes.



PILOT STUDY 30 MINUTE INTERVIEW

| Case disposition | % | Estimated Number of Respondents | Responses Annually per Respondent | Total Annual Responses | Est. Avg. Hrs per Resp.* | Est. Total Annual Hrs Resp. Burden | Anticipated Completes |
| Business Phone Screener initial sample                        |     | 4,000 |   |       |       |        |       |
|   Completed screener interviews                               | 80% | 3,200 | 1 | 3,200 | 0.07  | 224    | 3,200 |
|   Attempted interviews (number not completing)                | 20% | 800   | 1 | 800   | 0.04  | 32     |       |
| Main Survey:                                                  |     |       |   |       |       |        |       |
| Phase 1: Business Phone Interview to all respondents          |     |       |   |       |       |        |       |
|   Completed interviews                                        | 47% | 1,504 | 1 | 1,504 | 0.5   | 752    | 1,504 |
| Mail Short Form for Tel. Refusals                             |     |       |   |       |       |        |       |
|   Completed 1-page questionnaire                              | 2%  | 64    | 1 | 64    | 0.1   | 6.4    | 64    |
| Phase 2: Business 1st Follow-up Mail Survey to nonrespondents |     |       |   |       |       |        |       |
|   Completed questionnaires                                    | 9%  | 288   | 1 | 288   | 0.5   | 144    | 288   |
| Phase 3: Business 2nd Follow-up Mail Survey to nonrespondents |     |       |   |       |       |        |       |
|   Completed interviews                                        | 6%  | 192   | 1 | 192   | 0.5   | 96     | 192   |
| Phase 4: Business Web Questionnaire                           |     |       |   |       |       |        |       |
|   Completed questionnaires                                    | 7%  | 224   | 1 | 224   | 0.5   | 112    | 224   |
| Phases 1 to 4: Attempted interviews/contacts                  | 29% | 928   | 1 | 928   | 0.476 | 442    |       |
| Total responding burden                                       |     |       |   |       |       | 1,808¹ | 2,272 |


* Estimates are rounded.

1 The observed burden in the course of pilot data collection was 1,202 hours. Unfortunately, during the course of the pilot data collection 1,208 additional cases that were not authorized were inadvertently released for a total sample size of 5,208. The lower burden than originally estimated was due to respondent preference for more time efficient survey modes, lower burden for nonrespondents as repeated contacts did not substantially increase response rates and thus were less frequent than originally planned, and the number of nonrespondents was significantly higher than anticipated.





REIS Main Study

| Survey phase | Samp Size | Freq | Resp Count | Freq × Count | Min/Resp | Resp Burden Hours | Nonresp Count | Freq × Count | Min/Nonresp | Nonresp Burden Hours | Total Burden Hours |
| Screener for Proprietary Sample                                                         | 3,612  | 1 | 2,167  | 2,167  | 4.2  | 152   | 1,445  | 1,445  | 2.4 | 58    | 209    |
| Advance Letter and Publicity Materials for BLS Sample and Screened Proprietary Sample   | 58,555 | 1 | 58,555 | 58,555 | 2.4  | 2,342 |        |        |     |       | 2,342  |
| Phase 1: Business Mail Survey to all respondents                                        | 58,555 | 1 | 6,120  | 6,120  | 22.2 | 2,264 |        |        |     |       | 2,264  |
| Phase 2: Business 1st Follow-up Mail Survey to nonrespondents                           | 52,435 | 1 | 4,320  | 4,320  | 22.2 | 1,598 |        |        |     |       | 1,598  |
| Phase 3: Web Survey to nonrespondents                                                   | 48,115 | 1 | 4,428  | 4,428  | 22.2 | 1,638 |        |        |     |       | 1,638  |
| Phase 4: Phone Interview to nonrespondents                                              | 43,687 | 1 | 720    | 720    | 30   | 360   |        |        |     |       | 360    |
| Mail Short Form for Tel. Refusals                                                       | 42,967 | 1 | 1,440  | 1,440  | 6    | 144   |        |        |     |       | 144    |
| Attempted interviews/contacts Phases 1 through 4                                        | 41,527 |   |        |        |      |       | 41,527 | 41,527 | 7   | 4,845 | 4,845  |
| Total                                                                                   | 60,000 |   | 17,028 |        |      |       |        |        |     |       | 13,400 |

* Estimates are rounded.




4) Detailed Assumptions: The estimated number of respondents for the full study includes: (1) 60,000 businesses, with roughly 56,000 drawn from the Bureau of Labor Statistics business registry and roughly 4,000 drawn from a proprietary sample for the 5 states not in the BLS sample frame; the proprietary sample will be screened for size eligibility, and based on pilot study experience it is expected that 60% of the proprietary sample will complete the screener and be eligible while 40% will not; (2) of the 58,555 businesses determined to be eligible for the survey, approximately 17,028 are expected to agree to participate in one of the mail/web/phone surveys designed to collect business characteristics and business information related to innovation and business development, and 41,527 will be nonrespondents over all phases of contact; the total burden for nonrespondents assumes multiple attempts totaling 7 minutes on average, based on pilot study findings; (3) during the 1st mail phase of contact, 6,120 businesses, or 10.4% of all contacted, are expected to complete and return the mail questionnaire; (4) nonrespondents not refusing the survey will be sent the second mail questionnaire, and 4,320 businesses, or 7.4%, are expected to complete and return it; (5) after the mail phase, remaining nonrespondents will flow to the next survey mode and will be contacted by telephone; 720 businesses, or 1.2%, are expected to complete the telephone interview, and the telephone contact may generate a few additional web completes; (6) nonrespondents not refusing the survey at the last stage will be sent a short one-page questionnaire to obtain essential information only, which 1,440 businesses, or 2.4%, are expected to complete; and (7) throughout the study a web questionnaire will be offered and available during all phases of data collection, and 4,428 respondents, or 7.6%, are expected to complete questionnaires over the web.
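As a consistency check, the main-study totals follow from the per-phase counts and average per-contact minutes. The sketch below uses the round figures stated above; as in the table, the two screener rows are combined before rounding, and each remaining row's hours are rounded before summing:

```python
# Screener rows (proprietary sample): 2,167 respondents at 4.2 min
# plus 1,445 nonrespondents at 2.4 min, combined before rounding.
screener_hours = (2_167 * 4.2 + 1_445 * 2.4) / 60

# (count, minutes per contact) for the remaining rows of the burden table
phases = [
    (58_555, 2.4),   # advance letter and publicity materials
    (6_120, 22.2),   # Phase 1: mail survey completes
    (4_320, 22.2),   # Phase 2: 1st follow-up mail completes
    (4_428, 22.2),   # Phase 3: web survey completes
    (720, 30),       # Phase 4: telephone interview completes
    (1_440, 6),      # mail short form for telephone refusals
    (41_527, 7),     # attempted contacts with nonrespondents
]

total_hours = round(screener_hours) + sum(round(n * m / 60) for n, m in phases)
completes = 6_120 + 4_320 + 4_428 + 720 + 1_440
print(total_hours, completes)   # 13400 17028
```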



Of the 58,555 businesses contacted for the full study, approximately 7.6% will respond via the web, approximately 1.2% by telephone, and 17.8% will complete a mail questionnaire; in addition, approximately 2% to 3% will return the targeted one-page short form for refusals. Overall, the response rate is expected to be approximately 30%. Estimates of the percentages of respondents who will agree to complete questionnaires are based on experience from the pilot study data collection.


The Economic Research Service will notify the Office of Management and Budget (OMB) of the results of the pilot study before proceeding with the full survey, and submit any changes to the study materials or survey questions to OMB for clearance.  It is anticipated that any changes would be non-substantive, with no upward revision of burden hours, scope or design of the study.


Estimate of Annualized Cost to Respondents:


The representative median wage for Marketing Managers, derived from the Bureau of Labor Statistics Occupational Employment Statistics for May 2011 (http://www.bls.gov/oes/current/oes_nat.htm#11-0000), is $55.78/hr. Marketing Managers were chosen as the representative occupation because they tend to have a very broad knowledge of a business's operation and would make ideal respondents in larger establishments. In the majority of small establishments the respondent will be the owner/chief executive officer, whose wages will tend to be below the national median. Unfortunately, occupational statistics are not available by business size class. Wages paid in metropolitan and nonmetropolitan areas also differ substantially: median wages for Marketing Managers tend to cluster around $40.00/hr in predominantly rural states and around $60.00/hr in predominantly urban states. These values are consistent with a national median of $55.78/hr and with typical urban/rural wage differentials. The annualized cost is computed assuming one-third of the respondent burden is in metropolitan areas and two-thirds is in nonmetropolitan areas.


| Respondent Subsample | Median Wage | Respondent Hours | Annualized Costs |
| Metropolitan         | $60.00/hr   | 5,069            | $304,140         |
| Nonmetropolitan      | $40.00/hr   | 10,139           | $405,560         |
| Total                |             | 15,208           | $709,700         |
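The annualized cost figures follow directly from the stated wage assumptions and the one-third metropolitan / two-thirds nonmetropolitan split of the 15,208 burden hours, as this short sketch of the arithmetic shows:

```python
total_hours = 15_208
metro_hours = round(total_hours / 3)           # one-third metropolitan
nonmetro_hours = total_hours - metro_hours     # two-thirds nonmetropolitan
# $60/hr in predominantly urban states, $40/hr in predominantly rural states
annualized_cost = metro_hours * 60 + nonmetro_hours * 40
print(metro_hours, nonmetro_hours, annualized_cost)   # 5069 10139 709700
```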




Question 13. Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information, (do not include the cost of any hour burden shown in items 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.


There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.



Question 14. Provide estimates of annualized cost to the Federal government. Provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.


The total estimated cost for this survey for fiscal years 2011 through 2013 is $1.833 million, of which the Economic Research Service is providing 50 percent of a GS-14 economist's time for 3 years, estimated at $183,000, and $1.65 million was provided by the USDA Rural Development mission area from 2010 end-of-year funds.


Question 15. Reason for Change in Burden

This is a new data collection.



Question 16. For collections of information whose results are planned to be published, outline plans for tabulation and publication.


Pilot study data collection is expected to begin in July 2013, with full study collection expected to begin in September 2013. Data collection should be completed by February 2014, with cleaned data available for analysis in March 2014. A technical report for internal ERS use will be completed by May 2014. The technical report will tabulate responses to all survey questions for the entire sample and for rural and urban subsamples and provide statistical tests of differences across the subsamples, as discussed in Part B, Question 2. The technical report will form the basis for compiling the first published report from the survey data, with the working title “Rural Innovation at a Glance.” As with other ERS “…at a Glance” reports, the objective is to provide an easily accessible trifold brochure that provides salient information on the topic. The report will answer the three central descriptive questions underlying the data collection:


  1. What percentage of rural establishments in tradable industries introduced product, process or practice innovations in the previous 3 years?

  2. What percentage of self-reported innovative establishments also demonstrates behaviors consistent with substantive innovation?

  3. How do self-reported and ostensibly substantive innovation rates differ by urban/rural location, industry and establishment age?


It is expected that “Rural Innovation at a Glance” will be submitted for clearance by July 2014 and published later in the year.


A much more detailed ERS Economic Research Report will discuss the conceptual framework underlying the classification of substantive innovators; examine the prevalence of innovative activity across settlement types, industry groups and establishment age categories more thoroughly; and provide information on the association between innovative activity and establishment and community characteristics by policy relevant subpopulations. The report will provide the first generalizable information on the phenomenon of rural innovation in the U.S. Findings will be discussed with relevant Federal agencies and international organizations such as NSF’s National Center for Science and Engineering Statistics, the Science, Technology and Innovation Indicators Panel at the National Academies and the Directorate for Science, Technology and Industry at OECD. An ERS cleared draft will be available for these discussions by October 2014 with the revised manuscript submitted for publication clearance by December 2014 with publication anticipated in early 2015.


It is anticipated that these data will provide the evidentiary basis for manuscripts submitted to academic journals focusing on the economic geography of innovation. Possible topics include rural user entrepreneurship, the information networks used by innovators in more remote locations, and the relationship between innovation and the geographic extent of the market. Exploration of possible research topics will begin with the compilation of the technical report in early 2014 with submission of manuscripts anticipated in late 2014 through 2015.


The final planned output from this data collection will examine the association between innovation and employment growth and establishment continuity. This analysis will require linking the REIS dataset with Quarterly Census of Employment and Wages data at the Bureau of Labor Statistics. ERS currently has a Memorandum of Understanding with BLS to link the 1996 Rural Manufacturing Survey with QCEW data to examine factors related to manufacturing employment growth and continuity. ERS will submit a research proposal to the BLS Onsite Researcher Program in October 2015 to begin linking the datasets in 2016. The two-year duration of the MOU will allow examining employment growth and continuity for a two to four year interval.



Question 17. Request to Not Display Expiration Date

The assigned expiration date will be displayed on all collection instruments used in this information collection.


Question 18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.

The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.


Attachments


Attachment A Draft Rural Establishment Innovation Survey (sent out as National

Survey of Business Competitiveness)

Attachment B Final CATI Script

Attachment C Screen shots of the Rural Establishment Innovation Survey Internet

Application

Attachment D Draft Rural Establishment Innovation Survey Letters

Attachment E Draft FAQ and Help Text for Interviewers

Attachment H 60 Day Federal Register Notice

Attachment I 7 USC 2662(b) Programs Authorized Rural Development

Attachment L UNC Comments

Attachment M BEA Comments

Attachment N GWU Comments

Attachment O Comments from Jean Public




References


Dillman, Don A., J.D. Smyth, and L.M. Christian (2009) Internet, Mail, and Mixed Mode Survey: The Tailored Design Method. 3rd Edition, John Wiley & Sons, Inc., Hoboken, New Jersey.


Jensen, J. Bradford, Kletzer, Lori G., Bernstein, Jared, Feenstra, Robert C. (2005) “Tradable Services: Understanding the Scope and Impact of Services Offshoring,” in Brookings Trade Forum: Offshoring White-Collar Work, eds. Susan Margaret Collins and Lael Brainard, pp. 75-133.


Moore, Danna, An, Larry. (2001). “The Effect of Repetitive Token Incentives and Priority Mail on Response to Physician Surveys,” Proceedings of the Annual Meeting of the American Statistical Association, August 5-9, 2001. Accessed 12/4/2012 http://www.amstat.org/sections/srms/proceedings/y2001/Proceed/00186.pdf.


Shah, Sonali K., Smith, Sheryl Winston, Reedy, E.J. (2012) “Who Are User Entrepreneurs? Findings on Innovation, Founder Characteristics and Firm Characteristics.” Kauffman Foundation Report (February), Kauffman Foundation, Kansas City, MO. Accessed 3/14/2012 http://www.kauffman.org/uploadedFiles/who-are-user-entrepreneurs.pdf.





0 The definition of innovation used here has been attributed to Hutch Carpenter. The OECD/Oslo Manual definition used by the National Science Foundation in its Business R&D and Innovation Survey (BRDIS) is similar to the first half of the Carpenter definition though more specific in spelling out the various types of business model or operations changes. “An innovation is the implementation of a new or significantly improved product (good or service), or process, a new marketing method, or a new organizational method in business practices, workplace organisation or external relations.” The OECD /BRDIS definition does not directly incorporate an increase in economic value but elaboration of the concept by the OECD notes the importance of an increase in economic value in measuring innovation. The Carpenter definition was chosen here as it is both more concise and more complete, combining both the concepts of newness or change in the OECD definition with the important qualification that changes result in an increase in economic value.


