Pilot Study Report 13-084

Rural Establishment Innovation Survey (REIS), also known as the National Survey of Business Competitiveness

OMB: 0536-0071



PILOT STUDY DATA REPORT 13-084



National Survey of Business Competitiveness

(Pilot Study)


April 2014


CONFIDENTIAL NOT FOR USE WITHOUT PERMISSION





Prepared for



Timothy Wojan
Resource and Rural Economics Division
Economic Research Service, USDA
(202) 694-5419
twojan@ers.usda.gov


Submitted by


Danna L. Moore, Ph.D., Principal Investigator

Yi Jen Wang, M.A., Study Director

Kent Miller, M.A., Study Director


on behalf of SESRC






PILOT STUDY: NATIONAL SURVEY OF BUSINESS COMPETITIVENESS




APRIL 2014






ERSR10






Submitted by




Danna L. Moore, Ph.D., Principal Investigator

Yi Jen Wang, M.A., Study Director

Kent Miller, M.A., Study Director




Social & Economic Sciences Research Center

PO Box 644014; Wilson-Short Hall 133

Washington State University

Pullman, WA 99164-4014

509-335-1511

509-335-0116 (fax)

SESRC@wsu.edu

SESRC Project Profile


Title: Pilot Study: National Survey of Business Competitiveness (ERSR10)


Objectives: The main purpose of this pilot study is to obtain specific information that will allow SESRC to evaluate and, if necessary, modify the study design for the REIS main study. The results and findings of the pilot study (e.g., on cost, incentive use, and contact sequence) will primarily be used to inform any proposed changes to the main study.


Abstract: USDA's Economic Research Service sponsored a survey of U.S. businesses to examine the challenges firms are facing in today's economy. SESRC sent postal letters describing the study and invited 5,210 respondents to complete a mail questionnaire, an Internet-based questionnaire, or a telephone interview.


Method: Using a Tailored Design Method survey protocol, a mixed-mode (telephone, mail, and web) survey was implemented. The sample was divided into five experimental groups, each receiving a different contact and token-incentive sequence, with different contact methods at different phases.


Results: 623 respondents completed the mail survey, 729 respondents completed or partially completed the web survey, 16 respondents completed the short web survey, and 119 respondents completed or partially completed the telephone interview, yielding a response rate of 28.4%. Group 3 (Web First), which started with a prior letter and a $2 incentive, followed by a mail questionnaire with a $2 incentive, a replacement questionnaire, and telephone reminders, had the highest response rate of the 5 experimental groups.


Timeframe: November 2013 through February 17th, 2014


Sponsor: Timothy Wojan

Resource and Rural Economics Division

Economic Research Service, USDA

355 E Street SW Washington, DC 20024

(202) 694-5419

twojan@ers.usda.gov


Principal Investigator: Danna L. Moore, Ph.D.

Study Directors: Yi Jen Wang, M.A. and Kent Miller, M.A.

Data Manager: Dan Vakoch, M.S.

SESRC Acronym: ERSR10

Data Report Number: 13-084

WSU IRB Number: 12680


Deliverables: Data Report, SAS Frequency listing, CATI script, paper questionnaire, and a copy of the web survey screenshots.

Table of Contents

SESRC Project Profile ............................................................................................................................................... i

I. SURVEY ADMINISTRATION AND DESIGN ..................................................................................................1

Background and Objectives ..................................................................................................................................1

Population and Sample...........................................................................................................................................1

Questionnaire Design..............................................................................................................................................2

Telephone Prescreening ........................................................................................................................................2

II. SURVEY IMPLEMENTATION AND PROCEDURES ..................................................................................4

Human Subjects Review.........................................................................................................................................4

Survey Design and Contact Procedures...........................................................................................................4

Table 1. Pilot Study Contact Sequence ................................................................................................5

Telephone Interviews .............................................................................................................................................7

Table 2. ERSR10 Pilot Study Data Collection Telephone Statistics .........................................7

Refusal Prevention ...................................................................................................................................................7

Table 3. Telephone Refusal Conversion Statistics...........................................................................8

III. CASE DISPOSITION AND RESPONSE RATES ..........................................................................................9

Table 4. Sample Disposition and Response Rates ........................................................................ 11

Table 4 Continued...................................................................................................................................... 12

IV. SURVEY RESULTS ........................................................................................................................................... 13

Completes by Mode............................................................................................................................................... 13

Table 5. Number of Pilot Study Full Survey Completes by Mode........................................... 13

Table 6. Number of Pilot Study Full Survey Completes by Groups ....................................... 13

Table 7. Most and Least Pilot Study Full Survey Completes Mode vs. Groups ................. 14

Response Burden ................................................................................................................................................... 14

Table 8. Pilot Study Full Survey Web Completion and Completion Time Statistics....... 14

Table 9. Pilot Study Short Survey Web Completion and Completion Time Statistics ... 15

Nonresponse Burden ........................................................................................................................................... 15

Table 10. Telephone Nonresponse Burden..................................................................................... 16

V. Pilot Study Final Results ................................................................................................................................ 17

Sample ........................................................................................................................................................................ 17

Table 11. Number of Pilot Study Full Survey Completes and Partial Completes by

Sample Source ............................................................................................................................................. 18

NAICS Code Coverage........................................................................................................................................... 19

Table 12. Number of Completes by NAICS Codes ......................................................................... 19

Table 13. Number of Completes and Partial Completes to the Pilot Full Survey by

Establishment Size .................................................................................................................................... 22

Table 14. Comparison of Pilot Study Sample Frame to Completion by Business Size .. 23

Table 15. Number of Pilot Study Completes by Metro/Non Metro Area for BLS Sample ............................ 23

Telephone Prescreening ..................................................................................................................................... 23

Table 16. Pilot Study Completes Only vs. Prescreening Final Results ................................. 24

Table 17. Pilot Study Prescreening Updates Summary.............................................................. 25

Refusal Short Form Letter.................................................................................................................................. 25

Contact Sequence................................................................................................................................................... 26




Table 18. Response Rate History for All Groups ........................................................................... 26

Chart 1. Response Rate History by Group ....................................................................................... 27

Chart 2. Response Rate Comparison between Group 1 and Group 4 ................................... 28

VI. SESRC’s RECOMMENDATIONS FOR THE FULL STUDY ................................................................... 30

Table 19. Contact Sequence for the Full Study .............................................................................. 32

VII. SURVEY RESULTS.......................................................................................................................................... 35

REFERENCES....................................................................................................................................................... 38

VIII. Full Study Survey Instrument ................................................................................................................. 39

Proposed Advance Letter for Full Study ...................................................................................................... 39



I. SURVEY ADMINISTRATION AND DESIGN

Background and Objectives

The Social and Economic Sciences Research Center (SESRC) at Washington State University (WSU) collaborated with the USDA's Economic Research Service (ERS) to conduct a mixed-mode survey of business establishments to examine the challenges business firms face in today's economy. The study consists of two phases. The first phase is a mixed-mode pilot study with about 5,300 establishment respondents. The goal of the pilot is to evaluate the mixed-mode survey implementation practices in order to determine and select the most effective survey mode sequence and use of token incentive combinations to maximize response rates within the budgeted resources for the full study. The second phase is the full study phase and is scheduled to start in spring 2014.


The main purposes of the pilot are (1) to evaluate the mixed mode survey implementation practices in order to determine and select the most effective survey mode sequence and use of token incentive combinations to maximize response rates within the budgeted resources for the main study; and (2) to inform any proposed changes in the main study based on results of the pilot study.


Information was collected over a 12-week period from November 2013 to February 2014. The findings will contribute to a better understanding of how increasing international competition and the increasing knowledge of economic activity in the U.S. are affecting the economic vitality of rural areas, and of the conditions associated with businesses making effective adjustments to these pressures.


This report describes the final results of the pilot study.



Population and Sample


The population for the pilot study was business establishments with more than five employees in the tradable industries, defined as mining, manufacturing, wholesale trade, transportation and warehousing, information, finance and insurance, professional/scientific/technical services, arts, and management of businesses. While the focus of the survey was on establishments in nonmetropolitan (rural) counties, establishments from metropolitan counties were also sampled in adequate numbers to allow for comparative analysis. Businesses were selected at random from strata defined by establishment size categories, industry codes (NAICS), and metropolitan or nonmetropolitan status of the county.


For the pilot study, the sample includes two sub-components: the sample from the 1996 Rural Business Survey (n=2,493) and the Bureau of Labor Statistics sample (n=2,804). These 5,297 cases were prescreened before the pilot study started to determine whether each firm was still in business or had closed, to update business contact information, and to identify a representative to whom survey contacts could be directed. (The results of the prescreening are reported under separate cover in the confidential SESRC Data Report 13-083.)


Questionnaire Design


SESRC worked collaboratively with Tim Wojan and representatives of the Rural Division of ERS to develop a paper questionnaire for this pilot study. The survey included both closed-ended and open-ended questions. Once the questionnaire was finalized, it was programmed into SESRC's web survey format and data entry program, as well as into the Voxco Computer Assisted Telephone Interviewing (CATI) system.


The paper questionnaires were printed in color on 11" x 17" white paper and stapled together to form a 16-page, 8 ½" x 11" questionnaire, with a large title and multiple pictures on the first page designed to generate interest in the survey.


The final Internet version contained 83 screens, including an introductory screen and a survey completion screen. It contained 254 data points, of which 29 had an open-ended response component to them.


Telephone Prescreening


SESRC prescreened the 1996 Rural Business Survey sample (n=2,493) and the Bureau of Labor Statistics sample (n=2,804) from 9/12/2013 to 10/31/2013. The purpose of this contact was to: 1) determine whether the establishment was still in business, 2) update business contact information (mail address, telephone, email, and web URL), and 3) identify a representative for the establishment to whom survey contacts could be directed. All cases received at least one call attempt for the prescreening.




Overall, from the prescreening SESRC updated 13.0% (n=345) of the business names, 31.9% (n=850) of the telephone numbers, 28.7% (n=765) of the email addresses, 71.0% (n=1,891) of the web URLs, 81.0% (n=2,157) of the contact names, 80.1% (n=2,132) of the contact titles, and 39.1% (n=1,042) of the mail addresses out of the 5,297 total cases. In addition, 32 businesses were found to be no longer in operation and 36 businesses had a company policy not to do surveys; these were removed from the final sample for the pilot study.


A total of 5,210 cases were then divided into 5 different experimental groups for the pilot study.



II. SURVEY IMPLEMENTATION AND PROCEDURES

Human Subjects Review

SESRC submitted the project design and questionnaire to the Institutional Review Board at Washington State University (WSU IRB) for review of procedures for conducting research with human subjects and compliance with federal regulations. The survey procedures and materials were determined to be exempt by the WSU IRB. The study was assigned WSU IRB #12680, and the review was completed on July 20, 2012.



Survey Design and Contact Procedures


SESRC uses Tailored Design Method1 (TDM) survey procedures to conduct surveys. Key elements of TDM survey procedures are carefully designed and timed contacts with respondents. For this survey, respondents in each experimental group received one postal pre-notification letter at the beginning of the study and then received different combinations of telephone, postal, or email contact sequences throughout the data collection period. The pilot study phase included a test of 5 experimental treatments. Table 1 shows the contact sequence for each experimental treatment group.





























1 Dillman, Don A., Jolene D. Smyth, and Leah M. Christian. 2009. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method (3rd Edition). New York: Wiley.


Table 1. Pilot Study Contact Sequence

(Within each phase, cell entries are listed in group column order: Group 1, Mail First | Group 2, Telephone First | Group 3, Web First | Group 4, All Options | Group 5, Control Group. A dash indicates no contact in that phase; QSTR = questionnaire mailing, PC = postcard.)

Prescreen (9/12~10/31): Phone contact | Phone contact | Phone contact | Phone contact | Phone contact
Phase 1 (11/12/2013): Prior Letter (n=1042) | Prior Letter (n=1041) | Prior Letter (n=1042) | Prior Letter (n=1043) | Prior Letter (n=1042)
Phase 2 (11/22/2013): 1st QSTR Mailing (n=1042) | Phone Contact, 1st attempt | 1st Email Contact (n=289) | - | Phone Contact, 1st attempt
Phase 2, Group 4 (11/23/2013): 1st QSTR Mailing (n=1041)
Phase 3 (11/26/2013): 1st Email Reminder (n=317) | Phone Contact Continues | - | 1st Email Reminder (n=345) | Phone Contact Continues
Phase 4 (12/2/2013): - | Phone Contact, 2nd attempt | - | - | Phone Contact Continues
Phase 5 (12/11/2013): 1st PC Reminder (n=898) | Phone Contact Continues | 1st QSTR Mailing (n=945) | 1st PC Reminder (n=886) | Phone Contact Continues
Phase 6 (12/20/2013): 2nd QSTR Mailing (n=497) | Phone Contact Continues | 1st PC Reminder (n=931) | 2nd QSTR Mailing (n=747) | Phone Contact Continues
Phase 7 (12/23/2013): Phone Contact, 1st attempt; -; Phone Contact, 1st attempt
Phase 7, Group 3 (12/30/2013): 1st QSTR Mailing (n=834)
Phase 7, Group 1 (1/2/2014): 2nd QSTR Mailing (n=845); 2nd QSTR Mailing (n=88); 1st QSTR Mailing (n=852)
Phase 8 (01/06/2014): Phone Contact Continues | 1st Email Reminder (n=340) | 2nd QSTR Mailing (n=848) | Phone Contact Continues | 1st Email Reminder (n=381)
Phase 9 (01/08/2014): Phone Contact Continues | 1st PC Reminder (n=796) | - | Phone Contact Continues | 1st PC Reminder (n=826)
Phase 10 (01/14/2014): Phone Contact Continues | 2nd QSTR Mailing (n=780) | Phone Contact Continues | Phone Contact Continues | 2nd QSTR Mailing (n=807)
Phase 11 (01/21/2014): Phone Contact Continues | Refusal Mailing (n=92) | Phone Contact Continues | Phone Contact Continues | Refusal Mailing (n=95)
Phase 12 (01/27/2014): Phone Contact Continues | Phone Contact Continues | Phone Contact Continues | Phone Contact Continues | Phone Contact Continues
Phase 13 (02/03/2014): 2nd Email Reminder (n=270) | 2nd Email Reminder (n=267) | 2nd Email Reminder (n=242) | 2nd Email Reminder (n=245) | 2nd Email Reminder (n=334)
Phase 14 (02/10/2014): Refusal Mailing (n=74) | Phone Contact Continues | Refusal Mailing (n=70) | Refusal Mailing (n=84) | Phone Contact Continues


Telephone Interviews


All groups except Group 3 received an average of three call attempts over the twelve-week data collection period; Group 3 received an average of four call attempts. Call attempts alternated across days of the week and times of day. If an interviewer called at an inconvenient time for the respondent, the interviewer would attempt to schedule a specific time to re-contact the individual for an interview. Eastern and Central time zone cases received dedicated early-morning call attempts (5 a.m. to 7 a.m. PST) so that businesses would be reached between 8 a.m. and 10 a.m. local time, as they were starting work for the day.


All respondents who refused to complete the telephone interview were offered the web survey option.



Table 2. ERSR10 Pilot Study Data Collection Telephone Statistics

Data collection period: 11/18/13 ~ 02/14/14
Average call length (minutes:seconds): 34:14
Number of interviewers trained (including staff): 29
Number of cases monitored: 4
Number of cases spot checked: 27
Percentage of completed interviews monitored: 6%
Average number of call attempts: 3
Completed telephone interviews per hour: 0.14
Average hours to get one complete: 7.17





Refusal Prevention


During the telephone interview, if a respondent refused to do the survey on the phone, they were offered the option of completing the survey online and were immediately sent an email with the survey information and the web link to the questionnaire. After examining the calling records and the cases' final codes, 188 cases out of 751 refusals were converted to complete either a web or mail questionnaire, a refusal conversion rate of 25.03%.


Table 3. Telephone Refusal Conversion Statistics

Calls: 15,033
Refusals: 751
Completes (CM) by phone: 119
CM by mail or web: 188
Ineligible (IE): 82
Hours: 854
CM per hour (total): 0.36
CM per hour (phone): 0.14
CM per hour (mail or web): 0.22





III. CASE DISPOSITION AND RESPONSE RATES




SESRC provides two kinds of response rates for the survey: the cooperation rate and the response rate based on the American Association for Public Opinion Research (AAPOR) guidelines. These calculations are based on the operational definitions and formulas for calculating response rates, cooperation rates, refusal rates, and contact rates on www.aapor.org.


A breakdown of the response rates is given in Table 4, Sample Disposition and Response Rates, on the following pages.

The cooperation rate is the ratio of completed and partially completed2 interviews to the number of completed, partially completed, and refused cases. The formula for AAPOR Cooperation Rate 4 is:

Cooperation Rate 4 = (I + P) / [(I + P) + R]

where I = number of completed interviews, P = number of partially completed interviews, and R = number of refusals.


A 70.8% cooperation rate was achieved for this pilot study as of the time this report was prepared (72.3% for Group 1, 71.6% for Group 2, 76.4% for Group 3, 70.6% for Group 4, and 60.7% for Group 5).


The response rate is the ratio of completed and partially completed interviews to the total eligible sample. This formula is considered one of the industry standards for calculating response rates and complies with the AAPOR Standard Definitions (American Association for Public Opinion Research) Response Rate 4. This calculation removes all ineligible cases from the denominator. The formula is:






2 A completed interview/questionnaire is one in which the respondent answered all or most of the questions, including the last question. A partially completed interview/questionnaire is one in which the respondent did not answer all of the questions and broke off before reaching the last question. A case is considered a partial complete if at least the first three questions were answered.


Response Rate 4 = (I + P) / [(I + P) + (R + NC + O) + e(UH + UO)]

where I = number of completed interviews, P = number of partially completed interviews, R = number of refusals, NC = number of non-contacts, O = other, UH = unknown household, UO = unknown other, and e = the estimated proportion of cases of unknown eligibility that are eligible.


A 28.4% response rate was achieved for this study as of the time this report was prepared (30.6% for Group 1, 29.3% for Group 2, 32.1% for Group 3, 31.1% for Group 4, and 19.0% for Group 5).
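
As a quick illustration of how these AAPOR formulas are applied, the sketch below implements Cooperation Rate 4 and Response Rate 4 in Python. This is not SESRC's tabulation code, and the counts in the example call are hypothetical rather than taken from Table 4; it simply shows how the quantities defined above combine.

    def aapor_response_rate_4(I, P, R, NC, O, UH, UO, e):
        """AAPOR Response Rate 4: (I+P) / [(I+P) + (R+NC+O) + e*(UH+UO)]."""
        return (I + P) / ((I + P) + (R + NC + O) + e * (UH + UO))

    def aapor_cooperation_rate_4(I, P, R):
        """AAPOR Cooperation Rate 4: (I+P) / [(I+P) + R]."""
        return (I + P) / ((I + P) + R)

    # Hypothetical counts for a single experimental group (illustrative only)
    rr4 = aapor_response_rate_4(I=300, P=25, R=115, NC=450, O=1, UH=85, UO=45, e=0.87)
    cr4 = aapor_cooperation_rate_4(I=300, P=25, R=115)
    print(f"Response Rate 4 = {rr4:.1%}, Cooperation Rate 4 = {cr4:.1%}")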


Table 4. Sample Disposition and Response Rates

(Columns: Group 1, Mail First | Group 2, Telephone First | Group 3, Web First | Group 4, All Options | Group 5, Control Group | Overall)

Eligible, Interviewed
Phone Complete (CM): 12 | 51 | 10 | 17 | 29 | 119
Web Complete (WC): 106 | 121 | 180 | 104 | 93 | 604
Mail Complete (MC): 176 | 101 | 114 | 177 | 55 | 623
Ineligible Mail Complete: 4 | 1 | 6 | 7 | 0 | 18
Web Short Version Complete: 0 | 0 | 0 | 3 | 0 | 3
Web Short Version Partial: 3 | 7 | 1 | 0 | 5 | 16
Web Partial Complete: 17 | 28 | 33 | 23 | 24 | 125

Eligible, non-interview
Refusal and break off: 111 | 108 | 92 | 119 | 118 | 548
Web refusal: 3 | 3 | 2 | 5 | 0 | 13
Non-Contact (CB, EB, EM, GB, HB, MB, SB, SG, SH, WB): 186 | 165 | 186 | 181 | 210 | 928
Respondent unavailable (RN): 5 | 3 | 9 | 13 | 12 | 42
Answering Machine (AM, SM): 204 | 218 | 207 | 191 | 249 | 1069
Answering Machine Left Message (LM, SL): 50 | 63 | 38 | 60 | 64 | 275
Physically or mentally unable (DF, HC): 1 | 0 | 0 | 0 | 1 | 2
Language problem (LG, LS): 1 | 0 | 0 | 0 | 0 | 1

Unknown eligibility, non-interview
Always busy (BZ, SZ): 4 | 5 | 7 | 1 | 5 | 22
No answer (NA, SA): 26 | 46 | 29 | 31 | 38 | 170
Call blocking (BC, SC): 6 | 2 | 2 | 3 | 3 | 16
Return to sender: 46 | 39 | 45 | 34 | 48 | 212

Not eligible
Fax/data line (ED, SD): 8 | 2 | 5 | 7 | 8 | 30
Disconnected number (DS): 7 | 6 | 9 | 11 | 17 | 50
Temporarily out of service (CC): 6 | 6 | 3 | 1 | 2 | 18
Wrong Number (WN): 4 | 6 | 6 | 5 | 13 | 34
Missing Phone Number (MP): 9 | 7 | 3 | 7 | 3 | 29
Company has less than 5 employees (IE): 14 | 8 | 19 | 21 | 12 | 70
Business does not operate in the USA (I4): 0 | 0 | 0 | 0 | 1 | 1
Company no longer in business (I3): 6 | 13 | 4 | 6 | 2 | 31
Company policy to not do surveys (CP): 24 | 29 | 33 | 22 | 27 | 135

Table 4 Continued

Other (OT): 8 | 5 | 6 | 2 | 4 | 25
Duplicate (DP): 0 | 3 | 0 | 1 | 0 | 4
Total Sample: 1041 | 1043 | 1042 | 1041 | 1042 | 5209
I = Complete Interviews to Full Survey: 294 | 273 | 304 | 298 | 177 | 1346
P = Partial Interviews to Full Survey: 17 | 28 | 33 | 23 | 24 | 125
R = Refusal and break off: 114 | 111 | 94 | 124 | 118 | 561
NC = Non Contact: 445 | 449 | 440 | 445 | 535 | 2314
O = Other: 2 | 0 | 0 | 0 | 1 | 3
UH = Unknown Households: 82 | 92 | 83 | 69 | 94 | 420
UO = Unknown Other (Mail/Web only): 46 | 39 | 45 | 34 | 48 | 212
e (proportion eligible): 0.87 | 0.88 | 0.87 | 0.88 | 0.86 | 0.87
Response Rate 1 = I / [(I+P) + (R+NC+O) + (UH+UO)]: 29.80% | 28.10% | 31.40% | 30.70% | 18.10% | 27.60%
Response Rate 2 = (I+P) / [(I+P) + (R+NC+O) + (UH+UO)]: 30.10% | 28.80% | 31.50% | 30.70% | 18.60% | 28.00%
Response Rate 3 = I / [(I+P) + (R+NC+O) + e(UH+UO)]: 30.30% | 28.60% | 32.00% | 31.10% | 18.50% | 28.10%
Response Rate 4 = (I+P) / [(I+P) + (R+NC+O) + e(UH+UO)]: 30.60% | 29.30% | 32.10% | 31.10% | 19.00% | 28.40%
Cooperation Rate 1 = I / [(I+P) + R + O]: 71.20% | 69.80% | 76.20% | 70.60% | 58.80% | 69.90%
Cooperation Rate 2 = (I+P) / [(I+P) + R + O]: 71.90% | 71.60% | 76.40% | 70.60% | 60.50% | 70.70%
Cooperation Rate 3 = I / [(I+P) + R]: 71.50% | 69.80% | 76.20% | 70.60% | 59.00% | 70.00%
Cooperation Rate 4 = (I+P) / [(I+P) + R]: 72.30% | 71.60% | 76.40% | 70.60% | 60.70% | 70.80%
Ineligible Rate: 1.92% | 2.01% | 2.21% | 2.59% | 1.44% | 1.96%




IV. SURVEY RESULTS




Completes by Mode


There are 623 mail completes, 604 web completes, and 119 phone completes out of the pilot study sample of 5,210 cases. The majority (91.2%) of the completes came from either mail or web questionnaires, with only 8.8% from telephone interviews. (See Table 5, Number of Pilot Study Full Survey Completes by Mode.)


Table 5. Number of Pilot Study Full Survey Completes by Mode

Mode: Frequency | Percent
Mail Completes: 623 | 46.3%
Web Completes: 604 | 44.9%
Phone Completes: 119 | 8.8%
Total: 1,346 | 100%




Group 3 has the most completes (n=304) overall, followed by Group 4 (n=298), Group 1 (n=294), Group 2 (n=273), and Group 5 (n=177).


Table 6. Number of Pilot Study Full Survey Completes by Groups

(Columns: Mail Completes (# / %) | Web Completes (# / %) | Phone Completes (# / %) | Total Completes (# / %))

Mail First, Group 1: 176 (13.1%) | 106 (7.9%) | 12 (0.9%) | 294 (21.8%)
Mail First, Group 4: 177 (13.2%) | 104 (7.7%) | 17 (1.3%) | 298 (22.1%)
Web First, Group 3: 114 (8.5%) | 180 (13.4%) | 10 (0.7%) | 304 (22.6%)
Phone First, Group 2: 101 (7.5%) | 121 (9.0%) | 51 (3.8%) | 273 (20.3%)
Phone First, Group 5: 55 (4.1%) | 93 (6.9%) | 29 (2.2%) | 177 (13.2%)
Overall: 623 (46.3%) | 604 (44.9%) | 119 (8.8%) | 1,346 (100%)



When looking at the number of completes by mode, Group 4 and Group 1 have the most mail completes (n=177 and n=176, respectively) while Group 5 has the fewest mail completes (n=55). Group 3 has the most web completes (n=180) while Group 5 has the fewest web completes (n=93). Group 2 has the most telephone completes (n=51) while Group 3 has the fewest telephone completes (n=10).


Table 7. Most and Least Pilot Study Full Survey Completes, Mode vs. Groups

Mail Completes: most, Group 4 (n=177) and Group 1 (n=176); least, Group 5 (n=55)
Web Completes: most, Group 3 (n=180); least, Group 5 (n=93)
Phone Completes: most, Group 2 (n=51); least, Group 3 (n=10)





Response Burden



Table 8 displays the number of web completes by treatment group, the average completion time, and the median completion time after correction for timed-out or error cases. The adjusted average time for a web complete ranges from 22:45 minutes to 27:47 minutes. It should be noted that the telephone-first group took less time on average and at the median; telephone-group respondents may have completed a small portion of the survey by telephone, elected to abandon the telephone interview, and finished the survey over the web. The overall median time was 23:41 minutes; half of the responses took less than this and half took more.


Table 8. Pilot Study Full Survey Web Completion and Completion Time Statistics

Group: # completes | Average time | Median time | Max. time | Min. time
Group 1 - Mail First: 106 | 0:24:19 | 0:23:21 | 0:48:56 | 0:10:11
Group 2 - Telephone First: 121 | 0:22:45 | 0:22:23 | 0:43:19 | 0:11:02
Group 3 - Web First: 180 | 0:27:47 | 0:26:02 | 1:24:56 | 0:10:12
Group 4 - All Options: 104 | 0:25:01 | 0:24:31 | 1:07:54 | 0:12:20
Group 5 - Control Group: 93 | 0:24:49 | 0:22:52 | 0:46:50 | 0:17:21
Total: 604 | 0:25:47 | 0:23:41 | 1:24:56 | 0:10:11




A short web questionnaire with only nine questions was developed for the telephone refusal cases. The goal of this short questionnaire was to obtain essential information from respondents who refused to complete the telephone interview. Close to the end of data collection, the refusals from telephone contacts were sent a postal letter with the web link to this short web questionnaire, inviting them to complete the five-minute survey.


Table 9. Pilot Study Short Survey Web Completion and Completion Time Statistics

Group: # completes | Average time | Median time | Max. time | Min. time
Group 1 - Mail First: 3 | 0:03:34 | 0:04:01 | 0:04:37 | 0:02:04
Group 2 - Telephone First: 7 | 0:05:40 | 0:03:29 | 0:20:31 | 0:01:56
Group 3 - Web First: 1 | 0:02:38 | 0:02:38 | 0:02:38 | 0:02:38
Group 4 - All Options: 0 | - | - | - | -
Group 5 - Control Group: 5 | 0:03:12 | 0:03:01 | 0:03:43 | 0:02:42
Total: 16 | 0:04:18 | 0:03:18 | 0:20:31 | 0:01:56



It takes 7.17 hours on average to get one telephone interview (see Table 2), and a completed telephone interview averages about 34:14 minutes, while it takes about 25:47 minutes to complete a full web questionnaire and 4:18 minutes to complete a short web questionnaire. (See Table 8, Pilot Study Full Survey Web Completion and Completion Time Statistics.)


There was no information on the average time to complete a mail questionnaire; therefore, the response burden was not calculated for the mail questionnaire.
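
To show how these per-complete times roll up into total respondent burden by mode, the back-of-the-envelope Python sketch below multiplies the number of completes by the average completion time for each mode, using only the counts and average times reported above (mail is omitted, as just noted). It is illustrative only and is not an official burden estimate.

    # Rough aggregation of respondent-burden hours by mode, using the counts and
    # average completion times reported above (phone: 119 completes at ~34:14;
    # full web: 604 at ~25:47; short web: 16 at ~4:18).

    def mmss_to_minutes(mmss: str) -> float:
        minutes, seconds = mmss.split(":")
        return int(minutes) + int(seconds) / 60

    modes = {
        "telephone": (119, "34:14"),
        "web (full)": (604, "25:47"),
        "web (short)": (16, "4:18"),
    }

    for mode, (completes, avg_time) in modes.items():
        hours = completes * mmss_to_minutes(avg_time) / 60
        print(f"{mode}: ~{hours:.0f} respondent-hours")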


Nonresponse Burden


The average length of refusal calls is recorded in Table 10. On average, all groups except Group 3 received three call attempts during the data collection period; Group 3 received an average of four call attempts. One last call attempt was added for Group 3, which had the highest response rate, closer to the end of data collection in order to increase the response rate further. The overall telephone nonresponse burden is 0.243 hours across all groups.


Table 10. Telephone Nonresponse Burden

Refusal call length: Seconds | Hours
1st attempt: 128.53 | 0.036
2nd attempt: 157.57 | 0.044
3rd attempt: 152.01 | 0.042
4th attempt: 119.78 | 0.033
On average RP length: 141.81 | 0.039
Sum of 3 attempts: 438.11 | 0.122
Sum of 4 attempts: 557.89 | 0.155




V. Pilot Study Final Results




Sample


The pilot study results for all completes and partial completes to the full interview, for the two sample source groups (BLS and 1996 RBUS), show an almost even split: 52.22% from the 1996 source and 47.78% from the BLS source. Table 11 shows that there were no significant differences in the number of completes across the five groups. Overall, both the 1996 and BLS sample groups responded comparably to the survey treatments, as indicated by the percentages of response in each group.


Table 11. Number of Pilot Study Full Survey Completes and Partial Completes by Sample Source

(Cell entries: Frequency / Percent / Row Pct / Col Pct)

GROUP: 1996 | BLS | Total
Group 01, Mail First: 159 / 10.85 / 51.29 / 20.78 | 151 / 10.31 / 48.71 / 21.57 | 310 / 21.16
Group 02, Telephone First: 154 / 10.51 / 51.51 / 20.13 | 145 / 9.90 / 48.49 / 20.71 | 299 / 20.41
Group 03, Web First: 172 / 11.74 / 51.19 / 22.48 | 164 / 11.19 / 48.81 / 23.43 | 336 / 22.94
Group 04, All Options: 168 / 11.47 / 52.50 / 21.96 | 152 / 10.38 / 47.50 / 21.71 | 320 / 21.84
Group 05, Control Group: 112 / 7.65 / 56.00 / 14.64 | 88 / 6.01 / 44.00 / 12.57 | 200 / 13.65
Total: 765 / 52.22 | 700 / 47.78 | 1,465 / 100.00 (see footnote 3)














3 This final number does not include 6 missing cases. The total number of completes and partial completes to the full survey is 1,471; however, six respondents removed the ID from their paper questionnaire, so they cannot be linked back to their sample information and are therefore not included in this table.






NAICS Code Coverage




The occurrence of all NAICS codes included in this survey is shown in Table 12 at the 2-digit level. Wholesale Trade (42) has the highest response rate (19.18%) in the BLS sample, and Manufacturing (33.58%) has the highest response rate overall. The only NAICS sectors demonstrating serious nonresponse problems in the overall sample are Agriculture (11) and Other Services (81); however, these sectors are not included in the study population for the main study.



Table 12. Number of Completes by NAICS Codes4


































4 NAICS: 11 Agriculture; 21 Mining; 31 Manufacturing; 32 Manufacturing; 33 Manufacturing; 42 Wholesale Trade; 48 Transportation; 51 Information; 52 Finance and Insurance; 54 Professional/Technical Services; 55 Management of Companies and Enterprises; 71 Arts, Entertainment and Recreation; 81 Other Services.


(Cell entries: Frequency / Percent / Row Pct / Col Pct)

NAICS2: 1996 | BLS | Total
11: 4 / 0.27 / 100.00 / 0.52 | 0 / 0.00 / 0.00 / 0.00 | 4 / 0.27
21: 2 / 0.14 / 12.50 / 0.26 | 14 / 0.96 / 87.50 / 2.00 | 16 / 1.09
31: 134 / 9.15 / 82.72 / 17.52 | 28 / 1.91 / 17.28 / 4.00 | 162 / 11.06
32: 209 / 14.27 / 75.18 / 27.32 | 69 / 4.71 / 24.82 / 9.86 | 278 / 18.98
33: 407 / 27.78 / 83.40 / 53.20 | 81 / 5.53 / 16.60 / 11.57 | 488 / 33.31
42: 0 / 0.00 / 0.00 / 0.00 | 128 / 8.74 / 100.00 / 18.29 | 128 / 8.74
48: 0 / 0.00 / 0.00 / 0.00 | 65 / 4.44 / 100.00 / 9.29 | 65 / 4.44
51: 4 / 0.27 / 10.00 / 0.52 | 36 / 2.46 / 90.00 / 5.14 | 40 / 2.73
52: 0 / 0.00 / 0.00 / 0.00 | 25 / 1.71 / 100.00 / 3.57 | 25 / 1.71
54: 2 / 0.14 / 1.48 / 0.26 | 133 / 9.08 / 98.52 / 19.00 | 135 / 9.22
55: 0 / 0.00 / 0.00 / 0.00 | 87 / 5.94 / 100.00 / 12.43 | 87 / 5.94
71: 0 / 0.00 / 0.00 / 0.00 | 34 / 2.32 / 100.00 / 4.86 | 34 / 2.32
81: 3 / 0.20 / 100.00 / 0.39 | 0 / 0.00 / 0.00 / 0.00 | 3 / 0.20
Total: 765 / 52.22 | 700 / 47.78 | 1,465 / 100.0


Table 13 shows the number of completes and partial completes to the full survey by establishment size. The larger establishments, those with more than 100 employees, are underrepresented: 6.8% of the BLS sample for the pilot study consisted of larger establishments, but only 4.3% of the BLS completes came from them.




Table 13. Number of Completes and Partial Completes to the Pilot Full Survey by Establishment Size

(Cell entries: Frequency / Percent / Row Pct / Col Pct)

Establishment size: 1996 | BLS | Total
5: 168 / 11.47 / 26.21 / 21.96 | 473 / 32.29 / 73.79 / 67.57 | 641 / 43.75
20: 357 / 24.37 / 64.44 / 46.67 | 197 / 13.45 / 35.56 / 28.14 | 554 / 37.82
100: 240 / 16.38 / 88.89 / 31.37 | 30 / 2.05 / 11.11 / 4.29 | 270 / 18.43
Total: 765 / 52.22 | 700 / 47.78 | 1,465 / 100.0




Table 14 provides the distributions of establishment sizes in the 1996, BLS, and overall sample frames, as well as their respective distributions in the pilot study results. One concern is that larger businesses are not responding well under any of the protocols: they responded at a much lower rate than was experienced in the 1996 RBUS survey, and at a lower rate in the BLS sample, compared to their shares of the respective sample frames.


Table 14. Comparison of Pilot Study Sample Frame to Completion by Business Size

(Left columns: Pilot Sample Frame Composition, Overall / 1996 / BLS. Right columns: Pilot Study Completes, Overall / 1996 / BLS.)

5: 43.34% / 17.18% / 66.58% | 44.14% / 22.42% / 67.92%
20: 34.57% / 43.29% / 26.82% | 37.42% / 46.11% / 27.89%
100: 22.09% / 39.53% / 6.60% | 18.44% / 31.46% / 4.18%
Number: 5297 / 2451 / 2759 | 1164 / 785 / 717





Table 15 shows the number of completes by metro or non-metro area from the BLS sample only. The distribution of completes between metro and non-metro areas is consistent with the focus of the study on rural areas.


Table 15. Number of Pilot Study Completes by Metro/Non Metro Area for BLS Sample

Area: Sample composition # (%) | Pilot Completes # (%)
Non Metro: 1,805 (65.39%) | 529 (73.78%)
Metro: 955 (34.61%) | 188 (26.22%)
Total: 2,759 | 717




Telephone Prescreening




A total of 868 cases, or 64%, of the completed surveys in the pilot study had completed the telephone prescreening; 481 cases, or 35%, had other prescreening dispositions (see Table 16). These other results included 364 cases (27%) with an eligible but non-interview disposition (e.g., refusals, call backs, language problems, answering machine), 49 cases (3.6%) with unknown eligibility and no interview (e.g., always busy, no answer, call blocking), and 65 cases (4.8%) that were not eligible (e.g., fax line, disconnected number, temporarily out of service, wrong number, missing phone number) at the telephone prescreening.


Table 16. Pilot Study Completes Only vs. Prescreening Final Results

Prescreening Results: Number | Percent
Completes or partial completes: 868 | 64.48%
Eligible, non-interview: 364 | 27.04%
Unknown eligibility, non-interview: 49 | 3.64%
Not eligible: 65 | 4.82%
Total: 1,346 | 100%


Telephone prescreening can be helpful in obtaining updated contact information, and a successful prescreening contact would provide a contact person's name and an email address. Of those completing the prescreening, 81% provided a contact name (see Table 17).


Table 17. Pilot Study Prescreening Updates Summary

(Cell entries: n, %; by sample type)

Item updated: BLS | 1996 | Overall
Business name: - | - | 345, 13.0%
Phone number: - | - | 850, 31.9%
Email: 470, 38.9% | 295, 20.3% | 765, 28.7%
URL: 377, 31.2% | 289, 19.9% | 1891, 71.0%
Contact name: 961, 79.5% | 1196, 82.3% | 2157, 81.0%
Contact title: 983, 81.3% | 1149, 79.1% | 2132, 80.1%
Address: - | - | 1042, 39.1%
Total Number: 1209, 100% | 1453, 100% | 2662, 100%







Refusal Short Form Letter




A refusal letter with a link to a shortened questionnaire was sent at the end of data collection to respondents from all groups who refused to participate during the telephone contacts. Out of the 415 letters sent, only 15 respondents completed the shortened questionnaire, a response rate of 3.6%.


Contact Sequence




Table 18. Response Rate7 History for All Groups

Date: Group 1 | Group 2 | Group 3 | Group 4 | Group 5 | Overall
11/22/2013: - | - | 6.4% | - | - | -
11/27/2013: 1.90% | 5.20% | 8.10% | 2.40% | 0.10% | 3.50%
12/03/2013: 3.40% | 6.15% | 8.80% | 4.00% | 0.50% | 5.99%
12/09/2013: 9.50% | 6.65% | 9.10% | 10.50% | 4.15% | 7.33%
12/20/2013: 12.90% | 9.30% | 13.50% | 15.80% | 7.97% | 11.92%
01/03/2014: 17.00% | 9.97% | 20.40% | 20.20% | 8.16% | 15.04%
01/08/2014: 19.74% | 12.76% | 21.53% | 22.40% | 9.59% | 17.12%
01/16/2014: 24.09% | 15.29% | 24.79% | 25.40% | 12.38% | 19.67%
01/24/2014: 26.38% | 21.59% | 28.79% | 27.77% | 13.78% | 23.68%
02/03/2014: 28.56% | 26.69% | 30.87% | 29.85% | 17.98% | 25.31%
02/10/2014: 29.95% | 28.78% | 32.16% | 31.82% | 19.18% | 28.26%
02/17/2014: 30.6% | 29.3% | 32.1% | 31.1% | 19.0% | 28.4%






























7 Note that the response rate history uses raw response rates calculated during the data collection period; a full examination of case dispositions was done after data collection was completed. During the data collection period, the raw response rates did not account for potential non-complete or partial-complete (PC) cases, and no data correction had been applied.

Chart 1. Response Rate History by Group

[Line chart titled "Response Rate History by Groups"; vertical axis: response rate, 0% to 100%; series: Group 1, Group 2, Group 3, Group 4, Group 5, Overall. See Table 18 for the underlying values.]




After the prescreening stage, the protocols that were mail first in sequence (Groups 1, 3, and 4, at 30.6%, 32.1%, and 31.1%, respectively) were outperforming the telephone-first groups (Group 2 and Group 5, at 29.3% and 19.0%). The mail-first groups brought more completed questionnaires into the study early on.


The two groups with the highest response rates are Group 3 (32.1%) and Group 4 (31.1%). What these two groups had in common was that they used token cash incentives early in the mailing protocol, and both protocols used cash incentives twice. Group 3 used a $2 token cash incentive in the pre-notice letter and again in the first questionnaire mailing, combined with two-day Priority Mail postage; the second questionnaire mailing for Group 3 did not include an incentive and was sent via first-class post. The distinguishing characteristic of Group 4 was the use of a token cash incentive combined with two-day Priority Mail postage (higher-class postage and packaging) in both questionnaire mailings. Group 4 also had the most mail completes of any group.


Group 1 had the third highest response rate (30.6%), only 0.5% lower than Group 4. Group 1 and Group 4 had almost identical contact sequences, except that the first questionnaire mailing for Group 1 was sent by first-class mail instead of two-day Priority Mail. In preliminary results, the numbers of mail completes and web completes for Group 1 and Group 4 were almost identical, with 176 mail completes and 106 web completes from Group 1 and 177 mail completes and 104 web completes from Group 4. (See Table 6.)


Although Group 4 had the second highest response rate, its protocol is more expensive than Group 1's, with an almost identical outcome. Using two-day Priority Mail for both questionnaire mailings did not appear to have a significant impact on the number of completes compared with using first-class postage and a single two-day Priority Mail questionnaire mailing. Chart 2 shows the response rate comparison between Group 1 and Group 4 through February 2014.



Chart 2. Response Rate Comparison between Group 1 and Group 4

[Line chart titled "Group 1 and Group 4 Response Rate Comparison"; vertical axis: response rate, 0% to 35%; series: Group 4, Group 1.]




These outcomes are consistent with previous findings in the establishment survey literature: cash incentives are most effective when delivered early in the survey contacts, and cash incentives are more effective when attached to questionnaire mailings. Although the literature shows that cash incentives combined with higher-class postage (Priority Mail) are more effective, in the pilot study this combination made a difference of less than 3% compared with cash incentives combined with first-class mailing. Looking at the response rate history for Group 1 and Group 4, Group 1 had a response rate of 12.9% and Group 4 of 15.8% on the day the second questionnaires were sent; in the final response rates, Group 4 led by less than 1%. The Priority Mail protocol is about six times more expensive than a first-class mailing (Priority: $5.60 per case vs. First Class: $0.91 per case).



Interestingly, Group 3, with the $2-incentive pre-notice letter with a web link followed quickly by an email reminder with a clickable link and access code (email augmentation), had almost double the number of web completes compared to the other groups. Group 3 reached a response rate of 13% just by sending out a pre-notice letter and an email reminder to the non-respondents with an email address, which was about 30% of the total Group 3 sample. This finding suggests that a pre-notice letter with an incentive, combined with email augmentation where an email address is available, is very effective and has a cost-saving advantage: this strategy drives respondents to the web and thus reduces questionnaire mailing postage costs, data entry costs, and follow-up telephone interviewing costs.


As of March 2014, the treatment groups have largely evened out and are relatively equal in the number of completes. The only group with a notably lower response rate was Group 5, which did not include any cash incentives in its mailings, all of which were sent via first-class mail.



VI. SESRC's RECOMMENDATIONS FOR THE FULL STUDY




1. Increase sample size



Due to the lower response rates observed in the pilot study, SESRC recommends adding more sample for the full study in order to meet the study goal of 17,000 completes. A starting sample size of 60,000 was proposed for the full study under the assumption that the response rate for the full study will be similar to the pilot study results, around 33%, with an ineligible rate of 3.2%.
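
As a rough arithmetic check on this recommendation, the short Python sketch below uses only the planning figures quoted above (a 33% response rate, a 3.2% ineligible rate, and the 17,000-complete target); it is illustrative, not part of the formal sample design.

    # Back-of-the-envelope check of the recommended full-study sample size.
    target_completes = 17_000
    response_rate = 0.33      # assumed full-study response rate among eligible cases
    ineligible_rate = 0.032   # assumed share of sampled cases that are ineligible

    required_sample = target_completes / (response_rate * (1 - ineligible_rate))
    print(f"Minimum starting sample under these assumptions: {required_sample:,.0f} cases")

    expected_completes = 60_000 * (1 - ineligible_rate) * response_rate
    print(f"Expected completes from a 60,000-case sample: {expected_completes:,.0f}")

Under these assumptions a starting sample of roughly 53,000 cases would be the minimum, so 60,000 leaves a margin if the full-study response rate comes in below 33%.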


2. Oversample large establishments



SESRC recommends oversampling large businesses for the BLS sample due to their low response rate in the pilot study.


3. Prescreen the proprietary sample only


Since the pool of successfully prescreened establishments had a cooperation rate of 100% by definition, the higher response rate for cases with an identified contact merely reflects a cooperation rate higher than that of the sample as a whole. In addition, the assumption in the original plan that phone-intensive contact would generate a large share of completes and significantly reduce nonresponse did not prove to be the case. The effective strategy suggested by the pilot study is a much larger mail/web distribution, with phone contact used to complete a small share of surveys and to direct respondents to the web more effectively; repeated phone contacts showed limited ability to significantly reduce nonresponse. Telephone prescreening in the pilot only marginally improved completion rates, and this marginal benefit will be substantially reduced in the mode sequence adopted for the full study, which does not rely on phone-first contact. Given the increase in time and cost burden for prescreening, and the doubling of the sample size, it is recommended that prescreening not be done for the BLS sample.


However, eligibility will be a concern for the part of the main sample drawn from a proprietary sample frame. SESRC did not have the SSI sample in the pilot study, so the prescreening effort will provide experience with this new type of sample frame for the full study.


Therefore, SESRC recommends prescreening the proprietary (commercial SSI) sample for the full study.


4. Change Incentives to $1 instead of $2



SESRC recommends changing the incentive amount from the originally approved two payments of $2 to two payments of $1 in order to compensate for the larger sample size. The proposed full study will start with an advance letter including a survey web link and a $1 incentive. The goal is to have as many respondents as possible complete the web survey before the paper questionnaires are sent out. Another $1 incentive will be included with the first questionnaire mailing. Reducing the incentive from $2 to $1 will allow all sample cases to receive incentives in both the advance letter and the first questionnaire mailing, if necessary.


5. Use web first (Group 3) contact sequence for full study


SESRC recommends using the procedures (Group 3, Web First) that obtained the highest response rate in the pilot study. Based on the pilot study results, the best way to combine the most effective and cost-effective elements is shown in Table 19.


Because of the extra sample (30,000 cases) needed for the full study, it is anticipated that the remaining budget will not be enough to cover two-day Priority Mail postage for all cases in the full study. If obtaining 17,000 completes is necessary, the cost saved by reducing incentives and avoiding high-cost postage across the entire sample will help make the larger sample size possible. Since the Priority Mail treatment did not make a significant difference in response compared to the other treatment groups during the pilot study, we recommend the use of first-class postage rather than Priority Mail postage.


Telephone will be used early to prescreen the proprietary (SSI) sample and to contact businesses without a sufficient address in the sample. Telephone reminders will be conducted after the two questionnaire mailings are done, followed by an email reminder to non-respondents with an email address obtained from the previous telephone contacts. A refusal conversion letter to telephone refusal cases inviting them to complete the short web survey will be sent at the end of data collection as the last push to increase response rate.


Table 19. Contact Sequence for the Full Study

Sample: SSI sample (n=3,619); BLS sample (N=56,381)
Prescreen: Telephone contact for the SSI sample
Phase 1: Advance letter with survey web link & $1
Phase 2: 1st questionnaire with survey web link and $1, via first-class mail
Phase 3: Thank-you postcard
Phase 4: 2nd questionnaire with survey web link, via first-class mail
Phase 5: Telephone contact
Phase 6: Email reminder
Phase 7: Refusal short form letter



6. Use the same survey instrument



The results from the pilot study showed that the current questionnaire worked well for all three modes. It is SESRC's recommendation that the full study maintain the same questionnaire for all modes (web, mail, and telephone). SESRC recommends adding one question (see Q52) at the end of the questionnaire for the full study, asking whether the respondent agrees to be contacted in the future if we have questions regarding their answers. This is commonly done in business surveys and will serve as a precaution to help ensure data quality if any data are in doubt after collection.

50. What is your gender?

○ Male
○ Female


51. How long have you worked at this business?

_____ number of years worked


52. Could we contact you again in the future if we have questions or need additional information about your answers?

○ Yes, by email → Email address
○ Yes, by phone → Phone number
○ Yes, by mail → Mailing address
○ No


53. If you have any additional comments about this survey or innovation in general, please write them in the box below.

























Thank you!!

Please return your completed questionnaire in the envelope provided or to:


National Survey of Business Competitiveness Social & Economic Sciences Research Center Washington State University

PO Box 641801

Pullman, WA 99164-1801








VII. SURVEY RESULTS




The frequency listings of the pilot study survey results are included in Appendix A. The open-ended comments are included in Appendix B.

The frequency listings most relevant to the research objectives of the main study are the self-reported innovation rates from Question 27, reproduced below:



Q27A. In the past 3 years, did this business produce any new or significantly improved goods?
Yes: 872 (73.71%); No: 311 (26.29%); cumulative total 1,183 (100.00%)
Not included in percentages: Don't know 1; Missing 17; Not Applicable 178; Skipped 129 (Frequency Missing = 325)

Q27B. In the past 3 years, did this business provide any new or significantly improved services?
Yes: 739 (61.89%); No: 455 (38.11%); cumulative total 1,194 (100.00%)
Not included in percentages: Don't know 2; Missing 20; Not Applicable 163; Skipped 129 (Frequency Missing = 314)

Q27C. In the past 3 years, did this business introduce new or significantly improved methods of manufacturing or producing goods or services?
Yes: 685 (60.89%); No: 440 (39.11%); cumulative total 1,125 (100.00%)
Not included in percentages: Don't know 1; Missing 19; Not Applicable 234; Skipped 129 (Frequency Missing = 383)


The relatively high innovation rates, owing either to social desirability bias or to differences in the interpretation of "new or significantly improved," were anticipated in the Supporting Statement. We examine the likely effectiveness of auxiliary questions in differentiating substantive innovators from nominal innovators by computing associations among these variables and performing a preliminary cluster analysis. If the auxiliary variables are only weakly associated, and/or if the cluster analysis is unable to identify highly distinctive groups, then it is unlikely that the latent class analysis used in the main study will be able to identify distinct subpopulations in the sample that are useful for differentiating innovators.


Descriptive statistics for the relevant items of the auxiliary questions are reproduced below:












Variable | N | Mean | Std Dev | Label
Q13A_r | 1508 | 0.49735 | 0.50016 | Training requirements documented?
Q13B_r | 1508 | 0.44761 | 0.49741 | Track training completions?
Q16_r | 1508 | 0.44430 | 0.49705 | Use computers on a daily basis?
Q24_r | 1508 | 0.45557 | 0.49819 | Document good work practices?
Q25_r | 1508 | 0.42241 | 0.49411 | Monitor customer satisfaction?
Q26_r | 1508 | 0.51658 | 0.49989 | Processes changed customer complaints?
Q28A_r | 1508 | 0.21883 | 0.41359 | Innovation activities abandoned?
Q34D_r | 1508 | 0.23276 | 0.42273 | Fund additional innovation projects
Q37D_r | 1508 | 0.25199 | 0.43430 | Trade secret protections







Questions Q13A through Q26 are indicators of the extent to which data drives decision-making within the establishment. The relatively high share of establishments answering these questions affirmatively suggests that these variables by themselves may not be effective in identifying the subset of substantive innovators among self-reported innovators. In contrast, not more than a quarter of respondents answered any of the last three questions in the list affirmatively. In addition, all three of those questions are thought to have a much more direct link to substantive innovation activities within the establishment. We will have more faith in the potential value of the data-driven decision-making variables (Q13A-Q26) for differentiating substantive from nominal innovators if they tend to be strongly correlated with Q28A, Q34D, or Q37D.


All of the associations between the listed auxiliary variables are significant. The strength of these associations is demonstrated for Q34D (would surplus funds be used to fund additional innovation projects?) with the data-driven decision-making variables. The strength of association is most easily interpreted as an odds ratio from the estimation of relative risk between pairs of binary variables. Respondents answering Q34D affirmatively would be roughly twice as likely to answer the data-driven decision-making variables affirmatively, with the exception of Q16.



Associations with Q34D

Statistic: Q13A | Q13B | Q16 | Q24 | Q25 | Q26
Tetrachoric Correlation: 0.2511 (0.0427) | 0.2475 (0.0427) | 0.174 (0.0437) | 0.3528 (0.0407) | 0.3311 (0.0412) | 0.3614 (0.0408)
Odds Ratio, Lower 95th: 1.5787 | 1.5625 | 1.2704 | 2.1298 | 1.9939 | 2.1948
Odds Ratio, Upper 95th: 2.5799 | 2.5359 | 2.0536 | 3.4985 | 3.2516 | 3.6711
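
For readers who want to see how odds ratios of this kind are derived, the minimal Python sketch below computes an odds ratio and its Wald 95% confidence interval from a 2x2 cross-classification of two binary indicators. It is not the procedure used to produce the table above, and the cell counts in the example are hypothetical.

    import math

    def odds_ratio_with_ci(a, b, c, d, z=1.96):
        """Odds ratio and Wald 95% CI for a 2x2 table:
                           item = yes   item = no
            Q34D = yes         a            b
            Q34D = no          c            d
        """
        or_hat = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_hat) - z * se_log_or)
        upper = math.exp(math.log(or_hat) + z * se_log_or)
        return or_hat, lower, upper

    # Hypothetical cell counts, purely illustrative
    print(odds_ratio_with_ci(a=220, b=130, c=440, d=718))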




Disjoint cluster analysis provides a rough analogue to the latent class analysis to be used in the main study. Applied to the pilot data, the cluster analysis demonstrates that the set of variables is effective in differentiating observations. For the disjoint cluster analysis we set the maximum number of clusters to four, to parallel the four latent classes we anticipate observing in the data: 1) data-driven non-innovators, 2) nominal innovators, 3) non-innovators, and 4) substantive innovators. Examining the means of the cluster variables across clusters helps in interpreting cluster membership. Cluster 3 is the largest cluster, and its cluster means suggest that both data-driven decision-making and the other indicators of substantive innovation are very rare within this cluster; Cluster 3 corresponds closely to the archetype of a non-innovating establishment. In contrast, establishments in Cluster 4 overwhelmingly pursue data-driven decision-making practices and are the group most likely to demonstrate other substantive innovation behaviors. Cluster 4 comprises about 21% of the sample, which is close to the identification of 24% of manufacturing firms as "highly innovative" in a qualitative study of rural English firms (North and Smallbone 2000). Differentiation of the remaining two clusters is not as clear cut, but this exploratory analysis supports the classification of Cluster 1 establishments as data-driven non-innovators and Cluster 2 establishments as nominal innovators.


Cluster Means

Cluster | Q13A_r | Q13B_r | Q24_r | Q25_r | Q26_r | Q28A_r | Q34D_r | Q37D_r
1 | 1.000000000 | 0.915211970 | 0.568578554 | 0.371571072 | 0.416458853 | 0.119700748 | 0.221945137 | 0.147132170
2 | 0.040000000 | 0.000000000 | 0.370000000 | 0.613333333 | 0.866666667 | 0.270000000 | 0.326666667 | 0.300000000
3 | 0.034907598 | 0.000000000 | 0.193018480 | 0.041067762 | 0.127310062 | 0.127310062 | 0.075975359 | 0.092402464
4 | 1.000000000 | 0.962500000 | 0.793750000 | 0.887500000 | 0.906250000 | 0.434375000 | 0.396875000 | 0.581250000




Cluster Summary

Cluster | Frequency | RMS Std Deviation | Maximum Distance from Seed to Observation | Radius Exceeded | Nearest Cluster | Distance Between Cluster Centroids
1 | 401 | 0.4003 | 1.7717 | | 4 | 0.9758
2 | 300 | 0.4081 | 1.7264 | | 3 | 1.0625
3 | 487 | 0.2993 | 1.7810 | | 2 | 1.0625
4 | 320 | 0.3864 | 1.7128 | | 1 | 0.9758
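
As a rough analogue of the disjoint cluster analysis summarized above, the sketch below runs a four-cluster k-means on the binary indicator variables and reproduces the two summaries used here (cluster frequencies and per-cluster indicator means). It is illustrative only: the report's results come from a different clustering procedure, and the input file name is a placeholder.

    import pandas as pd
    from sklearn.cluster import KMeans

    indicator_cols = ["Q13A_r", "Q13B_r", "Q24_r", "Q25_r",
                      "Q26_r", "Q28A_r", "Q34D_r", "Q37D_r"]

    # Hypothetical input file holding the recoded 0/1 indicator variables
    df = pd.read_csv("pilot_indicators.csv")
    X = df[indicator_cols].dropna()

    # Four clusters, paralleling the four anticipated latent classes
    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

    grouped = X.assign(cluster=km.labels_).groupby("cluster")
    print(grouped.size())            # cluster frequencies
    print(grouped.mean().round(3))   # per-cluster indicator means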



The use of auxiliary questions to differentiate substantive innovators from nominal innovators is critical to deriving valid comparisons of rural and urban innovation rates. The strategy is also novel in innovation surveys. Cognitive interviewing confirmed that these auxiliary questions were easily understood by respondents. Preliminary analysis of these pilot data provides assurance that the auxiliary questions will be effective in differentiating self-reported innovators. The significant advantage that latent class analysis has over cluster analysis is that class membership is probabilistic, in contrast to cluster membership, which is discrete; this allows more flexibility in arriving at the class structure that best captures the phenomenon of interest. The formidable task in the main study will be providing external validation of the latent class structure.


REFERENCES

North, D., and Smallbone, D. 2000. "The innovativeness and growth of rural SMEs during the 1990s," Regional Studies 34(2): 145-157.



VIII. Full Study Survey Instrument




Proposed Advance Letter for Full Study








WASHINGTON STATE UNIVERSITY
Social and Economic Sciences Research Center

April XX, 2014

«CONTACT»
«BNAME»
«ADDR1» «UNIT»
«CITY», «STATE» «ZIP»-«ZIP4»

Dear «CONTACT»:

We are writing to let you know that the Economic Research Service of the U.S. Department of Agriculture has asked us to contact you for an important national study of businesses. The goal of this study is to increase our knowledge on how businesses stay effective and what types of things can help businesses meet new needs that arise.


The Department of Agriculture provides many programs aimed at helping all types of businesses throughout the country, but they would like to do more. We hope this study helps government understand how it can be helpful. It is critical to understand the linkages between what keeps businesses vital and the availability of resources.


To complete the survey, type this web page address in your Internet browser's address bar (not the Google or Yahoo search bar), and then type in the following access code:

http://opinion.wsu.edu/business]014/
Access Code: «RESPID»


We hope you will take the time to complete this important survey. Gaining a full understanding of the situation U.S. firms are facing in today's economy depends upon you and others like yourself. Your responses will be kept strictly confidential and your name will not be connected to your answers in any way.


If you have any questions about this effort, or would prefer to participate by telephone please feel free to contact us at 1-800-833-0867 or scsrcwc[email protected].


Thank you in advance for your help. We appreciate it very much. A small token of appreciation is enclosed with this letter as a way of saying thank you.


Sincerely,




Danna L. Moore Ph.D. Principal Investigator





Research and Administrative Offices: 133 Wilson-Short Hall, PO Box 644014, Pullman, WA 99164-4014 | 509-335-1511 | Fax 509-335-0116
Public Opinion Laboratory: 1615 NE Eastgate Blvd., Section F, PO Box 641801, Pullman, WA 99164-1801 | 509-335-1711 | Fax 509-335-4688


























Social & Economic Sciences Research Center


Washington State University


P.O. Box 644014


Pullman, Washington 99164-4014


Telephone: (509) 335-1511 Fax: (509) 335-0116 http://www.sesrc.wsu.edu sesrc@wsu.edu
