
SUPPORTING STATEMENT

UNITED STATES PATENT AND TRADEMARK OFFICE

Global Intellectual Property Academy (GIPA) Surveys

OMB Control Number 0651-0065

(May 2014)


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


  1. Universe and Respondent Selection


The respondent universe for these surveys will be foreign government officials and domestic intellectual property attorneys who participate in GIPA programs. The intent is to collect information on the effectiveness of those programs. Because the USPTO gathers information directly from GIPA participants, the potential respondent universe for a given assessment, performance measurement, satisfaction study, or evaluation depends heavily on the nature and purpose of the specific GIPA program.


The USPTO evaluates each GIPA program (course) separately. Because each program typically involves participants from many countries over the course of the fiscal year, the first step in defining the respondent universe is for the GIPA program officers to consult with the particular program managers and U.S. Embassies to determine the scope and make-up of each course, i.e., the countries from which the respondents come and the time frame. Because the scope of the GIPA courses changes on a course-by-course basis, participant tracking is essential. For each survey (pre-program, post-program, and alumni), because the total number of participants does not differ significantly from year to year, a census approach (surveying the entire population by course and by fiscal year) will be used. Since the total universe is known, the USPTO can track respondent distribution to ensure that the respondent pool is proportional and representative of the entire universe.


Table 1 shows the total universe of GIPA participants since 2008.


Table 1: Total GIPA Participants Since 2008


Fiscal Year    Total Number of GIPA Participants
2013           3,984
2012           4,034
2011           4,217
2010           4,500
2009           4,267
2008           4,548


The respondent pool for these surveys currently comprises all English-speaking individuals who have participated in USPTO Global Intellectual Property Academy sessions. This universe was selected to administer the surveys and to develop the system of delivery, data collection, analysis, and reporting.


The size of the respondent universe in past GIPA training evaluations and performance measurement and management studies has varied from 300 to 500 participants. As the USPTO conducts these surveys, the agency will investigate the relationship between the size of the respondent universe surveyed and response rates.


Sampling Method


At this time, the sampling method for these surveys is limited to all of the English-speaking foreign officials who participate in the GIPA training programs or domestic intellectual property attorneys of small to medium-size businesses. In essence, the sample for these surveys is the same as the respondent pool. As the respondent pool for the survey expands, the USPTO fully expects the sample to expand as well. In addition, the USPTO anticipates expanding the respondent universe to all participants and developing foreign-language versions of the surveys in future iterations.


Response Rates


Response rates for the alumni pilot questionnaire during its use have ranged from 38.5% to 100%.


A factor affecting survey response rates is the touch point (pre-program, post-program, or alumni survey). The USPTO expects higher response rates for the pre-program and post-program GIPA surveys. Pre-program surveys generally garner higher participation because they are associated with registration information and with participant excitement about coming to the United States. The post-program survey will be delivered while participants are on site at GIPA, a much more controlled environment. Participation in alumni surveys is also likely to be high because the respondents are a group that historically is active in seeking follow-on GIPA information, contact, and courses; a respondent who participated in a course is therefore more likely to participate in an alumni survey as well. An interest in staying in touch with the providing organization is also common in other international exchange and training programs similar to GIPA.


Finally, the Federal Consulting Group (FCG) team has worked with other Departments, including the Departments of State and Defense, to survey international audiences, and in similar surveys has seen response rates averaging around 36% for well-established surveying conducted with similar methodology. FCG's experience is in surveying similar international audiences who have participated in U.S. Government-sponsored education and training programs. On this basis, the USPTO projects an average response rate of 35% for the foreign officials. These rates are consistent with similar follow-on surveys conducted by other training providers in the private sector.


Compensating for Non-Response Bias


The USPTO has demographic data on the universe and has provided that data to FCG. The demographic data includes the country of residence, program course, participation dates, and the sex of the participant. FCG will track these variables while the survey is open to ensure that the final response total is representative across them. The survey invitations contain a link that is uniquely tied to the particular respondent, and the survey is also linked to the demographic data. This allows FCG to track the survey responses against the variables to ensure that a representative total is being generated. The unique survey link also allows FCG to see who has responded to the survey in real time.
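The representativeness tracking described above can be sketched as a simple comparison of each demographic category's share of responses against its share of the universe. This is an illustration only: the function name, the sample counts, and the 5-percentage-point flagging threshold are assumptions, not part of the survey plan.

```python
def representativeness_gaps(universe_counts, respondent_counts, threshold=0.05):
    """Flag categories whose share of responses differs from their share
    of the universe by more than `threshold` (absolute proportion).

    universe_counts / respondent_counts: dicts mapping a demographic
    category (e.g. country or course) to a participant count.
    Returns {category: response_share - universe_share} for flagged gaps.
    """
    total_universe = sum(universe_counts.values())
    total_responses = sum(respondent_counts.values())
    gaps = {}
    for category, count in universe_counts.items():
        universe_share = count / total_universe
        response_share = respondent_counts.get(category, 0) / total_responses
        if abs(response_share - universe_share) > threshold:
            gaps[category] = round(response_share - universe_share, 3)
    return gaps

# Hypothetical counts by country for one course
universe = {"Brazil": 40, "India": 35, "Kenya": 25}
responses = {"Brazil": 20, "India": 5, "Kenya": 10}
print(representativeness_gaps(universe, responses))
# India is under-represented; Brazil is over-represented
```

Categories flagged this way would be the ones targeted with reminders or phone follow-up, as described below.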


Since the survey distribution is based on a census approach (all course participants have the opportunity to participate) by course and fiscal year, statistical weighting for a sample is not anticipated to be necessary. The first step in the process, however, is to track response rates against the universe by key demographic characteristics to identify any gap between the characteristics of respondents and non-respondents. If a gap is identified, the confidence level and interval will be calculated for the respondent population subset. If the subset achieves at least a 95 percent confidence level with a margin of error of plus or minus 3 percent, no further non-response subsampling will be conducted. If the margin of error is greater than 3 percent, non-response subsampling will be conducted.
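The 95 percent confidence / plus-or-minus 3 percent check above can be illustrated with the standard margin-of-error formula for a proportion, with a finite population correction (appropriate here because the universe of roughly 4,000 participants is known). The function name, the example counts, and the conservative p = 0.5 assumption are ours, not specified in the survey plan.

```python
import math

def margin_of_error(n_respondents, population, z=1.96, p=0.5):
    """Margin of error for a proportion, with finite population correction.

    z=1.96 corresponds to a 95 percent confidence level; p=0.5 is the
    most conservative (widest-interval) proportion assumption.
    """
    standard_error = math.sqrt(p * (1 - p) / n_respondents)
    fpc = math.sqrt((population - n_respondents) / (population - 1))
    return z * standard_error * fpc

# Example: 900 responses from a universe of 4,000 participants
moe = margin_of_error(900, 4000)
print(f"Margin of error: {moe:.1%}")  # about 2.9%, within the 3% threshold
```

Under these assumptions, a subset falling short of the threshold (a margin of error above 3 percent) would trigger the non-response subsampling described above.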


If it becomes apparent that the responses are not representative under one or more of the variables, the FCG staff will send survey reminders to those individuals who have not yet completed their surveys to prompt additional responses. If sending the survey reminders fails to generate a representative response, FCG staff will contact the non-respondents by phone in an attempt to obtain their responses. The FCG may also work with GIPA staff to interview or conduct a focus group with the non-responding group to determine if their responses would differ from those who have responded to the survey.


  2. Procedures for Collecting Information


Data collection will primarily be conducted via an online survey platform, with the survey link delivered to participants by e-mail. All potential respondents will receive the same survey. Several questions are structured with randomized response order to diminish response-order bias. All respondents will be identified as either about to participate in, or having completed, a GIPA program, including completion of a pre-course survey.
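The response-order randomization mentioned above can be sketched as a per-respondent shuffle of a question's answer options (applicable to unordered option lists, not ordinal rating scales). The function name and the example option list are illustrative assumptions.

```python
import random

def randomized_options(options, seed=None):
    """Return a shuffled copy of a question's answer options for one
    respondent, leaving the original list untouched. A per-respondent
    seed makes each respondent's ordering reproducible."""
    rng = random.Random(seed)
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled

# Hypothetical unordered options for a topic-interest question
topics = ["Patents", "Trademarks", "Copyright", "Enforcement"]
print(randomized_options(topics, seed=7))
```

Each respondent sees the same options, just in a different order, so aggregate frequencies remain comparable while position effects average out.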


Names and the most recent contact information are provided to the external contractor by GIPA. Contact information is updated by the contractor in conjunction with the U.S. Embassies and the program office. All participants in the collection universe are sent an initial e-mail or written notice, or are contacted by telephone, to inform them of the evaluation and ask them to participate.


The data collection method will be the same for the pre-program, post-program, and alumni surveys. No survey will be open for a period of more than four weeks to allow FCG to gather the responses. In the case of non-responses, FCG will send e-mail reminders to those participants in order to maximize the response rate and generate a representative response.


If the USPTO decides to conduct the surveys in-country, the data collection methods will vary according to the particular program and the countries in which the data is being collected. In general, however, the GIPA surveys use a combination of the following data collection methods: paper surveys, web-based surveys, face-to-face structured interviews, telephone interviews, in-depth open-ended interviews, and focus groups. Factors used to determine the data collection methods in any given country relate to the availability of Internet access and telephone service, the reliability of the postal service, and the cultural and political attitudes (and apprehensions) towards surveying. For each evaluation, the data collection methods are discussed in detail, and specific country plans are developed with a contingency plan in place.


Statistical Methodology


Survey responses are not weighted. The research design is such that the census should be representative of the country populations and thus parallel the defined universe. There are no unusual problems requiring specialized sampling procedures.


Data are usually collected only once from any given individual during a specific evaluation. However, the nature of the survey structure requires that data be collected from participants before, during, and after their exchange programs.


The surveys generally will be open for a period of four weeks to allow for responses to be gathered. This should also provide sufficient time to identify new e-mail addresses for those GIPA participants who may have changed e-mail addresses within the last year.


Analysis will be based upon response frequencies and simple cross-tabulations. Use of response data for GIPA performance measures is based upon multi-variable components that are reported as indices. This approach is currently used by other Departments, including the Department of State. Once the full evaluation system is in place, it is anticipated that more rigorous statistical analyses will be implemented.
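The response frequencies and simple cross-tabulations described above can be computed with nothing more than counting, as in this minimal sketch. The response records are fabricated for illustration; real records would come from the survey platform.

```python
from collections import Counter

# Hypothetical response records: (country, satisfaction rating)
responses = [
    ("Brazil", "Satisfied"), ("Brazil", "Very satisfied"),
    ("India", "Satisfied"), ("India", "Satisfied"),
    ("Kenya", "Neutral"),
]

# Response frequencies for a single question
frequencies = Counter(rating for _, rating in responses)
print(frequencies)

# Simple cross-tabulation: country x rating
crosstab = Counter(responses)
for (country, rating), count in sorted(crosstab.items()):
    print(f"{country:8} {rating:15} {count}")
```

Index-style performance measures would then combine several such question-level frequencies into a single reported score.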


  3. Methods to Maximize Responses


In order to maximize the number of responses received from the survey, the USPTO intends to:


  • Provide a pre-survey notification to foreign officials through Embassies. In cases where foreign officials who have attended GIPA training programs are being surveyed, the USPTO believes that a pre-notification about the survey coming from a recognizable contact at the particular Embassy will make the recipient more receptive to the survey than just receiving the e-mail invitation without a prior introduction. An example of such a notification is included in this submission (Ref. A).


  • Assure all participants that the responses will be kept confidential and only aggregated data will be used.


  • Conduct the survey using the Federal Consulting Group, an independent evaluation organization.


  • Conduct the survey via e-mail, a known and convenient method of communication with the participant group. FCG staff will track the e-mail invitations that are sent out. If an e-mail address is invalid, the message will bounce back to FCG, which may track these messages to determine the percentage of bad e-mail addresses. If there is a significant number of bad addresses, the USPTO will contact the Embassies to see if new e-mail addresses for these individuals can be obtained.


  • Provide follow-up reminders periodically to those individuals who have not responded to the survey (Ref. B).


The survey questions are primarily closed-ended and brief so that participants can answer them quickly and easily, even with English as a second language. The survey questions will be designed to be easy to answer, and follow-up will be repeated when necessary to maximize response rates.


All respondents will also receive the name and telephone number of a GIPA contact who can answer any questions.


In addition, data collection instruments will be pre-tested with test respondents similar to the evaluation target audience. (See item 4 below).


All data collection methods are tailored to fit the prevailing political, cultural, safety, security, and accessibility conditions in each country in which participants are involved in an evaluation or performance measurement study. Initial contact with prospective respondents is conducted through e-mails or letters, and, when possible, telephone calls/telephone interviews are conducted. An example of a possible script for a telephone interview is included in this submission (Ref. C). Follow-up reminders will be sent periodically to non-respondents to encourage them to respond. The USPTO expects that the efforts described above, in combination with pre-testing, will stimulate the response rates.




  4. Testing of Procedures


The questions that will be asked are similar to those used by other Federal agencies in their participant surveys. Results from these surveys should make the agency more effective and efficient in responding to participants.


While the data is being collected, special attention will be paid to (a) the percentage of participants contacted, (b) participant response rate, (c) properly worded questions to reflect intent, (d) questionnaire completion rates, (e) response rates of individual survey questions, (f) records on problems arising in the data collection, and (g) comments the agency receives regarding the survey.


The FCG pre-tests the survey and its delivery to ensure clarity, brevity, relevance, user-friendliness, understandability, and sensitivity, and to confirm that most respondents will be willing to provide answers. For example, two pre-tests for the alumni survey have been conducted by distributing the survey through an e-mail link or by regular mail. Pre-testing allowed for some question revision, especially in cases where the answers might not translate clearly across foreign cultures. The second pre-test primarily tested the delivery system. Several issues requiring revision were identified, so the USPTO will use the proven SurveyMonkey system until the revisions are finalized.


For the pre-program and post-program surveys, pre-testing may also include individual follow-up telephone conversations with selected respondents conducted by FCG. Pre-testing may also be conducted through focus groups, with individuals meeting to go over the instruments. The focus groups are anticipated to be conducted with current classes offered at GIPA, so as not to incur additional costs. FCG uses the pre-testing to clarify instructions and questions, refine the response categories, and add new questions as necessary. FCG's team of professional evaluators has an average of 15 years' experience each in survey design, sampling, and data analysis for federal clients.


The pre-test focus groups have not been approved by OMB. The pre-program and post-program surveys are modeled on surveys that have been previously approved by OMB for other U.S. government agencies. If pre-test focus groups are used for each of these instruments, they will consist of 9 or fewer people, drawn from the GIPA participants, per data collection, per PRA guidelines. If the independent third-party evaluators (FCG) determine that additional pre-testing is needed for the data collection instruments, then USPTO will seek OMB approval for the additional pre-testing of the pre-program and post-program surveys.


The research topics in the survey are based upon the training impact evaluation approach developed in 1959 by Donald Kirkpatrick, former President of the American Society for Training and Development. The four-level approach, which measures participant reaction, learning, behavior (application), and results, is well established in the United States and overseas. The specific questions are based upon models that have been developed by evaluation and survey design specialists, social science research analysts, and organizations, including Knowledge Advisors, Toyota Motor Sales, Defense Acquisition University, Caterpillar, the U.S. Department of State, the U.S. Department of Defense, Cisco Systems, and Grupo Iberdrola (Spain). In particular, the question models have been successfully tested and deployed to over 10,000 U.S.-sponsored international visitors and foreign officials since 2003. While the question construction and response scales have been utilized with these audiences, the response categories have been adapted to address specific topic areas for intellectual property protection and enforcement. Questions also focus on each of the particular areas for which GIPA provides training, including Patents, Trademarks, Copyright, and Enforcement.


  5. Contact for Statistical Aspects and Data Collection


The Office of the Administrator for Policy and External Affairs, Global Intellectual Property Academy is responsible for coordinating and administering the participant surveys and related data collection efforts.  Susan Anthony, Acting Director of the Global Intellectual Property Academy, can be reached via phone at 571-272-1500.


The USPTO has engaged the Federal Consulting Group through an Interagency Agreement to provide expertise in sampling, survey methodology, and statistical analysis. The primary point of contact for FCG is Victoria Frank, Executive Consultant, who can be reached at 202-208-4040 or [email protected].







References



  A. Sample Pre-Survey Notification

  B. Sample Follow-Up Reminder

  C. Sample Telephone Interview Script







