Supporting Statement B For:
Generic Submission for Technology Transfer Center (TTC)
External Customer Satisfaction Surveys (NCI)
February 2011
John D. Hewes, Ph.D.
Technology Transfer Specialist
Technology Transfer Center
National Cancer Institute
6120 Executive Blvd., MSC 7181
Suite 450
Rockville, MD 20852
Phone: (301) 435-3121
Fax: (301) 402-2117
E-mail: hewesj@mail.nih.gov
Table of Contents
B. Statistical Methods
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
B.4 Test of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
List of Attachments for Supporting Statements A and B
Attachment 1: Gen IC - Technology Transfer Center (TTC) External Customer Consent Form and Satisfaction Survey (NCI)
Attachment 2: Advance Notice Postal and E-mail Letter
Attachment 3: Follow-up E-mail
Attachment 4: Reminder to Participate E-mail
Attachment 5: Additional Reminder to Participate E-mail
Attachment 6: Final Reminder to Participate E-mail
Attachment 7: List of Advisory Committee Members
Attachment 8: NIH Privacy Officer's Letter
Attachment 9: Office of Human Subjects Research (OHSR) Exemption
Attachment 10: Pre-Test Follow-up Telephone Interview
Attachment 11: Privacy Impact Assessment (PIA)
B. Statistical Methods

B.1 Respondent Universe and Sampling Methods

The universe of potential survey participants consists of companies that conduct research and development in biomedical applications (pharmaceutical and biotechnology therapeutics, diagnostics, prognostics, medical devices, and software) and that are listed in the Technology Transfer Center database. The survey universe (target population) includes company personnel who have utilized the services of the National Cancer Institute (NCI) Technology Transfer Center (TTC) ("users"), those who have not ("non-users"), and those who have inquired about using TTC services ("inquirers"). The sex and race/ethnic composition of the respondent universe is not germane to the survey.
In proposing a sampling plan for the surveys, three sampling approaches were considered: a simple random sample, a stratified random sample, and a full population sample. These approaches are discussed below.
Simple Random Sample
There are 2,052 foundations, non-profit organizations, and for-profit companies that comprise the population of interest annually. A simple random sample of these companies is likely to retain the companies' characteristics in the same proportions as the population as a whole, within normal sampling variability. This method assumes that response rates would be equal across company characteristics (e.g., representatives of large companies are just as likely to respond as representatives of small companies) and that no other company characteristics unduly influence response rates.
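For illustration only, a simple random sample of this kind could be drawn as sketched below. The record layout, the "size" field, and the use of the 862-company total target from Table 1 are assumptions made for this example, not part of the approved procedures.

import random

# Placeholder population: 2,052 company records with an illustrative size field.
# The field names and size categories are assumptions for this sketch only.
population = [
    {"company_id": i, "size": "large" if i % 4 == 0 else "small"}
    for i in range(1, 2053)
]

rng = random.Random(2011)              # fixed seed so the draw is reproducible
sample = rng.sample(population, 862)   # 862 = total target sample size in Table 1

# Under simple random sampling, characteristics such as company size should
# appear in the sample in roughly the same proportion as in the population,
# within normal sampling variability.
share_pop = sum(c["size"] == "large" for c in population) / len(population)
share_sam = sum(c["size"] == "large" for c in sample) / len(sample)
print(f"Large-company share: population {share_pop:.3f}, sample {share_sam:.3f}")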
Table 1. Universe Covered by the Technology Transfer Center External Customer Satisfaction Surveys

Category of User of TTC Services | Number of Entities in the Universe | Target Sample Size | Number of Invitees in First Wave
Users of TTC Services            | 1,394                              | 489                | 612
Non-Users of TTC Services        | 627                                | 343                | 429
Inquirers about TTC Services     | 31                                 | 30                 | 31
Total                            | 2,052                              | 862                | 1,072
B.2 Procedures for the Collection of Information

a. Survey Procedures
These are web-based surveys utilizing the Technology Transfer Center (TTC) External Customer Consent Form and Satisfaction Surveys (Attachment 1) that will be administered several times over a period of three years. Typically, each participant in the study will receive a postal advance notice and invitation followed by an e-mail advance notice and invitation to participate in the surveys (Attachment 2) from the NCI Project Officer that will explain the purpose of the survey, provide information about the confidentiality of their responses, and ask them to participate in the survey. One week later each participant will receive an e-mail invitation to participate sent by the Contractor (Attachment 3). This e-mail will contain a secure URL and password to access the web-based Technology Transfer Center External Customer Satisfaction Surveys. Two weeks after the e-mail invitation to participate has been sent, an e-mail “reminder to participate” (Attachment 4) will be sent by the Contractor to all survey non-respondents. An additional reminder to participate e-mail (Attachment 5) will be sent to non-responders one week prior to the closing date of the survey. A final reminder will be sent one day prior to the close of the survey (Attachment 6). Details of the announcement, invitations, and reminders (if used) will be included in the justification memos for subsequent generic IC sub-studies under this clearance.
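As a minimal sketch of this contact sequence, the schedule could be computed as shown below. The launch and close dates are placeholders; only the offsets (invitation one week after the advance notice, reminder two weeks after the invitation, additional reminder one week before close, final reminder one day before close) come from the procedures above.

from datetime import date, timedelta

def build_contact_schedule(advance_notice: date, close_date: date) -> dict:
    """Compute mailing dates for the contact sequence described above.

    The dates supplied by the caller are illustrative placeholders; only
    the offsets come from the survey procedures.
    """
    invitation = advance_notice + timedelta(weeks=1)        # Contractor e-mail invitation (Attachment 3)
    first_reminder = invitation + timedelta(weeks=2)        # reminder to non-respondents (Attachment 4)
    additional_reminder = close_date - timedelta(weeks=1)   # additional reminder (Attachment 5)
    final_reminder = close_date - timedelta(days=1)         # final reminder (Attachment 6)
    return {
        "advance_notice": advance_notice,
        "invitation": invitation,
        "first_reminder": first_reminder,
        "additional_reminder": additional_reminder,
        "final_reminder": final_reminder,
        "close": close_date,
    }

# Example with placeholder dates:
schedule = build_contact_schedule(date(2011, 3, 1), date(2011, 4, 15))
for step, when in schedule.items():
    print(f"{step}: {when.isoformat()}")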
b. Rationale for Sample Size
The population universe of 2,052 companies will be used for the TTC surveys for the following reasons:
Quality Control.
The Contractor chosen for this study will establish and maintain quality control procedures to ensure standardization and a high level of quality in data collection and processing. The Contractor will maintain a written log of all decisions that affect study design, conduct, or analysis. Survey responses will be recorded electronically by the survey website. The Contractor will monitor the performance of data collection activities, especially with regard to response rates and completeness of acquired data. The Contractor will monitor response rates on a daily basis and download response data weekly while the survey is active. Each week's data will be compared with the previous week's data to identify any discrepancies. Once the survey has closed, manual spot-checks will be performed to ensure that results have been compiled and downloaded accurately.
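A rough sketch of the weekly comparison step described above, assuming responses are keyed by a respondent identifier (the identifiers and record layout are illustrative, not the Contractor's actual data structures):

def compare_weekly_snapshots(previous: dict, current: dict) -> dict:
    """Compare two weekly response downloads keyed by respondent ID.

    `previous` and `current` map a respondent ID to that respondent's
    recorded answers; the ID scheme and record layout are illustrative.
    """
    missing = [rid for rid in previous if rid not in current]        # records that vanished
    changed = [rid for rid in previous
               if rid in current and previous[rid] != current[rid]]  # records whose answers changed
    new = [rid for rid in current if rid not in previous]            # newly received responses
    return {"missing": missing, "changed": changed, "new": new}

# Example with placeholder records:
week1 = {"R001": {"q1": "Yes"}, "R002": {"q1": "No"}}
week2 = {"R001": {"q1": "Yes"}, "R002": {"q1": "Yes"}, "R003": {"q1": "No"}}
print(compare_weekly_snapshots(week1, week2))
# Completed responses should not change or disappear between downloads,
# so any entry under "missing" or "changed" is a discrepancy to investigate.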
B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Several methods will be utilized to maximize response rates. The initial postal and e-mail invitations will come from the National Cancer Institute (NCI), a source known and respected by the intended audience. Potential respondents' participation will be requested in advance. Clear instructions will be provided on how to complete and submit the survey. Based on the results of the pre-test described below, the survey is designed for the intended audience and is easy to read and follow. Finally, the Contractor will send the follow-up invitation and reminder e-mails via the NIH e-mail accounts to increase credibility and decrease the likelihood of the e-mails ending up in the recipients' junk mail folders.
The data collection method will be a web-based survey. The use of web-based technology for this survey was selected to reduce respondent burden, reduce costs, and improve data quality. Answering a set of questions via a web-based survey requires half as much time as answering the same set of questions via a telephone survey.5 Data collection using web-based technology can improve data quality since validation checks can be incorporated into the instrument. In addition, because data are entered electronically, errors in data entry and coding are avoided. Many consumers now prefer the convenience of web-based surveys because they can be completed at a comfortable pace and at a time convenient for the respondent.
Two disadvantages sometimes ascribed to web-based surveys are lower response rates and issues related to the reliability and validity of data. Response rates for paper-based and CATI surveys have been falling rapidly over the last two decades.6 Conversely, response rates for web-based surveys have been growing with increasing internet usage among the general public.7 Research comparing response rates among web-based, mail, and telephone surveys has traditionally found that response rates are generally lower for web-based surveys than for mail or telephone surveys; however, a meta-analysis of more recent studies found that, on average, web-based surveys yield only an 11% lower response rate than other modes (95% confidence interval 6%-15%) and that response rates increase greatly when targeted e-mail invitations are used.8 In addition, response rates have been found to be greater among highly educated individuals such as our survey population.9

With regard to the reliability and validity of data, web-based surveys are superior to computer-assisted telephone interviews (CATI) in their ability to moderate the interviewer effects that encourage socially desirable responses.10 Studies comparing methodologies have found that web-based surveys had the lowest rate of missing data items and CATI the highest.11 Web-based surveys are also noted by many studies for delivering better verbatim responses than CATI surveys. Several studies argue that the quality of data collected through web-based surveys is generally higher than through CATI because respondents can re-read questions, are not subject to time constraints or pressure, and can complete the survey at a convenient time.12 Studies using external data verification have found that web-based surveys have the highest levels of reporting accuracy and CATI the worst.13 Based on the literature cited, we are confident that the response rate will be adequate and that the data collected will be reliable and valid.
The types of respondents will be managers and executives from companies that conduct biomedical research and development in therapeutics, diagnostics, prognostics, medical devices, and software. Procedures for follow-up of initial non-responders are described in Section B2a. Since this data collection effort is a customer satisfaction survey, it is anticipated that the accuracy and reliability of the information will be more than adequate for the intended uses of the data. The face validity of the initial survey instrument was confirmed by the pre-test follow-up telephone interviews.
The initial generic survey contains 38 questions. Depending upon the respondent's answers and the skip patterns built into the survey, a respondent may be asked to answer between 10 and 34 questions (a simplified illustration of this branching follows the list below). Based on the pre-test, it is expected that the survey will take 15 to 20 minutes to complete, depending upon the number of questions the respondent is asked to answer. The survey will collect the following types of information:
Respondent/company information: respondent’s current position in the company; company type; location of parent company; number of employees.
Strategic direction of the company: whether they develop strategic technology collaborations; types of strategic technology collaborations; research collaborations formed within the last two years; research collaborations anticipated in the next two years; factors important to selecting a research collaborator; methods for finding new research collaborators; types of research collaborations developed within the past two years; state of research and development of research collaborations formed in the past two years; collaborations formed with off-shore companies, government laboratories, institutes, and/or universities; reasons for seeking off-shore collaborations.
Experience with the NCI Technology Transfer Center (TTC) services: level of familiarity with NCI technology transfer services; how they learned about the NCI TTC; factors leading to not partnering with NIH researchers; factors leading to partnering with NIH; types of agreement completed; level of satisfaction with the length of time required to negotiate the agreement; feedback process; level of satisfaction with various aspects of the NCI technology transfer process; how they locate NIH research partners for potential collaborations; types of information the respondent would like to receive from the NCI TTC; preferred method of receiving NCI TTC information; additional TTC services that would be useful to meet the company’s technology transfer needs.
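The sketch below is a purely hypothetical simplification of the skip-pattern branching mentioned above; the question numbers and branching conditions are invented for illustration, and only the stated range of 10 to 34 answered questions is taken from the instrument description.

def questions_asked(is_ttc_user: bool, formed_collaborations: bool) -> list:
    """Return the (hypothetical) question numbers a respondent would see.

    The branching rule and question numbers are invented for illustration;
    the real instrument has 38 questions, of which a respondent answers
    between 10 and 34 depending on the skip patterns.
    """
    path = list(range(1, 11))            # respondent/company items asked of everyone
    if formed_collaborations:
        path += list(range(11, 25))      # strategic-direction items
    if is_ttc_user:
        path += list(range(25, 35))      # TTC-experience items
    return path

print(len(questions_asked(True, True)))    # 34 questions: longest path
print(len(questions_asked(False, False)))  # 10 questions: shortest path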
The basic formula (number of completed surveys / number of people contacted) will be used to calculate the response rate. Based on the 80% response rate achieved for the pre-test and the fact that potential survey respondents are external customers who are managers and executives from companies with a vested interest in biomedical and clinical research and the capability to pursue collaborations with NIH researchers, we project that 80% of the eligible 2,052 organizations will participate in this survey.
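A worked version of this projection, using the figures in Table 1. Reading the first-wave invitee counts as the target sample sizes inflated by the anticipated 80% response rate is our interpretation of the table, not an explicit statement in it.

\[
\text{response rate} = \frac{\text{number of completed surveys}}{\text{number of people contacted}},
\qquad 0.80 \times 2{,}052 \approx 1{,}642 \text{ expected participants.}
\]
\[
\text{Users: } \frac{489}{0.80} \approx 611 \;(\text{612 invited}), \qquad
\text{Non-users: } \frac{343}{0.80} \approx 429 \;(\text{429 invited}).
\]

For the inquirer group, 30 / 0.80 = 37.5 exceeds the universe of 31 entities, so all 31 are invited.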
B.4 Test of Procedures or Methods to be Undertaken

A pre-test with eight respondents was conducted to ensure that the web-based survey was clear, well designed, and easy to use. The potential pre-test respondents received an invitation letter, followed by an invitation e-mail from the NCI Project Officer, inviting them to participate in a two-stage process: first to complete and submit the web-based survey, and then to participate in a brief telephone interview to discuss their experience with the web-based survey. Interview questions addressed accessibility and navigation; ease of comprehension and relevance; usability; and acceptability. The Pre-test Follow-up Telephone Interview protocol is included as Attachment 10. The initial survey was revised based on information learned from the pre-test, and subsequent surveys may be altered slightly based on information from the first survey.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Individuals consulted on statistical design:
Benmei Liu, PhD, Mathematical Statistician, Statistical Methodology and Applications Branch, Surveillance Research Program, Division of Cancer Control and Population Sciences, NCI. Phone: 301-594-9114
Frederick Snyder, ABD, Statistician, NOVA Research Company; Phone: 301-986-1891
Contractor staff members of NOVA Research Company and The Madrillon Group Inc. will collect and analyze the data for NCI. Individuals who will analyze the data include:
Mary C. Dufour, MD, MPH, Senior Vice President, The Madrillon Group Inc.
Jack E. Scott, ScD, Vice President for Evaluation, The Madrillon Group Inc.
Marietta Damon, PhD, Evaluation Specialist, The Madrillon Group Inc.
Frederick Snyder, ABD, Statistician, NOVA Research Company
National Cancer Institute and Contractor investigators and statisticians and Advisory Committee members (Attachment 7) have reviewed the data collection plan. The data collected will be available for use in analyses that are proposed and approved in the future. No additional consultation is planned.
1 Wholey JS, Hatry HP, Newcomer KB. (2004). Handbook of Practical Program Evaluation (2nd ed.). San Francisco, CA: Jossey-Bass.
2 Fowler FJ. (2009). Survey Research Methods (4th ed.). Thousand Oaks, CA: Sage Publications.
3 For the purpose of this survey, collaborations are defined as strategic alliances for co-development of technologies but not licensing or contractual relationships.
4 Wholey et al., 2004.
5 Van Gelder MMHJ, Bretveld RW, Roeleveld N. (2010). Web-based Questionnaires: The Future in Epidemiology? American Journal of Epidemiology, Advance Access published September 29, 2010; doi:10.1093/aje/kwq291.
6 Curtin R, Presser S, Singer E. (2005). Changes in Telephone Survey Nonresponse over the Past Quarter Century. Public Opinion Quarterly 69(1):87-98.
7 Kaplowitz MD, Hadlock TD, Levine R. (2004). A Comparison of Web and Mail Survey Response Rates. Public Opinion Quarterly 68(1):94-101.
8 Manfreda KL, Bosnjak M, Berzelak J, Haas I, Vehovar V. (2008). Web Surveys Versus Other Survey Modes: A Meta-analysis Comparing Response Rates. International Journal of Market Research 50(1):79-104.
9 Braithwaite D, Emery J, De Lusignan S, et al. (2003). Using the Internet to Conduct Surveys of Health Professionals: A Valid Alternative? Family Practice 20(5):545-551.
10 Kreuter F, Presser S, Tourangeau R. (2008). Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity. Public Opinion Quarterly 72(5):847-865; doi:10.1093/poq/nfn063.
11 Ibid.
12 Kwak N, Radler B. (2002). A Comparison Between Mail and Web Surveys: Response Pattern, Respondent Profile, and Data Quality. Journal of Official Statistics 18:257-273.
13 Tourangeau R, Yan T. (2007). Sensitive Questions in Surveys. Psychological Bulletin 133:859-883.