Supporting Statement A 2.13.08


Evaluation of Customer Satisfaction with the ATSDR Internet Home Page and Links

OMB: 0923-0028




SUPPORTING STATEMENT OF THE REQUEST FOR

OMB REVIEW AND APPROVAL OF



Evaluation of Customer Satisfaction with the
Agency for Toxic Substances and Disease Registry
Internet Home Page and Links






















February 11, 2008


Project POC:

James E. Tullos, Jr., MS

Public Health Advisor

ATSDR/DTEM

1600 Clifton Road, NE MS F-32

Atlanta, GA 30333

JTullos@cdc.gov

770-488-3498 (phone)

770-488-4178 (fax)


A. Justification


  1. Circumstances Making the Collection of Information Necessary


In 1980, Congress created the Agency for Toxic Substances and Disease Registry (ATSDR) to implement the health-related sections of laws that protect the public from hazardous wastes and environmental spills of hazardous substances. The Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA), commonly known as the "Superfund" Act, provided the Congressional mandate to remove or clean up abandoned and inactive hazardous waste sites and to provide federal assistance in toxic emergencies (see Attachment – A). As the lead agency within the Public Health Service for implementing the health-related provisions of CERCLA and its 1986 amendments, the Superfund Amendments and Re-authorization Act (SARA), ATSDR received additional responsibilities in environmental public health. This act broadened ATSDR's responsibilities in the areas of health assessments, the establishment and maintenance of toxicological databases, information dissemination, and medical education. ATSDR also works closely with the states, their departments of health, regional and local health-care providers, national organizations, academia, community-based advocacy groups, and the general public to carry out its mission of preventing exposure to contaminants at hazardous waste sites and preventing adverse health effects. ATSDR provides funding and technical assistance for states to identify and evaluate environmental health threats to communities (Attachment – B).


The Government Performance and Results Act of 1993 (P.L. 103-62) (see Attachment – D), the e-Government Act of 2002, and the Federal Enterprise Architecture are key elements of the President's Management Agenda. These "e-government" initiatives directly affect staff at all levels of the federal government, improving program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction. These staff are further charged with articulating clearly the results of their programs in terms understandable to their customers, their stakeholders, and the American taxpayer. This project addresses these expectations and serves to improve ATSDR's health promotion agenda by providing data with which to assess and improve the usefulness and usability of information provided via the Internet.



  2. Purpose and Use of Information Collection


ATSDR is requesting a reinstatement with changes of an information collection that expired on 2-28-2007. This request, if approved, will allow ATSDR to continue its data collection efforts following guidance articulating compliance with the "e-Government" initiatives of 2002. Our current survey, the "ATSDR Web Site User Satisfaction Survey," was previously combined under project 0920-0449, "Evaluation of Customer Satisfaction of the CDC and ATSDR Internet Home Page and Links." However, we found that by having our own survey, we are better able to tailor the survey to our needs, manage the project effectively, and ensure that we collect the information necessary to evaluate customer satisfaction with our website. The 2003 reinstatement request was further modified by our most recent I-83C submission, which added four replicate product-specific surveys to the OMB 0923-0028 inventory for this project. After OMB approved the I-83C submission, ATSDR added a fifth product-specific survey, using the same replicate format, titled "ToxProfiles™ CD-ROM User Satisfaction Survey." ATSDR is requesting a reinstatement with change for the following surveys.

    • ATSDR Web Site User Satisfaction Survey (WSUS) (Attachment – G)

    • Toxicological Profiles User Satisfaction Survey (TPUS) (Attachment – H)

    • ToxFAQs™ User Satisfaction Survey (TFUS) (Attachment – I)

    • Public Health Statements (PHS) User Satisfaction Survey (PHSUS)
      (Attachment – J)

    • Toxicology Curriculum for Communities Training Manual User Satisfaction Survey (TCCUS) (Attachment – K)

    • ToxProfiles™ CD-ROM User Satisfaction Survey (TP-CDUS) (Attachment – L)

Information presented on our website addresses the toxicology of chemical substances and the prevention of exposure, adverse human health effects, and diminished quality of life associated with exposure to hazardous substances from waste sites, unplanned releases, and other sources of pollution present in the environment. The ATSDR website also provides "Support Delivery of Services" for health promotional activities, product outreach, and future survey options currently under consideration. ATSDR invests considerable resources (staff, materials, supplies, time, and monies) to prevent exposures and mitigate adverse human health effects of hazardous substances from waste sites, unplanned releases, and other sources of pollution present in the environment. A large percentage of these resources are directed toward activities and interventions (or programs) focused on educating the public in order to prevent adverse health effects.


ATSDR has designed its website to serve the general public, persons at risk for exposure to hazardous substances, collaborating organizations, state and local governments, and health professionals. It is critical that ATSDR have the capacity to answer whether or not these expenditures elicited the desired effects or impact. Specifically, this project will continue to examine whether our web-based initiatives meet the needs, wants, and preferences of visitors ("customers") to the ATSDR website. The results of this evaluation project will enable ATSDR to improve its service to these audiences and help ATSDR meet their expectations of information that is easy to access, clear, informative, and useful.


Analyses of the information collected will answer questions such as "Were the home and link pages accessible?" "Were the link pages useful, with up-to-date information?" "Was it easy to find the information you needed?" and "Was the content clearly written in plain language?" In addition, results from these data are likely to assist us in answering more far-reaching questions, including: "Was the Web page load/response time satisfactory?" "Did you receive a prompt response to requests submitted via the Web?" and, finally, "What improvements or revisions should be made, if any?" Reinstatement of the project and use of the information collected will substantially increase our output evaluation capability and help address broader programmatic questions in the future.


Over the past three years, CDC and ATSDR have developed bold strategies to address the President's Management Agenda (PMA), HHS guidelines and interpretative findings, and CDC's "Futures Initiative." These strategies included realigning CDC's Centers, Institutes, and Offices; creating new CDC Coordinating Centers housing the realigned CIOs based on their capacity to add synergy to the missions and programs they support; and merging the front offices of NCEH and ATSDR, adding laboratory capacity to the valuable environmental health work performed by ATSDR. The new Coordinating Center for Health Information and Service (CCHIS) came about as an outgrowth of the "Futures Initiative." Along with the creation of CCHIS, two new centers were formed to manage the delivery of CDC's public health information streams: the National Center for Health Marketing and the National Center for Public Health Informatics. These and other supportive units enable ATSDR to increase collaborative efforts, integrate programs, and inform the public in innovative ways. Concurrent with CDC's organizational realignment, IT infrastructural realignment has ramped up to achieve "e-Government" initiatives. During 2005-2006, Health Marketing assumed control of all web-based informational structures, Public Health Informatics (NCPHI) evaluated the informational content of systems, program offerings, and evaluation processes, and ITSO controlled the consolidation of the IT infrastructure. The conceptual strategy was to achieve "e-Government" initiatives as quickly as possible. During this same period, ATSDR experienced a more rapid realignment strategy that required porting much of the agency's IT demands to architectural structures within CDC. The ATSDR website system, intranet and Internet, underwent compatibility re-engineering and was then ported onto CDC Headquarters' greatly expanded IT infrastructures and systems.


This evaluation project experienced webpage activation delays for some of the product-specific surveys; the website homepage survey was temporarily removed from active status; and most of the database structures were declared inappropriate for realignment, with some substructures failing new data security tests. All project databases have been returned to the ATSDR database test server awaiting re-engineering assignments. The ATSDR WSUS survey will require back-end reconstruction in order to capture data correctly under the new CDC IT approach. Four of the five product-specific surveys were returned to live status on the ATSDR website in August 2006 with an interim fix for capturing survey data. Until the databases are re-engineered, the interim fix will stream captured data into script files similar to those used for "Listserv Activity Logs." This method applies to data collected from August 2006 through February 2007, the expiration date for each survey. Each participant survey record is stored with the date and time received, a delimited string of participant responses, and the name of the survey receiving the data. For the data to be considered usable by the project, the streaming file must be connected to its re-engineered database, allowing the "data download" to occur with a line-by-line continuity comparison to validate each record received.
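A minimal sketch of how such an interim script-file capture and continuity check could work follows. The record layout, field order, and pipe delimiter here are illustrative assumptions, not the actual ATSDR file specification; the source states only that each record carries a date/time received, a delimited response string, and the survey name.

```python
import io

# Hypothetical interim capture file: one pipe-delimited record per line.
# The actual ATSDR field layout and delimiter are not specified in the source.
raw_log = io.StringIO(
    "2006-09-14 10:22:31|WSUS|5,4,3,5,Yes\n"
    "2006-09-15 08:05:10|TPUS|4,4,4,2,No\n"
)

def parse_records(stream):
    """Split each captured line into (timestamp, survey name, responses)."""
    records = []
    for line in stream:
        timestamp, survey, answers = line.rstrip("\n").split("|")
        records.append((timestamp, survey, answers.split(",")))
    return records

def continuity_check(records, db_rows):
    """Line-by-line comparison: every streamed record must match the
    corresponding row downloaded from the re-engineered database."""
    return len(records) == len(db_rows) and all(
        rec == row for rec, row in zip(records, db_rows)
    )

records = parse_records(raw_log)
db_rows = list(records)  # simulated download from the re-engineered database
```

A record that fails the comparison would be flagged for review rather than counted in the quarterly analysis.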


  3. Use of Improved Technology and Burden Reduction


CDC and ATSDR continue to implement the President's e-Government initiatives, addressing IT infrastructural requirements through capital planning and investment controls (CPIC), using Consolidated Health Informatics (CHI) to assure standardized health vocabulary and messaging, and further addressing OMB and HHS Information Quality Guidelines. During 2005-06, ATSDR began migrating its computer infrastructure onto the CDC mainframes and networks to garner greater customer access to website portals, increased reliability, and improved capacity for future upgrades. With advanced mapping technologies, our customers and evaluation project respondents should experience these infrastructure advances through a vastly improved look, feel, and speed when they navigate our website. Over time, these actions will greatly reduce the amount of time spent searching for information on the ATSDR site, reducing the burden on our customers and on the subset of respondents to the ATSDR user satisfaction surveys.



  4. Efforts to Identify Duplication and Use of Similar Information


As previously cited, ATSDR used another web survey titled "Agency for Toxic Substances and Disease Registry Customer Satisfaction Survey," Form Approved OMB No. 0920-0449. This survey was approved for three years in 1999 and extended through December 31, 2003. The data collected from this survey were used to improve many facets of the ATSDR website. However, this project ascertained that the survey might have been too lengthy, resulting in a much lower than expected return rate. The surveys in this evaluation project are much shorter and require five minutes or less to complete. ATSDR staff replaced the previous survey, which expired on September 30, 2003. Currently, these user satisfaction surveys are the only ongoing way of obtaining feedback from web customers.



  5. Impact on Small Businesses or Other Small Entities


ATSDR has designed this site to serve the general public, persons at risk for exposure to hazardous substances, collaborating organizations, state and local governments, and health professionals. ATSDR has taken steps to improve the speed and functionality of its website to reduce the time spent navigating the site, both for all who visit and for those visitors who voluntarily choose to respond to one or more of our user satisfaction surveys. ATSDR staff are keenly aware that the environmental health information and educational products we disseminate are of interest to select segments of the business community. As cited in A.4, our previously approved surveys are designed to use the fewest questions possible, which benefits all visitors to our website.



  6. Consequences of Collecting the Information Less Frequently


ATSDR evaluators envision that respondents will complete the survey once a year. The surveys will remain active throughout each of the next three years. Data collected from the surveys will be analyzed every three months. The survey response process is anonymous; no personal identifiers are collected. Therefore, some participants could voluntarily choose to complete the survey on more than one occasion. Structuring the survey process differently would require the addition of personal identifier fields and the use of passwords to limit or control access to the ATSDR website. The data collected are meant only to gather customer insights about select areas of our website.


Information on the ATSDR website is continually updated and augmented as new educational materials are produced, environmental statistical surveillance activities are reported, and other data areas are updated. Again, respondents will only be asked to respond once. There will be no planned effort to gather multiple responses, and no incentive will be offered for completing the survey more than once.


  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


This data collection fully complies with the guidelines of 5 CFR 1320.5.


  8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


8a. A 60-day notice was published in the Federal Register on December 20, 2006, Volume 71, No. 244, Page 76341 (Attachment – E). Attachment – F contains the one public comment received in response to the 60-day FRN and ATSDR's response to that comment, in accordance with OMB PRA-ICR instructions.


8b. As previously cited in ATSDR's original application, the methodology employed in this evaluation is standardized; outside consultation was conducted with staff at the National Institutes of Health (NIH) who have done similar research and with CDC staff who have conducted similar projects. The survey is designed in a manner consistent with surveys used by the U.S. Department of Health and Human Services. This survey was initially prepared with advice and counsel from the ATSDR Website Content Committee, drawing on its cognizance of similar efforts to evaluate Internet utilization (page-hit) performance from 1997 to 2003 and on the CDC Office of Communications' review of the results from the ATSDR customer satisfaction survey for the period from 1999 to 2003. ATSDR staff also consulted with Nuclear Regulatory Commission (NRC) staff to discuss the questions posed by the NRC customer satisfaction survey. Based on this discussion, ATSDR staff concluded that the NRC survey questions would suffice for the desired information to be collected.


  9. Explanation of Any Payment or Gift to Respondents


No payments or gifts will be provided to survey respondents.


  10. Assurance of Confidentiality Provided to Respondents


CDC Privacy staff has determined that the Privacy Act does not apply to this data collection. Each survey will be linked to ATSDR’s Privacy Policy Notice (see Attachment – M).


This evaluation project does not collect sensitive or personally identifiable information. The respondents sought by this project are subsets of those individuals who visit the ATSDR website and home pages. These may include the general public; members of local, state, or federal government; academia; or public health professionals. Respondents are not asked to provide their name, address, or any other identifiable information in this survey. Identifying information could potentially be received only from respondents who choose to send ATSDR additional comments, via e-mail, beyond the requirements of the survey instrument. E-mail addresses received in this manner will not be linked to any survey, even if the comments reference the survey concepts and processes. Survey responses will be submitted electronically through ATSDR's website interface, filed and retrieved by date and time received, with access limited to the project team. Respondents will be advised of the privacy of their responses on the "ATSDR Web Site User Satisfaction Survey" and on each of the previously approved home-page-linked surveys. Respondents will be told that they are not required to respond to each question and that they may leave any question blank.


Data will be treated in a secure manner, unless otherwise compelled by law. All staff working on the project will agree to safeguard the data and not make unauthorized disclosures. Data will be safeguarded in accordance with applicable statutes. Responses in published reports will be presented in aggregate form.


  11. Justification for Sensitive Questions


The "ATSDR Web Site User Satisfaction Survey" and the replicate product-specific survey tools ask no questions that are considered "sensitive" in nature. Each survey was designed to address the select questions previously presented in section A.2, which focus on speed of access, ease of finding needed information, and readability of the information presented. With this concept in mind, adding questions to assess race and ethnicity data far exceeds the needs of this project. This project was initially approved (1999), extended (2003), and subsequently reinstated in 2005 without incorporating these extra data points into the process and project outcomes. Therefore, we seek continued exemption under the HHS Policy for Inclusion of Race and Ethnicity in DHHS Data Collection Activities.


  12. Estimates of Annualized Burden Hours and Costs


A.12.A: As cited previously in our original application, ATSDR's website survey was developed with the collaborative assistance of NIH and CDC. Each group reports performing similar survey research with home-page evaluations of websites, and the "ATSDR Web Site User Satisfaction Survey" design is consistent with surveys used by the U.S. Department of Health and Human Services. As part of our previous application, ATSDR evaluated its previous customer satisfaction survey results, garnered from 1999 to 2003, to determine the level of response achieved and to ascertain likely survey outcomes if the instrument contained fewer questions. ATSDR's website content committee also recommended instrument and database changes that would further reduce the burden on small businesses and other small entities. Web page access data for 1999 to 2003 were also analyzed to determine respondent-rate potentials for the recently approved product-specific replicate surveys.


The "ATSDR Web Site User Satisfaction Survey" and the five replicate product-specific surveys will be used, with approved changes, throughout the proposed three-year reinstatement period. The surveys will continue to be posted on the ATSDR website with links from the ATSDR home page, the "A-Z page," and the product-specific home pages that were selected. ATSDR staff envision that respondents will complete each survey once a year. The surveys will remain active throughout the year. Data collected from the surveys will be analyzed every three months. The survey response process is anonymous; no personal identifiers will be collected. The data collected are meant only to gather customer insights about select areas of our website. Table A.12.A illustrates our estimated annual burden hours for this proposed extension period. At this time, we estimate that approximately 80% of respondents are individuals/households, 10% represent the private sector, and 10% are from state, local, and tribal governments.


Table A.12.A: Estimated Annual Burden Hours


Respondents & Percent of Form Use    Form Name   No. of        Responses per   Hours per   Response Burden
                                                 Respondents   Respondent      Response    (hours)

ATSDR Website Visitors (50%)         WSUS        1000          1               5/60        83
ATSDR Website Visitors (15%)         TPUS        300           1               5/60        25
ATSDR Website Visitors (15%)         TFUS        300           1               5/60        25
ATSDR Website Visitors (5%)          PHSUS       100           1               5/60        8
ATSDR Website Visitors (8%)          TCCUS       160           1               5/60        13
ATSDR Website Visitors (7%)          TP-CDUS     140           1               5/60        12
Total                                                                                      166
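The burden figures in Table A.12.A follow from multiplying each survey's respondent count by one response at 5/60 hour, rounded to whole hours. A quick check (survey counts taken from the table):

```python
# Burden hours = respondents x 1 response x 5 minutes (5/60 hour),
# rounded to the nearest whole hour, as in Table A.12.A.
surveys = {"WSUS": 1000, "TPUS": 300, "TFUS": 300,
           "PHSUS": 100, "TCCUS": 160, "TP-CDUS": 140}

burden = {name: round(n * 5 / 60) for name, n in surveys.items()}
total_hours = sum(burden.values())  # 83 + 25 + 25 + 8 + 13 + 12 = 166
```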


A.12.B There are no direct out-of-pocket costs to respondents for their participation in the survey. To calculate the estimated annual respondent cost, we used the mean hourly wage rate for all workers as determined by the Bureau of Labor Statistics National Compensation Survey, October 2006 ($16.91). We selected this wage rate because it best addresses the skill classifications of the general public (e.g., blue-collar, white-collar, full-time, part-time, union, nonunion) across all geographic locations. The respondent costs would total $2,818.34, as shown in Table A.12.B.


Table A.12.B: Estimated Annual Respondent Cost


Respondents & Percent of Form Use    Form Name   No. of        Burden per           Hourly      Response Burden
                                                 Respondents   Respondent (hours)   Wage Rate   (annual cost)

ATSDR Website Visitors (50%)         WSUS        1000          5/60                 $16.91      $1,409.17
ATSDR Website Visitors (15%)         TPUS        300           5/60                 $16.91      $422.75
ATSDR Website Visitors (15%)         TFUS        300           5/60                 $16.91      $422.75
ATSDR Website Visitors (5%)          PHSUS       100           5/60                 $16.91      $140.92
ATSDR Website Visitors (8%)          TCCUS       160           5/60                 $16.91      $225.47
ATSDR Website Visitors (7%)          TP-CDUS     140           5/60                 $16.91      $197.28
Total                                                                                           $2,818.34
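Each row of Table A.12.B is the survey's respondent count times 5/60 hour times the $16.91 hourly wage, rounded to cents; the total sums the rounded row figures. A quick check:

```python
# Annual respondent cost per survey = respondents x (5/60 hour) x $16.91/hour,
# rounded to cents, as in Table A.12.B.
WAGE = 16.91
surveys = {"WSUS": 1000, "TPUS": 300, "TFUS": 300,
           "PHSUS": 100, "TCCUS": 160, "TP-CDUS": 140}

costs = {name: round(n * 5 / 60 * WAGE, 2) for name, n in surveys.items()}
# The table total sums the rounded per-row figures.
total_cost = round(sum(costs.values()), 2)  # $2,818.34
```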


  13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers


There are no costs (capital, start-up, etc.) for respondents to participate.


  14. Annualized Cost to the Government


The estimated annual cost to the federal government would be $9,001.00. Staff salary estimates are based on the average full-time salary for Atlanta-based staff at the GS-13, Step 5 level ($85,874.00). The breakdown of the cost is as follows:


$ 5,934 Staff salary for re-engineering of the database to “e-Government” specs, certifying data integrity and security of the data collection (6% FTE)

$ 1,978 Staff salary for compiling data, and analyzing data (2% FTE)

$ 989 Technical support and consultation (1% FTE)

$ 100 Supplies (Paper, printing costs, etc)

$ 9,001 Total estimated costs


  15. Change in Burden Hours


Table A.12.A above reflects the level of respondent activity and the estimate of annual burden hours set for this project. In 2005, an I-83C form was submitted that incorrectly represented the burden hours when it requested the addition of the four replicate product-specific surveys described in that document. Prior to this request, the total annual burden was 83 hours. The 2005 effort added the product-specific surveys, increased the number of respondents from 1,000 to 2,000, and reported a reduction of 59 annual burden hours from the record current at that time. Increasing the number of respondents from 1,000 to 2,000 should instead have increased the total annual burden to 166 hours (83 hours x 2). Furthermore, the I-83C form reported four replicate surveys, though five were produced; the fifth survey might have been left off because of space limitations on the form. This reinstatement effort acknowledges the fifth product-specific replicate survey and changes the total annual burden hours to 166 without increasing the number of respondents reported. The burden table, A.12.A, corrects the record to reflect the continued use of each of the six surveys currently in our project inventory. The respondent estimates were based on a voluntary acceptance rate of at least 1% of the visitors accessing each product page; these assumptions were based on the total number of webpage hits reported for the ATSDR homepage and each product-specific link page, tabulated in 2004. This statement will also be added to the ICR worksheet and ICRAS 4 (electronic form) to correct this calculation error.



  16. Plans for Tabulation, Publication, and Project Time Schedule


Tabulation

The assessment strategies for this data collection cited in ATSDR's previous application will continue through the proposed project period. As responses are received, each record will be downloaded to an "e-Government"-compliant database and analyzed on a quarterly basis, sooner than previously cited. The database will be randomly accessed to ensure data integrity and continuity at least six times per project year. Tabulation plans for each survey type are similar. Data will most likely be analyzed using SAS, Microsoft Access, and/or Microsoft Excel, using primarily descriptive statistics (e.g., frequency counts, measures of central tendency, and cross-tabulations) and graphical display. Generalizability to a broader population is not intended for this data collection. The following table shells are samples of possible layouts for presenting the results from the aggregate analysis.
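As a sketch of the planned descriptive analysis, the frequency and cross-tabulation table shells could be filled as follows. The question names and response values below are hypothetical illustrations; the source specifies only the analytic tools and the types of statistics.

```python
from collections import Counter

# Hypothetical survey responses; actual field names and rating scales
# come from the approved instruments, not from this sketch.
responses = [
    {"useful": "High", "easy_to_locate": "Yes"},
    {"useful": "Medium", "easy_to_locate": "No"},
    {"useful": "High", "easy_to_locate": "Yes"},
    {"useful": "Superior", "easy_to_locate": "Yes"},
]

# Frequency count and percent of total (Sample 1 table shell).
freq = Counter(r["useful"] for r in responses)
percent = {k: 100 * v / len(responses) for k, v in freq.items()}

# Cross-tabulation (Sample 2 table shell).
crosstab = Counter((r["useful"], r["easy_to_locate"]) for r in responses)
```

The same counts could equally be produced in SAS, Access, or Excel, the tools the project names.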


Sample 1: “Q: Information was useful to your needs?”



            Frequency    Percent of Total
Low
Medium
High
Superior
Total




Sample 2: “Q: Was Information on Topic Available?” by
“Q: Was Information Easy to Locate?”



            Yes    No    N/A
Low
Medium
High
Superior
Total





Publication

Study results will be used internally by ATSDR to improve the usefulness of the Website. No effort to generalize or externally publish any data or information from this collection is planned.


Project Time Line


Resources permitting, the estimated project time line is below.


Project Time Schedule

Year 1

Activity: All surveys will be deactivated prior to the 2-28-07 expiration date
Schedule: All surveys were deactivated

Activity: Submit remaining "e-Government" initiative database realignment requests for the TCCUS, TP-CDUS, and WSUS surveys
Schedule: Current action during the OMB approval process

Activity: Test ITSO's modified data collection for the above-mentioned surveys
Schedule: Current action during the OMB approval process

Activity: Maintain "live status" for data collection for the five product-specific surveys
Schedule: Current action during the OMB approval process

Activity: Test databases for continuity after "e-Government" initiatives have been completed
Schedule: Per ITSO database completion release schedule

Activity: Implement monthly surveillance activities (by ITSO) to ensure that the modified collection approach (script capture) is achieving project goals
Schedule: During OMB approval or up to 1-3 months after OMB approval

Activity: Receive clearance to reactivate the WSUS survey to collect visitor comments on "e-Government" changes
Schedule: During OMB approval or up to 1-3 months after OMB approval

Activity: Stream script-capture data into re-engineered databases for each survey (upon ITSO release)
Schedule: 2-6 months after OMB approval

Activity: Quarterly download and analysis of data collections
Schedule: 5-9 months after OMB approval

Activity: Provide bi-annual agency reports on results of analysis of each data collection
Schedule: 6-10 months after OMB approval, then every six months

Years 2 and 3

Activity: Maintain live survey status and continuous data collection for all surveys
Schedule: 365 days per year

Activity: Continue quarterly download and analysis of data collections
Schedule: On-going from Year 1

Activity: Team member staff training on project activities
Schedule: On an as-needed basis

Activity: Provide bi-annual agency reports on results of analysis of each data collection
Schedule: On-going from Year 1



  17. Reason(s) Display of OMB Expiration Date Is Inappropriate


The OMB expiration date will be displayed on the data collection instruments.



  18. Exceptions to Certification for Paperwork Reduction Act Submission


No exceptions are being requested or pursued.



B. STATISTICAL METHODS


  1. Respondent Universe and Sampling Methods


The project will seek its survey respondents from the consumer base that visits ATSDR's website, its homepage, and the product-specific linked pages selected for sampling within this project. Prominently placed survey announcement links will be positioned near the enlarged webpage title (such as a product title, on one of the selected product-specific homepages) or in some other prominent location on the ATSDR website homepage. Reading the short announcement phrase and clicking on the link will take the respondent to one of the six surveys proposed by this project. Choosing to complete the survey is voluntary. Each survey will produce an anonymous record. Completing the survey and clicking the "Submit" button will create a record, which implies that the respondent has consented to voluntary participation in this survey project.



  2. Procedures for the Collection of Information


Analyses of the information collected will answer questions such as "Were the home and link pages accessible?" "Were the link pages useful, with up-to-date information?" "Was it easy to find the information you needed?" and "Was the content clearly written in plain language?" In addition, results from these data are likely to assist us in answering more far-reaching questions, including: "Was the Web page load/response time satisfactory?" "Did you receive a prompt response to requests submitted via the Web?" and, finally, "What improvements or revisions should be made, if any?" Continuation of the project and use of the information collected will substantially increase our output evaluation capability and help address broader programmatic questions in the future.


As cited in section A.2 herein, CDC/ATSDR's efforts to achieve the President's "e-Government" initiatives have presented unforeseen challenges for this project in the form of webpage activation delays for some of the product-specific surveys. The ATSDR website homepage survey was temporarily removed from active status, and most of the database structures were declared inappropriate for realignment with CDC; some of the database substructures failed the new data security tests. All project databases have been returned to the ATSDR database test server awaiting re-engineering assignments. The ATSDR WSUS survey will require back-end reconstruction in order to capture data correctly under the new CDC IT approach. Four of the five product-specific surveys were returned to live status on the ATSDR website in August 2006 with an interim fix for capturing survey data. Until the databases are re-engineered, the interim fix will stream captured data into script files similar to those used for "Listserv Activity Logs." This method applies to data collected from August 2006 through February 2007, the expiration date for each survey. Each participant survey record is stored with the date and time received, a delimited string of participant responses, and the name of the survey receiving the data. For the data to be considered usable by the project, the streaming file must be connected to its re-engineered database, allowing the "data download" to occur with a line-by-line continuity comparison to validate each record received.


Normally, the survey data collected would be downloaded quarterly for analysis. During the analysis, if respondent comments contain particularly insightful suggestions or pointed negative concerns, the project team will release a “Quarterly Spot Report” reflecting the insights gained. If a message of a distressing nature were received, indicating possible harm to self or others, it would be passed on to management for immediate consideration of appropriate action. Spot reports can also be used to report survey activity counts and other basic calculations, but without full analytical discussion or summative findings. Semiannually, data retrieved from each survey would be fully analyzed, to the extent allowed by project design goals and objectives, with summary findings prepared for management review and program improvement. Ultimately, copies will be forwarded to ATSDR’s Website Development Committee to support future website designs and IT resource enhancements.


  3. Methods to Maximize Response Rates and Deal with Non-response


As previously cited in our reinstatement application, the protocol described herein was developed to maximize the response rate among potential participants. Because the survey is expected to take only five minutes to complete, its brevity is itself expected to act as an incentive for completion. The survey will be presented to site visitors as a way of contributing their input to the site’s improvement.


  4. Tests of Procedures or Methods to Be Undertaken


This survey has been tested among staff for readability, comprehensibility, and time to complete. The procedures are standard for numerous Internet-based customer satisfaction studies. In fact, the survey is very similar to one approved by OMB for the Nuclear Regulatory Commission (OMB No. 3150-0197). The ATSDR Web Site User Satisfaction Survey and the product-specific linked surveys will receive no additional testing or evaluation beyond the reviews stated above and the agency clearance requirements applied to previously approved OMB clearance documents.



  5. Individuals Consulted on Statistical Aspects

The individuals listed below are members of the project team and are actively involved in project accountability, instrument design, and the statistical aspects relevant to the objectives of this project.

James E. Tullos, Jr., BS, MS
Public Health Advisor (Project POC)
Environmental Medicine Team
Environmental Medicine and Education Services Branch
Division of Toxicology and Environmental Medicine
Agency for Toxic Substances and Disease Registry
Phone: 770-488-3498
E-mail: JTullos@cdc.gov


Cassandra V. Smith, BS, MS
Environmental Health Scientist
Applied Toxicology Branch
Division of Toxicology and Environmental Medicine
Agency for Toxic Substances and Disease Registry
Phone: 770-488-3319
E-mail: CVSmith@cdc.gov


Ed Murray, PhD
Chief, Applied Toxicology Branch
Division of Toxicology and Environmental Medicine
Agency for Toxic Substances and Disease Registry
Phone: 770-488-3317
E-mail: EMurray@cdc.gov


Michael T. Hatcher, DrPH (Project Consultant)
Chief, Environmental Medicine and Education Services Branch
Division of Toxicology and Environmental Medicine
Agency for Toxic Substances and Disease Registry
Phone: 770-488-3489
E-mail: MHatcher@cdc.gov


Renee Peters
Computer Programmer I/Lock Mart
Applied Toxicology Branch
Division of Toxicology and Environmental Medicine
Agency for Toxic Substances and Disease Registry
Phone: 770-488-3329
E-mail: RPeters1@cdc.gov






References



“President’s Management Agenda”; Executive Office of the President, Office of Management and Budget; 2002; accessed 2 December 2006; available online at http://www.whitehouse.gov/omb/budget/fy2002/mgmt.pdf

“The E-Government Act of 2002” (H.R. 2458/S. 803); Executive Office of the President, Office of Management and Budget; accessed 2 December 2006; available online at http://www.whitehouse.gov/omb/egov/g-4-act.html

“Federal Enterprise Architecture (FEA)”; Office of Management and Budget; 2002; accessed 2 December 2006; available online at http://www.whitehouse.gov/omb/egov/a-1-fea.html

“Consolidated Reference Model (CRM) Version 2.0”; Office of Management and Budget; 2006; available online at http://www.whitehouse.gov/omb/egov/documents/FEA_CRM_v20_Final_June_2006.pdf

“Policy Statement on Inclusion of Race and Ethnicity in DHHS Data Collection Activities”; 1997; accessed 31 May 2007; available online at http://aspe.hhs.gov/datacncl/inclusn.htm



List of Attachments:


  1. Authorizing Legislation – CERCLA

  2. ATSDR Cooperative Agreement Program Description

  3. ATSDR Vision, Mission, and Strategic Goals

  4. Government Performance Results Act of 1993 (GPRA)

  5. 60-day Federal Register Notice

  6. Public Comments to the 60-day FRN and ATSDR Response to Public Comments

  7. ATSDR Web Site User Satisfaction Survey (WSUS)

  8. Toxicological Profiles User Satisfaction Survey (TPUS)

  9. ToxFAQs™ User Satisfaction Survey (TFUS)

  10. Public Health Statements (PHS) User Satisfaction Survey (PHSUS)

  11. Toxicology Curriculum for Communities Training Manual User Satisfaction Survey (TCCUS)

  12. ToxProfiles™ CD-ROM User Satisfaction Survey (TP-CDUS)

  13. ATSDR Website Privacy Policy Notice




