Supporting Statement for the National Child Traumatic Stress Initiative Evaluation
JUSTIFICATION
1. Circumstances of Information Collection
Summary
The Center for Mental Health Services (CMHS), Substance Abuse and Mental Health Services Administration (SAMHSA), is requesting clearance for revised data collection associated with the National Child Traumatic Stress Initiative (NCTSI) Evaluation (OMB No. 0930-0276). The purpose of this program is to improve access to quality care for children and adolescents who have experienced traumatic events, their families, and communities throughout the U.S. The Children’s Health Act of 2000 (Public Law 106–310) authorizes Federal funding for public and nonprofit private entities, as well as for Indian tribes and tribal organizations, to develop programs focusing on the behavioral and biological aspects of psychological trauma response and to develop knowledge regarding evidence-based practices for treating psychiatric disorders of children and youth resulting from witnessing or experiencing a traumatic event. Under this legislation, funding has been set aside for grantees to develop, evaluate, and improve programs and interventions for trauma-exposed children and youth; the legislation also mandates that the effectiveness of these programs be evaluated and the findings reported.
The NCTSI mission is carried out by the National Child Traumatic Stress Network (NCTSN or Network), a science-to-practice, collaborative network of over 100 current and previously funded grantees, or centers, that combines resources from academic institutions, hospitals, community-based agencies, schools, and other entities to develop and promote effective community practices for children and adolescents exposed to a wide array of traumatic events.
From its inception, a strong evaluation component has been incorporated into the NCTSI program, initially focused on evaluating individual grantees. To obtain consistent data from all sites and to meet program needs for Government performance reporting, a national evaluation has been implemented for the NCTSI program. This evaluation is designed to assess the effectiveness of NCTSI as a whole, including the initiative’s impact as a national resource for enhancing the standard and quality of care for children and their families affected by traumatic stress. Developed with extensive stakeholder input, this ongoing evaluation assesses the various NCTSI program functions, impact, and outcomes related to multiple domains of grantee and NCTSI program activity.
While the existing national evaluation provides information on key domains of activity for the NCTSI program, it is critical that future evaluation consider the evolution of the NCTSI program and the development of various national evaluation and performance monitoring mechanisms that have become focused on the NCTSI program over the last six years, define updated evaluation priorities for the NCTSI program, and streamline evaluation efforts to remove outdated elements and ensure efficient yet comprehensive evaluation.
The proposed data collection activities will continue some of the previously cleared data collection efforts, discontinue others, and expand to include other data collection activities that are closely aligned with Leading Change: A Plan for SAMHSA's Roles and Actions 2011–2014, SAMHSA’s recently released plan for achieving the goals of the agency’s eight new strategic initiatives (see Section A2.e for additional detail). For example, the evaluation assesses NCTSI program progress in addressing SAMHSA strategic initiatives focused on improving the quality of behavioral health treatment for military families and on reducing the impact of trauma by integrating trauma-informed approaches throughout health and behavioral health care systems and diverting youth with mental disorders from juvenile justice systems into trauma-informed treatment and recovery.
Specifically, as a result of efforts to address updated evaluation priorities, reduce redundancy and consolidate multiple data collection efforts, the request proposes to discontinue ten surveys, forms or interviews that are currently OMB-approved (see Table 1(a)). In place of the ten surveys, forms or interviews, and as part of the redesigned evaluation, eight new data collection efforts are proposed (see Table 1(b)). This request also proposes to expand the currently OMB-approved methodology for the Core Data Set outcome study that samples 100 children per center per grant cycle and to limit the collection to the period of time while the client is receiving treatment. The specific request is to expand the outcome measures of the Core Data Set (Attachment C) to all clients receiving direct mental health services in NCTSI-funded centers. Administering the follow-up assessment (which occurs at three-month intervals, as before) to all children and youth receiving services will allow for a more comprehensive understanding of how treatment is beneficial through analysis of longitudinal data across subgroups of children, trauma experiences, and treatments received. This request also proposes to enhance the existing Core Data Set by revising the Core Clinical Characteristics Forms and adding new instruments (see Table 1(b)). Finally, this request proposes to continue some data collection using instruments that are OMB-approved (see Table 1(c)). The proposed changes are described with additional context and background, and in more detail, throughout the statement, particularly in Sections A2.c and A2.d. They are also highlighted in Tables 1(a), 1(b) and 1(c) below.
Table 1(a). Currently OMB-Approved Data Collection: Propose to Discontinue
Instruments | Relationship to Revised NCTSI Evaluation Design
Youth Services Survey for Families (YSS-F) | Satisfaction data from the TRAC system instead will be analyzed for evaluation purposes
Provider Trauma-informed Services (TIS) Survey | TIS as a construct will be assessed differently through newly proposed data collection (ETSC Survey)
Product/Innovation Development and Dissemination Survey (PDDS); Case studies; Workgroup coordinator interviews | Elements of the PDDS are included in the newly proposed data collection (OPMR)
General Adoption Assessment Survey (GAAS); Adoption/Implementation Factors Interview (AIFI) | Elements of the GAAS and AIFI are included in the newly proposed data collection (ETSC Survey and OPMR)
Network Survey; Child Trauma Partnership Tool (CTPT) | Elements of the Network Survey and CTPT are included in the newly proposed data collection (OPMR)
Data collected through NREPP | None
Table 1(b). Newly Proposed Data Collection
Proposed New Instruments | Key Domains of Instrument | SAMHSA Goals
Trauma Symptom Checklist for Young Children (TSCYC) (Attachment C.8) (Add to Core Data Set) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
Parenting Stress Index Short Form (PSI-SF) (Attachment C.9) (Add to Core Data Set) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Military Families Initiative
Children’s Depression Inventory-2 Short (CDI-2S) (Attachment C.10) (Add to Core Data Set) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
Global Appraisal of Individual Needs Modified Short Screener (GAIN-MSS) (Attachment C.11) (Add to Core Data Set) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
Evidence-based Practice and Trauma-informed System Change Survey (ETSC) in versions for administrators (Attachment D.1) and providers (Attachment D.2) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
Training Sign-in Sheet (TSIS) (Attachment H) | Participants provide contact information and indicate whether they are willing to be contacted later about the ETSC Survey | This brief form simply facilitates participation in the ETSC Survey (see above)
Online Performance Monitoring Report Form (OPMR) for funded NCTSI centers (Attachment E) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Military Families Initiative; Goals for the Evaluation
Sustainability Survey for funded (Attachment I.1) and affiliate centers (Attachment I.2) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative
Table 1(c). Currently OMB-Approved Data Collection: Propose to Revise/Continue
Approved Instruments | Key Domains | SAMHSA Goals
Core Clinical Characteristics (Baseline Assessment Form) (Attachment C.1) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation; Military Families Initiative
CBCL 1.5-5 and CBCL 6-18 (Achenbach, 2001; Achenbach & Rescorla, 2000) (Attachments C.4-C.5) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
TSCC-A (Briere, 1996), abbreviated for NCTSI (Attachment C.7) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
UCLA-PTSD (Rodriguez, Steinberg, et al., 1999) (Attachment C.6) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
Core Clinical Characteristics (Baseline Assessment Form) and Core Clinical Characteristics (Follow-up Assessment Form) (Attachments C.1 & C.2) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
Core Clinical Characteristics (General Trauma Information Form) and Core Clinical Characteristics (Trauma Detail Form) (Attachment C.3) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation
Training Summary Form (TSF) (Attachment G) | | Trauma and Justice Initiative; Data, Outcomes and Quality Initiative; Goals for the Evaluation. Complete information is needed, particularly because the TSF provides data to assist SAMHSA in reporting on the GPRA indicator designed to assess the number of service providers who receive trauma-focused training and increases in the number trained over time.
National Impact Survey (retitled “National Reach Survey”) (Attachment F) | | Trauma and Justice Initiative
a. Background
Children’s experience of trauma and trauma-related disorders can occur as a result of events including child maltreatment, physical abuse, sexual abuse, and neglect; witnessing or experiencing community, domestic, or school violence; violent crimes such as kidnapping, rape, or murder of a parent; accidents and injury; terrorist acts or war-related events; witnessing or experiencing natural disasters such as floods, hurricanes, and fires; displacement and refugee trauma; medical trauma; and others (Pfefferbaum, 1997; Perrin, Smith, & Yule, 2000). Despite significant efforts aimed at prevention over the past 30 years, child abuse remains the most common type of major childhood trauma, and its impact in society is pervasive (Chadwick Center for Children and Families, 2004).
Recently, the National Research Council and the Institute of Medicine (2009), in the report Preventing Mental, Emotional, and Behavioral Disorders Among Young People: Progress and Possibilities, urged the Federal Government to make preventing mental, emotional, and behavioral disorders and promoting mental health in young people a national priority. Because child trauma often results in intense suffering and can have life-altering negative consequences and far-reaching impact, it is critically important to identify children in need of trauma-informed therapy early and to connect them with high-quality mental health treatment and resources. In recent years, a number of effective clinical approaches and interventions have been developed and tested for use in treating children and adolescents who have experienced trauma. Recent publications addressing the status of child trauma research (Chadwick Center for Children and Families, 2004; Saunders, Berliner, & Hansen, 2004; Mash & Hunsley, 2005; Wilson & Saunders, 2005; Amaya-Jackson & DeRosa, 2007; Ford & Cloitre, 2009) describe significant progress in this regard. For example, researchers have begun to gather evidence about which treatment strategies are the most effective for different types of trauma. However, while the evidence base for treatment and intervention is growing, the gap between what has been identified by researchers and leading practitioners in the field of child traumatic stress as effective and what clinicians working in community mental health centers and other providers actually practice is so wide that multiple reports have called it a “chasm” (Chadwick Center for Children and Families, 2004; National Research Council & Institute of Medicine, 2009).
As research and recent national reports have suggested, without a coordinated and sustained effort to address the gaps in children’s mental health research and treatment, many children will miss an opportunity for care and recovery from traumatic experiences, as well as a chance “to live, work, learn, and participate fully in their communities” (New Freedom Commission on Mental Health, 2003, p. 1).
The Donald J. Cohen National Child Traumatic Stress Initiative (NCTSI) is a national initiative to bridge the gap between research and practice in the child trauma field. The mission of the NCTSI is to raise the standard of care and improve access to services for trauma-exposed children and adolescents, their families, and communities throughout the United States. The program was authorized on October 17, 2000, under the Children’s Health Act of 2000 (Public Law 106–310) and has been further informed and guided by the final report of the New Freedom Commission on Mental Health (2003). NCTSI also addresses Healthy People 2010 focus area 18: Mental Health and Mental Disorders, which sets as a national goal the expansion of treatment for children with mental health problems including psychopathology resulting from exposure to traumatic experiences.
The NCTSI mission is carried out by the National Child Traumatic Stress Network (NCTSN or Network), a science-to-practice, collaborative network of grantees, or centers, that combines resources from academic institutions, hospitals, community-based agencies, schools, and other entities to develop and promote effective community practices for children and adolescents exposed to a wide array of traumatic events. Among other activities, the centers implement trauma-informed clinical interventions, collect data for quality improvement and clinical purposes, disseminate information about child trauma and effective interventions, provide training on child trauma for service providers within and beyond the Network, and facilitate collaboration among child-serving providers and systems. Through such activities, particularly through best-practice service delivery for children representing diverse geographic, demographic, and clinical characteristics as well as continuous evaluation of such practices, the NCTSI program has great potential to expand the limited evidence base on children’s experience of trauma and translate science to practice, thereby closing the quality chasm. Since 2001, competitive funding for individual centers has been awarded by SAMHSA CMHS. The NCTSI program has been administered through CMHS’s Emergency Mental Health and Traumatic Stress Services Branch.
The centers that make up the Network fall into three distinct categories:
Category I—The National Center for Child Traumatic Stress includes two lead grantees that collaborate with SAMHSA to serve as the Network’s national coordinating center. This center mainly provides oversight and coordination of NCTSI activities.
Category II—Treatment and Service Adaptation Centers provide national expertise regarding trauma-specific treatments and interventions for diverse clinical and demographic populations. These centers specifically support the adaptation of effective treatment and service approaches for centers that provide direct clinical services (i.e., Category III centers).
Category III—Community Treatment and Services Centers primarily provide direct mental health services to children and their families, but also implement and evaluate interventions in community-based settings.
Duke University and the University of California–Los Angeles (UCLA) have partnered to form the National Center for Child Traumatic Stress (NCCTS) since the Network’s inception in 2001; these universities were re-funded during competitive award cycles in 2006 and 2009. Meanwhile, membership of Category II and Category III centers has fluctuated as a result of changes in Federal funding and regular competitive award cycles. To date, SAMHSA has funded NCTSI grantees through either 3- or 4-year cooperative agreements. To sustain the progress accomplished by previously funded cohorts, SAMHSA began re-funding centers, including the NCCTS, in 2005. By 2010, the NCTSN was composed of 62 funded centers and numerous Category II and Category III alumni. Alumni members are previously funded grantees that received awards in 2001, 2002, 2003, 2004, or 2005 (or individuals who represent previously funded centers) and that continue to participate actively in NCTSI activities.
From its inception, a strong evaluation component has been incorporated in the NCTSI program, initially focused on evaluating individual grantees. To obtain consistent data from all sites and to meet program needs for Government performance reporting, a national evaluation has been implemented for the NCTSI program. This evaluation is designed to assess the effectiveness of NCTSI as a whole, including the Network’s impact as a national resource for enhancing the standard and quality of care for children and their families affected by traumatic stress. Developed with extensive stakeholder input, this ongoing evaluation assesses the various NCTSI functions, impact, and outcomes related to multiple domains of grantee and NCTSI activity. While the existing national evaluation provides information on key domains of activity for the NCTSI, it is critical that future evaluation consider the evolution of the NCTSI and the development of various national evaluation and performance monitoring mechanisms that have become focused on the NCTSI over the last six years, define updated evaluation priorities for the NCTSI program, and streamline evaluation efforts to remove outdated elements and ensure efficient yet comprehensive evaluation.
The proposed data collection activities will continue some of the previously cleared data collection efforts, discontinue others, and expand to include other data collection activities that are closely aligned with Leading Change: A Plan for SAMHSA's Roles and Actions 2011–2014, SAMHSA’s recently released plan for achieving the goals of the agency’s eight new strategic initiatives (see Section A2.e for additional detail). For example, the evaluation assesses NCTSI program progress in addressing SAMHSA strategic initiatives focused on improving the quality of behavioral health treatment for military families and on reducing the impact of trauma by integrating trauma-informed approaches throughout health and behavioral health care systems and diverting youth with mental disorders from juvenile justice systems into trauma-informed treatment and recovery. To that end, the revised evaluation includes greater emphasis on assessing:
The implementation of evidence-based, trauma-informed clinical treatment among NCTSI centers and the facilitators and barriers to the implementation process;
Training, consultation, partnership and other activity among NCTSI centers focused on disseminating trauma-informed approaches beyond the NCTSI to organizations and agencies within major child-serving systems (i.e., child welfare, juvenile justice, mental health, primary care and education);
The impact of NCTSI training and consultation activity on changes in practice in child-serving systems, particularly the extent to which agencies and organizations in such systems become more evidence-based and trauma-informed in their routine activities; and,
The descriptive characteristics and clinical outcomes of children and adolescents provided clinical mental health services by the NCTSI, including greater focus on assessing previously underrepresented subpopulations such as military families, refugees, and children under the age of six.
To date, the NCTSI program has resulted in one of the largest data collection efforts in the world on the descriptive characteristics and clinical outcomes of trauma-exposed children, a chronically understudied area; thus, this evaluation and its findings, as well as the rigor and utility of the evaluation, are particularly critical.
b. The Need for Evaluation
The NCTSI Evaluation is essential as it offers an enhanced understanding of the way in which this innovative program impacts children’s mental health services and access to care, and ultimately, the lives of children, adolescents, and families. Evaluation data provide the information necessary for shaping and influencing program and policy development through the systematic analysis and aggregation of information across the components of large-scale initiatives, thus contributing to an understanding of overall program effectiveness. Moreover, as challenging as evaluation of large-scale multisite initiatives like the NCTSI can be, without comprehensive evaluation information, the implementation of programs cannot be monitored effectively and their expected outcomes and large-scale product dissemination may be difficult to identify.
2. Purpose and Use of Information
What follows is a description of the previously approved clearance, the clearance request revisions, and the revised NCTSI evaluation; a summary of the revisions from the previously approved package; and a description of the uses of the information collected through the evaluation.
Previously Approved Clearance
Currently, data collection for the NCTSI cross-site evaluation is operating under OMB clearance (OMB No. 0930-0276) valid until July 31, 2012. The evaluation design includes participation among NCTSI centers in one or more of eight study components that employ both qualitative and quantitative methods to comprehensively examine the impact of the NCTSI program. This evaluation provides the opportunity to advance the understanding of clinical outcomes among children served in the NCTSI, systematically assess the development and dissemination of evidence-based treatments, and examine in greater detail specific efforts and goals of the NCTSI. The eight currently approved study components are summarized in Table 2 below.
Table 2. Currently OMB-Approved Cross-site Evaluation Data Collection Activities
Cross-site Evaluation Components | Associated Instruments | Purpose and Methods
Descriptive and Clinical Outcomes | Core Data Set (CDS) | To address the Government Performance and Results Act of 1993 (GPRA) goals of increasing access to services and improving outcomes, the purpose of this study is to identify and describe the children and families served by NCTSI centers and to measure the extent to which their outcomes improve over time. This goal is supported through centers’ collection of CDS information on children receiving direct mental health services. Center intake staff collect cross-sectional descriptive data from all clients receiving direct mental health services at the point of entry into service. Specifically, descriptive study data include child and family demographic and psychosocial information, child traumatic experience information, and associated problem information. In addition, centers choose approximately 25 children per year to participate in the longitudinal clinical outcomes study. Follow-up data are collected from these children every 3 months to assess outcomes after the initial treatment begins, regardless of whether the child continues to receive services.
Consumer Satisfaction with Direct Mental Health Services | Youth Services Survey for Families (YSS-F) | Addressing the emphasis on consumer-driven care of the New Freedom Commission on Mental Health (2003), this component assesses the NCTSI’s goal of increasing access to and capacity of trauma services for children and their families with an examination of service satisfaction among clients receiving direct clinical mental health services. Specifically, this survey includes domains focused on access, participation in treatment, cultural sensitivity, appropriateness/client satisfaction, and outcomes. It is administered to caregivers of children who have received direct mental health services from an NCTSI center at the close of treatment or at 6 months into treatment, whichever occurs first.
Knowledge and Use of Trauma-Informed Services | Provider Trauma-informed Services (TIS) Survey; TIS Training Summary Form (TSF) | This component assesses the extent to which NCTSI centers enhance trauma-informed services knowledge and use among service providers affiliated with the NCTSI. NCTSI centers hold training and outreach events for service providers designed to promote evidence-based, trauma-informed practices, and they administer the TIS Survey to trainees following training events. The TIS Survey contains primarily closed-ended questions (yes/no and Likert-scale items) intended to assess the types of individuals trained by NCTSI centers, types of trainings offered, change in provider knowledge or intention to use TIS, and satisfaction with the training. Centers also collect information about the trainings provided and individuals attending the trainings by completing the TSF at each training event.
Product/Innovation Development and Dissemination | Product/Innovation Development and Dissemination Survey (PDDS); Case Studies; Workgroup Coordinator Interviews | As the NCTSI creates hundreds of products designed to improve trauma-informed practices and to support children and their families, this component evaluates NCTSI activity in promoting the development and dissemination of assessments, interventions, information and training resources, publications, and other products. Specifically, this component collects information about product development and dissemination through centers’ quarterly progress reports and the combined fourth quarter/annual report, both of which are completed by center project directors, and through a qualitative investigation which includes case studies on the development of specific products and interviews on dissemination processes.
Adoption of Methods and Practices | General Adoption Assessment Survey (GAAS); Adoption/Implementation Factors Interview (AIFI) | This component is designed to evaluate the extent to which trauma-informed practices, knowledge, methods, and products, particularly products created or disseminated by the NCTSI, are being adopted by NCTSI centers and non-NCTSI partners. The information obtained through this study enhances understanding of the pathways through which adoption and implementation occur, common barriers, and best practices leading to successful adoption and implementation. The study design consists of two stages of annual data collection: (1) a Web-based survey of all centers to identify frequently adopted, trauma-related products such as evidence-based clinical treatments, and (2) in-depth interviews with a subset of centers to collect qualitative information about factors affecting adoption and implementation.
Network Collaboration | Network Survey; Child Trauma Partnership Tool (CTPT) | This component measures the extent and nature of collaboration among centers by examining how collaboration is used as a conduit for sharing and transferring knowledge, resources, and technology to achieve NCTSI goals. Data are collected from key personnel at each NCTSI center in alternating years of the evaluation via a survey about the extent to which they interact with every other center on select key activities, such as governance and decision making; information and resource sharing; coordinating activities; product/innovation development; professional training; consumer and client training; and increasing public awareness. A second survey is administered in the off years to quantify the activities and impact of formal collaboration structures in the NCTSI.
National Impact | National Impact Survey | This component examines the extent to which the existence of the NCTSI has impacted trauma-informed services information, knowledge, policy, and practices among mental health and non–mental health child-serving agencies external to the NCTSI. The Web-based, annual National Impact Survey collects data about these agencies’ knowledge and awareness of childhood trauma and practices, about their knowledge of and connections to the NCTSI centers, and about their policies, practices, and programs targeted to children and adolescents who have been exposed to traumatic experiences.
National Registry of Evidence-based Programs and Practices (NREPP) | Data collected through NREPP | NREPP was created by SAMHSA’s Center for Substance Abuse Prevention as part of an effort to help policymakers, consumers, and providers learn more about science-based prevention programs and as a mechanism for disseminating such programs to the field. The cross-site evaluation tracks the progress of grantees in submitting practices for NREPP review as well as technical assistance provided by the centers. In addition, grantees are monitored in the field and through NREPP to confirm that evidence-based programs developed by or through the NCTSI have been submitted to NREPP for review and possible inclusion in the registry or that centers are working toward such a submission.
Clearance Request Revisions
SAMHSA is requesting approval for continuation of, and revisions to, the previously approved NCTSI evaluation package (OMB No. 0930-0276). Drawing on five years of experience with data collection for the evaluation, as well as feedback from grantees and other stakeholders, we have made improvements to the data collection instruments in order to reduce response burden, maximize the utility of the data for all stakeholders, and deepen our understanding and knowledge of particular priority areas in the field of child traumatic stress. Revisions to the NCTSI cross-site evaluation (renamed “NCTSI evaluation”) are summarized in Sections A2.c and A2.d.
NCTSI Evaluation Design and Data Collection Instruments
The evaluation has been and will continue to be focused on the organization, collaborative efforts, function, and impacts of the NCTSI as a whole rather than designed to assess the effectiveness of specific programs or interventions. NCTSI evaluation data will be used to:
determine the extent to which the NCTSI program has been able to achieve its goal of improving mental health services and access to care for trauma-exposed children and adolescents and their families, while improving the evidence base on trauma-informed care;
assist the NCTSI centers in better meeting their goals;
focus technical assistance and support; and
ensure accountability to stakeholders, including Federal agencies and the children and families served by the NCTSI, by informing them of progress made by the NCTSI nationwide.
In comparison to the previous evaluation design, the redesigned evaluation includes greater emphasis on the evaluation of training activity conducted by NCTSI centers and the impact on the various child-serving systems (e.g., mental health, child welfare, juvenile justice, education and primary care) that partner with the NCTSI, in terms of whether services provided through these systems become more evidence-based and trauma-informed. Addressing SAMHSA’s strategic initiative plan emphasizing the importance of increasing access to behavioral health care, this design also includes increased focus on evaluating access to high quality, trauma-informed care for trauma-exposed children and adolescents, disparities in access to care across demographic groups, and an array of related issues including waitlists for services, consumer satisfaction with services, and a comparison of access issues within and external to the NCTSI. Also based on stakeholder feedback, the evaluation includes for the first time a focus on assessing the sustainability of grant activities after funding has ended. Finally, similar to the existing design, the revised evaluation includes an emphasis on evaluating the national impact of the NCTSI.
The evaluation design has been improved and strengthened in many respects. For example:
The evaluation design is focused on updated evaluation priorities as identified by stakeholders including a 21-member evaluation steering committee and SAMHSA; this group provided guidance to update each aspect of the evaluation design including evaluation indicators, methodology, analyses, and reporting processes.
The evaluation has been streamlined and response burden for participants reduced through a review and comparison of multiple existing efforts to monitor and evaluate the NCTSI program. In addition, the evaluation allows for rapid, low-burden, electronic data entry by professionals with limited data collection time or resources.
Moreover, enhanced mechanisms for reporting useful data profiles, summaries, and/or reports have been developed to support quality improvement activities for NCTSI centers.
This evaluation serves multiple practical purposes: 1) to collect and analyze descriptive, outcome, and service experience information about the children and families served by the NCTSI; 2) to assess the NCTSI’s impact on access to high-quality, trauma-informed care; 3) to evaluate NCTSI centers’ training and consultation activity designed to promote evidence-based, trauma-informed services and the impact of such activity on child-serving systems; and 4) to assess the sustainability of the grant-funded activities to improve access to and quality of care for trauma-exposed children and their families beyond the grant period. The various components of the revised NCTSI Evaluation and associated instrumentation are described in Attachment A.
Summary of Specific Revisions
Below is a summary of the aspects of the currently OMB-approved cross-site evaluation (renamed “NCTSI evaluation”) that are proposed to continue, the proposed revisions and the rationale behind each of the changes. Tables 3(a) and 3(b) also provide highlights of these proposed changes.
Evaluation Continuation
The request proposes to continue data collection using:
The Core Data Set (Attachment C), with some revisions, additional instruments and expanded target population, further described in the “Evaluation Expansion” section below.
The Training Summary Form (Attachment G) with some revisions as part of the new Training, Evidence-based Practices and Family Partnerships component (for a full description of the components of the revised NCTSI Evaluation, see Attachment A). Based on stakeholder feedback, the following revisions have been made to the form:
Items asking about duration of training have been simplified and clarified;
The organizational roles for child welfare agency training participants now include birth parent(s) and youth;
An item was added allowing trainers to identify which specific assessments or interventions were the focus of the training;
A number of items asking about the degree to which various topics were included in the training were simplified and clarified, and several redundant items were deleted; and,
The item asking whether the training would be evaluated was clarified.
The National Impact Survey (Attachment F) with some revisions as part of the new Access to High Quality, Trauma-informed Services evaluation component (for a full description of the components of the revised NCTSI Evaluation, see Attachment A) and a new title: “NCTSI National Reach Survey.” Based on stakeholder input, the following revisions have been made to the instrument:
The respondent/agency information has been moved to the end of the survey;
The survey begins with a brief description of the NCTSI before assessing respondents’ familiarity with the NCTSI;
Items on collecting data elements (#12) and advocacy (#14) were dropped from the survey; and,
An item on familiarity with NCTSI products and the products implemented was added.
These forms and surveys, which are included in the appendix, have been annotated with yellow highlighting to show changes and deletions.
Evaluation Reduction
As a result of efforts to address updated evaluation priorities, reduce redundancy and consolidate multiple data collection efforts focused on national monitoring and evaluation of the NCTSI program, the request proposes to discontinue ten surveys, forms or interviews that are currently OMB-approved (see Table 3(a)).
Evaluation Expansion
The original OMB clearance for the evaluation was requested and approved for the first 3 years of the evaluation. Similarly, for this clearance, respondent burden is calculated for the 3 years following clearance of the revised NCTSI evaluation.
The number of centers for which burden was calculated is 62, which represents the number of currently active grantees (the number of centers at the time of the previous submission was 44).
This request proposes to expand the currently OMB-approved methodology for the outcome study that samples 100 children per center per grant cycle and to limit the collection to the period of time while the client is receiving treatment. Specifically, the request is to expand the outcome measures of the Core Data Set (Attachment C) to all clients receiving direct mental health services in NCTSI centers. Administering the follow-up assessment (which occurs at three-month intervals, as before) to all children and youth receiving services will allow for a more comprehensive understanding of how treatment is beneficial through analysis of longitudinal data across subgroups of children, trauma experiences, and treatments received.
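As a rough illustration of the revised follow-up schedule, the sketch below is hypothetical code and not part of any NCTSI data system; the function name and the 91-day approximation of a three-month interval are assumptions introduced only for illustration.

```python
# Hypothetical sketch of the revised follow-up schedule described above:
# follow-up assessments fall due at roughly three-month intervals after the
# baseline assessment and stop once the client is no longer receiving treatment.
from datetime import date, timedelta

def follow_up_due_dates(baseline: date, end_of_treatment: date, interval_days: int = 91):
    """Yield follow-up due dates from baseline until treatment ends (assumed 91-day intervals)."""
    due = baseline + timedelta(days=interval_days)
    while due <= end_of_treatment:
        yield due
        due += timedelta(days=interval_days)

# Example: a client in treatment from mid-January through the end of October would be
# due follow-up assessments at roughly 3, 6, and 9 months after baseline.
print(list(follow_up_due_dates(date(2011, 1, 15), date(2011, 10, 30))))
```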
This request also proposes to enhance the existing Core Data Set by revising the Core Clinical Characteristics Forms and adding new instruments. The Core Data Set includes the following currently OMB-approved set of instruments:
Core Clinical Characteristics Forms (Baseline and Follow-up) (Attachments C.1-C.2)
Trauma Information/Trauma Detail Form (Attachment C.3)
Child Behavior Checklist 1.5-5/6-18 (CBCL 1.5-5/6-18), now including the “competency” section on pages 1 and 2 of the CBCL 6-18 (Attachments C.4-C.5)
UCLA-PTSD Short Form (UCLA-PTSD) (Attachment C.6)
Trauma Symptom Checklist for Children-Abbreviated (TSCC-A) (Attachment C.7)
The Core Clinical Characteristics forms (Attachments C.1-C.2) will be revised to include information on the following topics:
Military families
Refugees
Family functioning using the Family APGAR (Smilkstein, 1978)
Additional indicators of symptom severity and services relevant for children under age six
The request proposes to add the following new instruments to the Core Data Set to address existing gaps in knowledge:
Trauma Symptom Checklist for Young Children (TSCYC) (Attachment C.8)
Parenting Stress Index Short Form (PSI-SF) (Attachment C.9)
Children’s Depression Inventory-2 Short (CDI-2S) (Attachment C.10)
Global Appraisal of Individual Needs Modified Short Screener (GAIN-MSS) (Attachment C.11)
The CDS has been designed such that grantees can opt in or out of administration of certain instruments. All participating centers administer the Core Clinical Characteristics Forms and the Trauma Information and Trauma Detail Forms. The vast majority of centers have elected to administer the existing clinical instruments (CBCL, UCLA PTSD Short Form, and the TSCC-A). However, some centers have chosen to opt out of one or more of these clinical instruments, and we will continue to allow that option. For many of the new components of the Core Clinical Characteristics forms, the new modules will only be triggered for cases that meet certain screening criteria. For example, all cases will have data on whether or not they have a family member in the military. Only those with affirmative answers will see the additional military family module. All of the additional clinical instruments (TSCYC, PSI-SF, CDI-2S, and the GAIN-MSS) are either optional or only relevant for smaller subpopulations. For example, the TSCYC is only relevant for youth aged 3 through 7 (older youth complete the TSCC-A). Many centers serving these young children are already administering the TSCYC locally, so we anticipate that most will choose to use this new instrument. The PSI-SF and the CDI-2S are relevant to larger populations, but are optional (we estimate that 50% of grantees will opt to use the instruments). The GAIN-MSS is only relevant for youth aged 12 and older. The first several questions ask whether there has been any drug or alcohol use. If not, the remainder of the instrument will not be required.
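To make these screening and opt-in rules concrete, the following is a minimal, purely illustrative sketch; it is not part of the evaluation's actual data system, and the function name, argument names, and center opt-in flags are hypothetical, with the age bands and screening rules taken from the description above.

```python
# Hypothetical sketch of the Core Data Set instrument-selection rules described above.
# Field names and opt-in flags are illustrative only; existing clinical measures
# (CBCL, UCLA PTSD Short Form) also remain opt-in but are omitted here for brevity.

def applicable_cds_instruments(child_age, military_family, endorses_substance_use,
                               center_uses_psi=True, center_uses_cdi=True):
    """Return the CDS instruments and triggered modules that would apply to one case."""
    instruments = [
        "Core Clinical Characteristics (Baseline and Follow-up)",  # all participating centers
        "Trauma Information / Trauma Detail Forms",                # all participating centers
    ]

    # Age-banded trauma symptom measures: TSCYC for ages 3-7, TSCC-A for older youth.
    if 3 <= child_age <= 7:
        instruments.append("TSCYC")
    elif child_age > 7:
        instruments.append("TSCC-A")

    # Optional instruments, administered only where a center has opted in.
    if center_uses_psi:
        instruments.append("PSI-SF")
    if center_uses_cdi:
        instruments.append("CDI-2S")

    # GAIN-MSS applies only to youth aged 12 and older; the remaining items are
    # required only if the initial drug/alcohol screening questions are endorsed.
    if child_age >= 12:
        instruments.append("GAIN-MSS screening items")
        if endorses_substance_use:
            instruments.append("GAIN-MSS remaining items")

    # Triggered module: the military family module appears only for cases
    # reporting a family member in the military.
    modules = ["Military family module"] if military_family else []

    return instruments, modules


# Example: a 14-year-old from a military family with no reported substance use.
print(applicable_cds_instruments(14, military_family=True, endorses_substance_use=False))
```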
In place of the ten surveys, forms or interviews that are currently OMB-approved that are being discontinued (see Table 3(a)), and as part of the redesigned evaluation, three new data collection efforts are proposed (see Table 3(b)), including:
Online Performance Monitoring Report Form (OPMR) (Attachment E)
Evidence-based Practice and Trauma-informed System Change Survey (ETSC) in versions for administrators (Attachment D.1) and providers (Attachment D.2)
Sustainability Survey in versions for funded centers (Attachment I.1) and affiliate centers (Attachment I.2)
The OPMR is primarily a mechanism for SAMHSA to monitor centers’ progress towards achieving stated goals and a fulfillment of SAMHSA requirements for accountability and performance monitoring. In addition, as a result of collaborative efforts to reduce data collection requirements for NCTSI grantees, this form will also serve as an important data source informing the NCTSI evaluation. The form incorporates evaluation domains that were previously a part of either the currently OMB-approved evaluation or other monitoring and evaluation efforts conducted by SAMHSA or the NCCTS, each of which impacted grantees. For example, the OPMR incorporates elements of five currently OMB-approved cross-site evaluation instruments (PDDS, Network Survey, CTPT, GAAS, and AIFI). Highlights of such elements that have remained in the OPMR include assessment of:
Types of products developed, target population, provider type, and stage of development (from the PDDS)
Collaborative activities between and among NCTSI centers (from the Network Survey)
Centers’ engagement with formal workgroups/committees across the network (from the CTPT survey)
Facilitators and barriers to evidence-based practice implementation (from the GAAS survey and AIFI interview)
While the Sustainability Survey is entirely new and has been added in response to stakeholder requests, the ETSC also incorporates prioritized elements of two currently OMB-approved data collection efforts (GAAS and AIFI). In addition, the Sustainability Survey for funded centers is included in the OPMR, while the Sustainability Survey for affiliate centers is administered independently. By coordinating data collection for multiple purposes and focusing the evaluation on updated priorities, the evaluation reduces the time required of grantees to report on program activities.
A Training Sign-in Sheet (TSIS) (Attachment H) has also been developed for use at each training event sponsored by NCTSI centers. The purpose of the form is to collect contact information from training participants and to provide background information about the NCTSI Evaluation. Specifically, this very brief form provides information about the ETSC Survey, asks participants if they would be willing to be contacted to participate at a later date, and collects participant contact information.
Table 3(a). Currently OMB-Approved Data Collection, with OMB Action Requested
Currently OMB-Approved Cross-site Evaluation Components | Instruments | OMB Action Requested | Relationship to Revised NCTSI Evaluation Design
Descriptive and Clinical Outcomes | Core Data Set (CDS) | Continue, using revised approach | Currently approved to collect longitudinal outcomes for 100 clients per center per grant cycle; requesting to expand follow-up data collection to all clients receiving direct mental health services and to limit the collection to the period of time while the client is receiving treatment; requesting approval for revisions to the Core Clinical Characteristics forms and for additional instruments (listed above and in Table 3(b))
Consumer Satisfaction with Direct Mental Health Services | Youth Services Survey for Families (YSS-F) | Discontinue | Satisfaction data from the TRAC system instead will be analyzed for evaluation purposes
Knowledge and Use of Trauma-Informed Services | Provider Trauma-informed Services (TIS) Survey | Discontinue | TIS as a construct will be assessed differently through newly proposed data collection (ETSC Survey); the revised TSF will be used as part of the new Training, EBP, and Family Partnerships component (reviewed in Attachment A)
Knowledge and Use of Trauma-Informed Services | Training Summary Form (TSF) | Continue, using revised approach | (see above)
Product/Innovation Development and Dissemination | Product/Innovation Development and Dissemination Survey (PDDS); Case studies; Workgroup coordinator interviews | Discontinue | Elements of the PDDS are included in the newly proposed data collection (OPMR)
Adoption of Methods and Practices | General Adoption Assessment Survey (GAAS); Adoption/Implementation Factors Interview (AIFI) | Discontinue | Elements of the GAAS and AIFI are included in the newly proposed data collection (ETSC Survey and OPMR)
Network Collaboration | Network Survey; Child Trauma Partnership Tool (CTPT) | Discontinue | Elements of the Network Survey and CTPT are included in the newly proposed data collection (OPMR)
National Impact | National Impact Survey | Continue, using revised approach and new survey title (NCTSI National Reach Survey) | A revised version of the National Impact Survey will be used as part of the new Access to High Quality, Trauma-informed Services component (reviewed in Attachment A)
National Registry of Evidence-based Programs and Practices (NREPP) | Data collected through NREPP | Discontinue | None
Table 3(b). OMB Action Requested Related to Evaluation Expansion
Revised NCTSI Evaluation Components | Instruments | Relationship to Revised NCTSI Evaluation Design
Descriptive and Clinical Outcomes | Added to the Core Data Set: Trauma Symptom Checklist for Young Children (TSCYC) (Attachment C.8); Parenting Stress Index Short Form (PSI-SF) (Attachment C.9); Children’s Depression Inventory-2 Short (CDI-2S) (Attachment C.10); Global Appraisal of Individual Needs Modified Short Screener (GAIN-MSS) (Attachment C.11) | The CDS has been designed such that grantees can opt in or out of administration of certain instruments. All of the additional clinical instruments (TSCYC, PSI-SF, CDI-2S, and the GAIN-MSS) are either optional or only relevant for smaller subpopulations, as described elsewhere in the statement.
Access to High Quality, Trauma-informed Services; Training, Evidence-based Practices (EBPs), and Family/Consumer Partnerships; Sustainability | Evidence-based Practice and Trauma-informed System Change Survey (ETSC) in versions for administrators (Attachment D.1) and providers (Attachment D.2); Online Performance Monitoring Report Form (OPMR) for funded NCTSI centers (Attachment E); Sustainability Survey for affiliate centers (Attachment I.2) | To a greater extent than the previous evaluation, individual data collection instruments will be used to address multiple components of the NCTSI Evaluation; in addition, the evaluation will draw from data sources used for other purposes. For example, the OPMR—primarily a mechanism for SAMHSA to monitor centers’ progress towards achieving stated goals—will also serve as an important data source informing three components of the revised NCTSI evaluation.
Uses of information collected through the NCTSI Evaluation
NCTSI Evaluation data and reports have been, and will continue to be, used by multiple stakeholders, including SAMHSA, CMHS Directors, and Grant Project Officers (GPOs), grantees, the practice community, and the research community.
SAMHSA
SAMHSA has been, and will continue to be, able to use the results from the evaluation to monitor centers’ progress towards achieving stated goals, fulfill SAMHSA requirements for accountability and performance monitoring including reporting for GPRA (see additional description of accountability issues below), and develop policies and provide guidance regarding the development of the NCTSI. In the future, this data collection may also allow SAMHSA to plan and implement other efforts designed to address the prevalence and impact of trauma.
In addition, in 2010, to guide its work through at least 2012, SAMHSA identified eight strategic initiatives with input from stakeholders including Federal, state and local leaders; constituency groups; advisory council members; members of Congress; people in recovery; and family members. These initiatives are designed to focus SAMHSA’s work on improving lives and capitalizing on emerging opportunities. In particular, the NCTSI evaluation responds to the following three strategic initiatives:
Trauma and Justice Initiative: SAMHSA is one of the leading agencies addressing the impact of trauma on individuals, families and communities across the country. Thus, one of the eight Strategic Initiatives—“Trauma and Justice”—is designed:
“to focus programmatic efforts on the goal of reducing the pervasive, harmful, and costly health impact of violence and trauma by integrating trauma-informed approaches throughout health and behavioral health care systems and by diverting people with substance use and mental disorders from criminal and juvenile justice systems into trauma-informed treatment and recovery.”
The “Trauma and Justice” strategic initiative includes five goals with embedded objectives and action steps. Of those, the NCTSI program and data collection associated with the redesigned NCTSI Evaluation contribute most specifically to the following:
Building a trauma-informed behavioral health system
Reducing the impact of trauma
Supporting programs to address trauma experienced in childhood
Improving the availability of trauma-informed care
Data, Outcomes and Quality Initiative: SAMHSA has highlighted the importance of supporting programming decisions with high quality data and of transparency in these decisions by making data readily available to the public. The objective of the initiative is:
“to realize an integrated data strategy that informs policy and measures program impact leading to improved quality of services and outcomes for individuals, families and communities.”
The initiative includes four goals with embedded objectives and action steps. Of those, the NCTSI Evaluation is guided by the following:
Improving the quality of SAMHSA’s program evaluations and services research
Improving the quality and accessibility of surveillance, outcome/performance, and evaluation information for staff, stakeholders, funders and policymakers.
Military Families Initiative: SAMHSA is focused on improving access to high quality, evidence-based treatment for military families including trauma-exposed children and adolescents. The objective of the initiative is:
“to facilitate innovative community-based solutions that foster access to evidence-based prevention, treatment, and recovery support services for military service members, veterans, and their families at risk for or experiencing mental and substance use disorders through the provision of state of the art technical assistance, consultation, and training.”
The initiative includes four goals with associated objectives and action steps. Of those, the NCTSI Evaluation is guided by the following:
Improving the quality of behavioral health prevention, treatment and recovery support services by helping providers respond to the needs and culture of military families
Promoting the behavioral health of military families with programs and evidence-based practices that support their resilience and emotional health
In sum, in its design and through its established priorities and data collection approach, the NCTSI Evaluation will provide data that will allow SAMHSA to assess and illustrate the ways in which, as well as the extent to which, the NCTSI program has achieved goals in areas of urgency and opportunity as outlined in SAMHSA’s Strategic Initiatives.
CMHS Leadership
CMHS leadership has been, and will continue to be, able to use NCTSI evaluation data reported by grantees to determine whether funded activities are progressing as expected and to keep abreast of any issues that grantees are having related to carrying out their proposed activities. In the future, due to the enhanced evaluation design, particularly consolidation of items related to collaboration in the OPMR, GPOs may also use the information to connect grantees who are conducting similar activities or serving comparable populations to facilitate collaboration across the NCTSI.
In addition, the design for the NCTSI evaluation provides for data collection, summarization, analysis, and reporting that can be used to address SAMHSA/CMHS priorities including:
Accountability: The evaluation was designed in part to support SAMHSA/CMHS performance measurement and management efforts. Findings from the evaluation have been, and will continue to be, used to provide objective measures of NCTSI program progress toward meeting targets of key performance indicators put forward in its annual performance plans as required by law under the Government Performance and Results Act of 1993 (GPRA). Accountability to stakeholders is achieved through standardized SAMHSA and Federal Government reporting requirements outlined in GPRA, which are addressed in part through the CMHS TRAC system and in part through the NCTSI Evaluation. GPRA indicators for the NCTSI program include:
Traumatized children/adolescents who receive services through the NCTSI program show improvement in their outcomes as a result of these services;
The NCTSI program succeeds in increasing access to trauma treatment and services as indicated by increases in the number of children who receive such services over time; and,
The NCTSI is successful in disseminating trauma treatment and services as indicated by the number of service providers who receive trauma-focused training and increases in the number trained over time.
Collecting, analyzing, and reporting data efficiently to satisfy requirements for these types of accountability considerations is a major requirement for future NCTSI evaluation. To that end, the revised evaluation:
Incorporates data collection for evaluation and performance monitoring purposes to address the GPRA indicators outlined above;
Reduces redundancy where appropriate between TRAC and NCTSI Evaluation data elements (e.g., the NCTSI Evaluation will use satisfaction data collected through TRAC rather than require grantees to participate in a separate data collection effort on consumer satisfaction); and,
Facilitates grantee capacity to collect and report data for TRAC through use of the NCTSI Evaluation infrastructure which includes an electronic data center and online data collection and reporting system (described further in section A.3) as well as intensive training and technical assistance for grantees on data collection and reporting processes for both TRAC and the NCTSI Evaluation.
Quality improvement: Mechanisms for reporting useful data profiles, summaries, and/or reports have been developed to support quality improvement activities for Network clinical interventions, other products, and training/dissemination efforts and to serve as an incentive for data collection by data providers.
Program justification purposes: Program justification requires indicators not only of the effectiveness of activities and products in the abstract or in the published literature, but also of wide distribution and actual uptake of the activities and products, and evidence that they are effective, cost-effective and sustainable in communities throughout the country. With its increased emphasis on assessing the sustainability of grant activities after funding has ended and assessing the national impact of the NCTSI and the impact of the NCTSI on child-serving systems, this evaluation provides the data needed to assess program justification.
Grantees
Findings from the evaluation have been, and will continue to be, used by grantees to improve the services, processes, and functions of their centers. Demographic and outcome data on children and families who participate in the Network aid grantees in identifying the program elements that help children and families function better and that lead to client satisfaction. Grantees can use the information gathered to better identify their target populations and improve their services. The information also assists grantees in better understanding disparities in access to services for different subgroups of children and youth served by the NCTSI so that these disparities might be addressed. Grantees can also use data on lessons learned and strategies to accomplish evidence-based practice implementation and enhancement of trauma-informed services. Finally, they can use data on the factors facilitating and hindering efforts to promote the sustainability of NCTSI program activities after Federal funding has ended.
Research community
The research community, particularly the field of children’s mental health services research, will continue to profit in a number of ways from the information gathered. First, evaluation of the NCTSI adds significantly to the developing research base about the use of trauma-informed services. Second, the focus on child and family outcomes allows researchers to examine and understand who is being treated for trauma-related problems and the outcomes of that treatment. Third, assessment of the process by which evidence-based trauma services and processes are developed, disseminated, and adopted contributes to understanding the barriers and facilitators that affect this process. Finally, the analysis of evaluation data aids researchers in formulating new questions about the NCTSI and helps both service providers and researchers improve the delivery of children’s trauma services.
Summary
The NCTSI evaluation data and related reports produced will be useful to SAMHSA, CMHS GPOs and leadership, grantees, and the research community. At the local level, centers will be able to track activities funded by their NCTSI grants and provide summary reports to their local steering committees or other advisory boards. Both SAMHSA GPOs and the NCTSI evaluation team will have access to reports that list all centers, with key information reported quarterly to allow for comparisons across centers in the Network. The NCTSI evaluation online reporting system will provide access to aggregate summary reports on Network-wide training initiatives, number of trainings and number of professionals trained, number of clients served, and trauma-informed practices and interventions reported by centers.
At all levels of government—Federal, State, and local—and in the private sector, decisions are being made that are dramatically changing the lives of children and families. To make these decisions in a responsible way, policymakers, centers, and other stakeholders need information such as the data and findings to be produced by the NCTSI Evaluation.
3. Use of Information Technology
State-of-the-art electronic data collection, including the use of Web-based surveys and forms, is a major feature of the redesigned evaluation, as reflected in each of the four new evaluation components. To ease data collection and reporting, the NCTSI Evaluation will host and maintain a sophisticated data repository and online reporting system designed specifically for this initiative. The system provides a single data center supporting data collection, management, dissemination, and reporting functionality for clinical, monitoring, and evaluation data. It includes features such as response monitoring tables and other administrative functions to help grantees monitor the progress of their data collection. Specifically, this system provides for:
Online data collection, minimizing the need for specific software and enabling access to data and reporting from any computer with an Internet connection;
On-demand data downloads, giving centers access to their local data at any time, without making a special request;
Instant clinical reporting, providing centers with access to individual-level clinical reports immediately after submission of clinical data;
Center-level reporting, allowing centers to access summary-level data on their centers in real time, for various program monitoring and reporting purposes;
Program-level reporting, enabling SAMHSA or NCTSI evaluation team members access to instant feedback on the status of data collection and outcomes across the NCTSI;
Secure data transfer and storage, protecting the privacy of children being served by NCTSI centers.
NCTSI evaluation surveys and forms that are Web-based for this evaluation include:
Core Data Set (Web-based surveys and forms)
Online Performance Monitoring Report (OPMR) (Web-based Form)
Evidence-based Practice and Trauma-informed Systems Change Survey (ETSC) (Web-based Survey)
Training Summary Form (TSF) (Web-based Form)
NCTSI National Reach Survey (Web-based Survey)
Sustainability Survey (Web-based Survey)
Approximately 90% of responses are expected to be submitted electronically. The electronic format facilitates data collection in a variety of ways. For example, centers will enter key program achievement data into custom Web-based forms developed for the OPMR and have access to real-time summaries based on the information entered. The Web forms will utilize skip patterns so that only relevant questions will appear based on responses entered by individual centers (e.g., only Category III centers will answer questions related to direct clinical services). In addition, validations will be coded directly into the Web forms themselves to improve the reliability of responses.
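As a minimal sketch of the kind of skip-pattern and validation logic described above, the following illustration is offered; the question fields, section names, and center categories used here are hypothetical and do not represent the actual OPMR form schema.

```python
# Illustrative only: simple skip-pattern and validation checks of the kind
# described for the OPMR Web forms. Field names and categories are assumptions.

def visible_questions(center_category, all_questions):
    """Return only the questions relevant to a center; for example, direct
    clinical service items appear only for Category III centers."""
    return [
        q for q in all_questions
        if q["section"] != "direct_clinical_services" or center_category == "III"
    ]

def validate_response(question, value):
    """Apply a simple range/format check before a response is accepted."""
    if question["type"] == "count":
        return isinstance(value, int) and value >= 0
    if question["type"] == "choice":
        return value in question["options"]
    return bool(str(value).strip())

# Example usage with two hypothetical questions
questions = [
    {"id": "q1", "section": "general", "type": "count"},
    {"id": "q2", "section": "direct_clinical_services", "type": "choice",
     "options": ["Yes", "No"]},
]
print([q["id"] for q in visible_questions("II", questions)])  # ['q1']
print(validate_response(questions[0], 42))                    # True
```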
The reporting system will be flexible, providing a method for centers to enter information on an ongoing basis. The OPMR will prepopulate previously reported responses that can be edited, which will decrease burden by allowing centers to keep previously entered responses when there are no updates to report. Key accomplishments and information that changes often, such as the number of clients served, will be updated quarterly. Key indicators of accessibility, interagency planning and coordination, sustainability, and quality ratings of collaborations and workgroup participation will be asked of centers once per year.
The use of Web-based surveys and forms decreases respondent burden, as compared with alternative methods such as a paper format, by allowing for direct transmission of the survey or form. In addition, the data entry and quality control mechanisms built into the Web-based format reduce errors that might otherwise require follow-up, further reducing burden relative to a hard-copy administration. Respondents can also complete the survey at a time and location that is convenient for them.
All of the Web-based surveys associated with the evaluation recruit respondents to participate through an e-mail invitation. The e-mail process occurs in four stages: (1) an advance invitation to participate, (2) a formal invitation, which includes the Web site’s URL and unique user name and password, (3) a reminder to all respondents, and (4) a final targeted reminder to nonresponders and those who have only partially completed the survey.
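The following sketch illustrates one way the four-stage e-mail sequence above could be tracked; the stage names, status values, and function are hypothetical and are not drawn from the evaluation's actual systems.

```python
# Illustrative only: decide which recruitment message a respondent should
# receive next, following the four-stage sequence described above.

from enum import Enum

class Stage(Enum):
    ADVANCE = 1      # advance invitation to participate
    INVITATION = 2   # formal invitation with URL, user name, and password
    REMINDER = 3     # reminder to all respondents
    FINAL = 4        # targeted reminder to nonresponders and partial completers

def next_stage(last_stage, survey_status):
    """Return the next message stage, or None when the sequence is exhausted
    or the respondent has already completed the survey."""
    if survey_status == "completed":
        return None
    if last_stage is None:
        return Stage.ADVANCE
    if last_stage == Stage.ADVANCE:
        return Stage.INVITATION
    if last_stage == Stage.INVITATION:
        return Stage.REMINDER
    if last_stage == Stage.REMINDER and survey_status in ("not_started", "partial"):
        return Stage.FINAL
    return None

print(next_stage(Stage.REMINDER, "partial"))  # Stage.FINAL
```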
To help monitor program improvements, the NCTSI Evaluator will also be able to provide Continuous Quality Improvement (CQI) reports in real time through the electronic data center. The CQI reports can combine, for example, clinical data reported by grantees in the Core Data Set, program-level data reported on the OPMR, and training data reported on TSF to provide a comprehensive view of grantee performance. Such reports will show key indicators of system- and clinical-level performance. The reports can include quarterly and cumulative scores, allowing for ongoing monitoring of program activities and assessment of overall efforts.
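As a rough sketch of the kind of combination such a CQI report involves, the example below joins quarterly center-level extracts from the three sources named above; the data frames, column names, and values are hypothetical, not the evaluator's actual schema.

```python
# Illustrative only: merge Core Data Set, OPMR, and TSF extracts on center and
# quarter to produce a simple CQI-style summary. All columns are assumptions.

import pandas as pd

cds = pd.DataFrame({"center_id": [1, 2], "quarter": ["2011Q3", "2011Q3"],
                    "clients_assessed": [40, 25], "mean_symptom_change": [-6.2, -4.8]})
opmr = pd.DataFrame({"center_id": [1, 2], "quarter": ["2011Q3", "2011Q3"],
                     "clients_served": [55, 30]})
tsf = pd.DataFrame({"center_id": [1, 2], "quarter": ["2011Q3", "2011Q3"],
                    "providers_trained": [120, 45]})

cqi = cds.merge(opmr, on=["center_id", "quarter"]).merge(tsf, on=["center_id", "quarter"])
cqi["assessment_coverage"] = cqi["clients_assessed"] / cqi["clients_served"]

print(cqi[["center_id", "quarter", "clients_served", "providers_trained",
           "mean_symptom_change", "assessment_coverage"]])
```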
Finally, SAMHSA and its contractors strive to ensure that all Web-based solutions are fully compliant with Section 508 of the Rehabilitation Act. This includes ensuring that all posted documents are compliant or have a compliant alternative. The NCTSI Evaluator utilizes Adobe products that are capable of producing compliant PDF files per the SAMHSA recommended process. The NCTSI Evaluator has a thorough knowledge of Section 508 standards and employs accessibility specialists with experience in Section 508 compliance verification, including assessment with a variety of assistive technologies, including screen readers, screen magnifiers, and voice recognition software.
4. Efforts to Identify Duplication
This evaluation generates data that have not previously been collected, or have been collected only minimally in the field of child traumatic stress and/or only by the NCTSI cross-site evaluation in the past. This includes information on access to quality, trauma-informed care for trauma-exposed children and adolescents and disparities in access to care by demographic groups, including a comparison of access to care within and outside of the NCTSI; the process of developing, disseminating, and implementing evidence-based practices (EBPs) and trauma-informed services for trauma-exposed children and adolescents and their families; the impact of NCTSI training activities on NCTSI centers and child-serving systems outside the NCTSI; the national impact of the NCTSI; and an assessment of the sustainability of grant activities after funding has ended. In addition, the Core Data Set, which includes data on who receives trauma services, the types of services they receive, and the outcomes related to receipt of these services, is collected in a systematic manner that yields more extensive, detailed, and consistent information than has previously been obtained.
Existing research and data in the area of child trauma are not sufficient to address the questions posed in this evaluation. For questions related specifically to the functioning and impact of the NCTSI, the NCTSI evaluation has served, and will continue to serve, as a primary mechanism through which the NCTSI will be understood, improved, and sustained. While data have been collected on EBPs in general, very little data exist on the development and use of EBPs in treating child trauma, or specifically on the role of the NCTSI in this area. Thus, this evaluation generates new data and will not reproduce existing data.
5. Involvement of Small Entities
Most data for this evaluation are collected from service providers, administrators, and researchers affiliated with NCTSI centers, which are public or private agencies that receive funding from the Federal Government and for whom participation in the evaluation is considered to fall within their job responsibilities. Some data are collected from mental health and non–mental health service providers working outside of the NCTSI centers. While most of these data are collected from public agencies, some organizations and individuals providing services to the target population, such as community-based organizations, not-for-profit agencies, or private providers, may qualify as small entities; however, the data collection will not have a significant impact on them. The information requested is the minimum required to meet the study objectives.
6. Consequences if Information Is Collected Less Frequently
Below is a summary of the consequences if the NCTSI Evaluation information is collected less frequently, organized by evaluation activities that are proposed to continue and expanded evaluation activities.
For the CDS, data are collected at baseline, every three months, and at the end of treatment. Three-month intervals were selected in order to capture changes after initial entry into treatment and to monitor those changes closely throughout the course of treatment. It is important to assess trauma symptoms at shorter intervals because of the short-term nature of treatments used with youth affected by traumatic events. Although many children will experience significant improvement in the first 3 to 6 months of trauma-focused treatment, it is important to continue collecting outcomes data throughout the course of treatment to understand the maintenance of changes across time. Longer and less frequent data collection intervals would miss important changes that are likely to happen with children during their treatment episode or shortly thereafter. Multiple data collection points are necessary to effectively monitor these changes in clinical outcomes.
For the TSF, trainers complete this form for each training event. To understand the number of and types of trainings being conducted by NCTSI centers as well as the roles of participants reached by NCTSI training activities, it is necessary to have a record of each training event. Due to the wide range and variation in the number and types of trainings, any sampling approach would miss important details and yield an inaccurate picture of NCTSI activities. Complete information is needed, particularly because the TSF provides data to assist SAMHSA in reporting on the GPRA indicator designed to assess the number of service providers who receive trauma-focused training and increases in the number trained over time.
For the NCTSI National Reach Survey, it is proposed that the survey be administered to administrators and professionals who are members of state and national child serving organizations across various service sectors (mental health, primary care, child welfare, justice, and education) in alternating years of the NCTSI evaluation. Less frequent data collection would limit our ability to assess the impact, over time, of the NCTSI on trauma-informed care beyond the NCTSI.
The proposed frequency of data collection for the OPMR is quarterly, supplemented with a set of items collected once annually. The items requested quarterly, such as the number of clients receiving direct clinical services, program accomplishments, and public awareness activities, cover program monitoring domains that are subject to frequent change. Capturing the information less frequently would yield results that are less reliable and potentially less complete than if the information is captured on a quarterly basis.
Some of the domains are subject to fewer changes and will likely remain static across the grant period. For example, service capacity and accessibility of services reflect established ways of conducting business and are not likely to change from one reporting period to the next. For this reason, questions falling under these domains will be requested annually.
Many project activities are ongoing and are most appropriately reported as they occur rather than on a specific required reporting interval. For example, product development, collaboration with local partners, and collaboration with other NCTSI centers occur on an ongoing basis throughout the grant period. For this reason, it is more appropriate to record these activities as they occur. Otherwise, there may be less accuracy in the data and a greater likelihood that important activities will be left out. Multiple data collection points are needed to maintain a current inventory of Network products and to examine how Network strategies and approaches develop over time.
The proposed frequency of data collection for the ETSC is years 1 and 3 of NCTSI centers’ funding. The initial data collection point in year 1 will provide baseline information regarding the extent to which these child-serving systems are trauma-informed and about implementation of EBPs. The followup with the same child-serving systems in year 3 will assess the long-term impact of NCTSI center activities on transforming these systems to become more trauma-informed. This information is critical to understanding the extent to which NCTSI activities impact child-serving systems on an organizational and individual practitioner level. If these data are collected less frequently, the NCTSI evaluation would not be able to assess the long-term impact of related NCTSI activities.
Also related to NCTSI training activities, the TSIS is completed by training participants at each training event and is used to identify and invite individuals to participate in the ETSC Survey, which is designed to assess training impact. Many training event participants will attend only one NCTSI-sponsored training event during the evaluation cycle. Thus, if participants are not routinely identified at each training event, many would be excluded from participating in the ETSC Survey. Training events often target different types of participants (clinicians, educators, first responders, etc.); considering that centers often plan training events spontaneously, a sampling approach risks missing a large number of one particular respondent type.
Data collection for the two Sustainability surveys (for funded centers and for affiliate centers) will occur annually. Respondents from funded centers have the option of completing a sustainability survey as they complete the requirements for the OPMR, while respondents from affiliate centers are simply offered the opportunity to participate in a survey. In the case of funded centers, it is necessary to collect the data annually to assess funded centers’ progress towards sustainability planning throughout their grant. Similarly, in the case of affiliate centers, it would be difficult to improve understanding of facilitators, barriers, and other factors related to sustainability—post-funding, over time—without at least annual data collection.
7. Consistency With Guidelines of 5 CFR 1320.5
The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).
8. Consultation Outside the Agency
Federal Register Notice
SAMHSA published a notice in the Federal Register on June 21, 2011 (Vol. 76, p. 36135) soliciting public comment on this study. SAMHSA received no comments on the planned data collection.
Consultation Outside of the Agency
Consultation on the design, instrumentation, data availability and products, and statistical aspects of the evaluation occurred throughout the development of the evaluation design and has continued throughout the evaluation. Over the years of the evaluation, consultations have been sought from the following:
The Federal government
Experts in collaboration
Experts in development, dissemination, and adoption
Experts in logic modeling
Experts in cultural competence
Family representatives
Family members (i.e., families receiving services in the NCTSI)
Network staff
Trauma experts
In addition, in the past year, SAMHSA convened a 21-member Evaluation Steering Committee (ESC) to review the current evaluation and make recommendations for redesign. The ESC, which will continue to function in an advisory capacity, represents a wide range of types of NCTSI program stakeholders who bring relevant perspectives and expertise to the redesign process (see below). For the redesign, the ESC provided recommendations regarding issues such as identifying evaluation priority areas; determining evaluation questions, measurable indicators, data sources and data collection/analysis approaches for each evaluation area; identifying related instrumentation and forms; developing solutions to common evaluation challenges to ensure successful implementation; and identifying strategies to attain objectives such as ensuring the cultural competence of the evaluation and including special populations. The ESC includes the following stakeholder groups:
Consumers/caregivers from NCTSI centers (both currently funded and alumni centers)
Representatives of NCTSI centers (both currently funded and alumni centers)
SAMHSA representative
ICF Macro representative
NCTSN Steering Committee representative
NCCTS representatives
Topical experts (e.g., clinical experts; cultural competence experts; social network experts; program evaluation experts; and experts in child welfare, juvenile justice or education issues, etc.)
These consultations serve several purposes: (1) to assess perspectives across stakeholder groups regarding evaluation priorities; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the feasibility of implementation; and (3) to verify the general relevance of the data to be collected and their specific relevance to families and members of minority groups. The redesigned evaluation reflects attention to each of these goals and is the result of significant collaboration with and contribution from stakeholders within and beyond SAMHSA.
9. Payment to Respondents
As described in this statement, the NCTSI Evaluation does not provide remuneration to respondents for the majority of evaluation components. Many of the respondents who will be providing data work in an NCTSN center and receive wages from the NCTSI grant, which is Federally funded. These respondents are not eligible to receive additional remuneration for participating in the evaluation. In response to feedback from evaluation stakeholders, remuneration is provided only for data collection activities (surveys) targeted to professionals who are working in centers, agencies, and organizations outside of the NCTSN.
Specifically, these surveys include:
NCTSI National Reach Survey
Evidence-based Practice and Trauma-informed Systems Change Survey (ETSC)
Sustainability Survey for Affiliate Centers
Respondents who work with centers, agencies, and organizations outside of the NCTSI and who complete either the NCTSI National Reach Survey or the ETSC Survey will receive a $10 gift certificate to be used online. Respondents who participate in the Sustainability Survey for Affiliate Centers will be entered into a drawing for a $50 Amazon gift card; five awards will be granted during each administration of the survey. The amount and type of remuneration for these surveys were determined based on research suggesting that modest noncontingent cash incentives can significantly increase survey response rates among mental health professionals (Hawley, Cook, & Jensen-Doss, 2009). For example, VanGeest and Johnson (2011) found that nurses are more likely to complete surveys when offered small financial incentives, whereas nonmonetary incentives were much less effective. A 2001 study (VanGeest, Wynia, Cummins & Wilson) found that physicians were more likely to respond to monetary incentives and that response rates did not significantly increase with the size of the incentive. Taken together, these studies support the utility of small noncontingent monetary incentives. The research base for the differential effects of lottery incentives versus unconditional fixed incentives is less clear (Laguilles, Williams & Saunders, 2010). By using both strategies with similar incentive types (i.e., Amazon.com gift cards), this evaluation attempts to maximize response rates and to help ascertain the most effective approach with this population.
10. Assurance of Confidentiality
For all of the NCTSI Evaluation components, all reports and publications from these data include only group-level analyses that fully protect the privacy of individual participants, and no data have been or will be stored with identifying respondent information.
All Core Data Set data collection activities are managed at the local level. Each NCTSN center that participates in Core Data Set data collection submits the protocol and instruments to its local IRB. To assist centers with this process, the NCTSI Evaluation contractor provides a caregiver consent form and youth assent and consent forms. These forms outline the purpose of the descriptive and clinical outcome study, expectations associated with respondents’ participation, risks and benefits of participating, compensation for participating, contact information of individuals working on the study, approaches to protect the information, rights regarding the decision to participate, and voluntary consent. The NCTSI Evaluation team also submits the entire Core Data Set protocol to its IRB, though it receives an exemption because the Core Data Set involves secondary collection of de-identified records.
The staff members at each NCTSN center are responsible for developing procedures to protect the privacy of all participants in the evaluation data collection, storage of data, and reporting of all information obtained through data collection activities. These procedures include limiting the number of individuals who have access to identifying information, using locked files to store hard-copy forms (if used), assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data reporting and dissemination.
Data from caregivers and youth are collected through interviews by site staff. The content of some questions is sensitive in nature, and some participants may experience psychological or social distress during an interview. The NCTSI evaluation team provides guidance to local staff through procedures manuals and training to assist communities in establishing local interviewer training to address respondent distress and other circumstances that may arise during an interview. Local evaluators develop procedures appropriate to local requirements, including guidelines for referral to requested services and for reporting abuse, neglect, and harm to self or others in accordance with local law.
Each grantee implements an active consent procedure that informs the participants of the purpose of the evaluation, describes what their participation entails, and addresses the maintenance of privacy as described above. Informed assent is obtained from participating older children and adolescents (ages 7–17). In addition, informed consent is obtained from adolescents who have reached the age of 18 at follow-up data collection. Written informed consent/assent is obtained from children and families at the point of entry into services. Given that some children targeted for study recruitment may be the victims of ongoing domestic violence and their treatment status may be unknown to the perpetrator, special considerations will be taken around the signing of physical consent forms (e.g., when the signing of a consent form leaves a “paper trail” that may put the child or other study recruit in harm’s way, verbal consent procedures can sometimes be approved and secured through local institutional review boards [IRBs]) and methods for contacting the family for follow-up data collection interviews (e.g., using alternative methods of contact or disguised interviewer identity). To further protect evaluation participants, all grantees are asked to obtain a Federal Certificate of Confidentiality, authorized by Section 301(d) of the Public Health Service Act in order to provide additional protection of the information about the participants from civil and criminal subpoena.
To further protect study participants, the NCTSI evaluator obtained a Federal Certificate of Confidentiality, authorized by Section 301(d) of the Public Health Service Act. This certificate provides additional protections of the data from civil and criminal subpoena. Additionally, the NCTSI evaluator conforms to all requirements of the Privacy Act of 1974, under the System of Records: Alcohol, Drug and Mental Health Epidemiological, and Biometric Research Data, U.S. Department of Health and Human Services (HHS), #09-30-0036; the most recent publication in the Federal Register occurred on January 19, 1999 (64 FR 2914). Client records at the sites are also covered under this Privacy Act System of Records. In addition, the NCTSI Evaluator obtained a Federalwide Assurance (FWA), which ensures compliance with U.S. Federal regulations for protection of human subjects in research including the Common Rule (Title 45 Code of Federal Regulations Part 46) and other regulations as applicable. The NCTSI evaluator also requests that all grantees obtain an FWA.
The TSF does not ask for the trainer’s name; rather, it requests identification of the sponsoring NCTSI center and the role of the person completing the form. Because the TSF is simply a documentation of the training and does not ask for opinions, beliefs, or personal experiences, there is little threat of negative consequences if the trainer’s identity is deduced.
In the case of the NCTSI National Reach Survey, active consent is obtained at each wave of survey administration (the survey is administered in alternate years of the NCTSI evaluation as a cross-sectional assessment of agencies’ policies and practices) from respondents who are most knowledgeable about their agencies’ policies/practices and relationships with other agencies in the service system (see consent form in Attachment J). Full contact information for respondents, including name, address, phone, and e-mail addresses, is assembled from the membership rosters of professional organizations representing mental health, health, child welfare, education, and juvenile justice agencies. The NCTSI evaluation establishes cooperative agreements with these professional organizations to access their roster information where possible. Respondents are recruited to participate through an e-mail invitation (Attachment K). All invitees will be informed that they can respond through the Web-based survey, a paper survey, or a telephone interview. The standard procedures of sending an e-mail announcement, a formal invitation, and two follow-up e-mails will be carried out with those respondents having e-mail addresses. Respondents who do not have an e-mail address will be sent hard-copy invitations and, if they do not respond, will be re-contacted through two rounds of telephone follow-up. The formal invitation explains the survey, including the voluntary nature of survey completion, anonymity of responses, and respondents’ risks, benefits, and rights. This invitation also provides contact information if the survey recipient has questions or desires clarification prior to participation. The second page of the survey contains an informed consent form that asks potential respondents to certify (by checking a space for “agree” or “do not agree”) that they have read the informed consent form, understand its content, and freely agree to participate in the project. Access to the NCTSI National Reach Survey is password protected, and the survey uses data encryption to further enhance security and protect privacy. For anonymity of responses, two databases are created for the survey: one stores the identifying information, including name, user ID, and password, and the other stores the survey responses. The two databases are not linked after the data are collected. While data are being collected, only the system administrator has the key that links the two databases, and this key is destroyed when the data are transmitted to the evaluator. Respondents are asked to log in using an assigned ID and password provided in the formal invitation. After the respondent logs on to the survey, the identifier database is marked to indicate only that the respondent has completed the survey.
Although the information collected through the OPMR is considered to be public information and there is no expectation of privacy, the financial information provided by grantees is considered sensitive and each grantee will be notified that this information will not be shared with anyone except their SAMHSA project officer. An online data collection and management system will facilitate routine data entry of OPMR data for center administrators. Each grantee will be provided a password and user ID to enter data at the community level. Grantee level reports will be made accessible only to the specific grantee and their SAMHSA project officer and all other reporting of the data will be in an aggregate format only.
In the case of the ETSC survey, respondents’ identities will be known, and an active informed consent process will occur to ensure that participants’ rights are protected. Respondents are recruited to participate through an e-mail invitation (Attachment L). The formal invitation explains the survey, including the voluntary nature of survey completion, anonymity of responses, and respondents’ risks, benefits, and rights. This invitation also provides contact information if the survey recipient has questions or desires clarification prior to participation. The second page of the survey contains an informed consent form that asks potential respondents to certify (by checking a space for “agree” or “do not agree”) that they have read the informed consent form, understand its content, and freely agree to participate in the project (Attachment M). Access to the survey is password protected, and the survey uses data encryption to further enhance security and protect privacy. For anonymity of responses, two databases are created for the survey: one stores the identifying information, including name, user ID, and password, and the other stores the survey responses. The two databases are not linked after the data are collected. While data are being collected, only the system administrator has the key that links the two databases, and this key is destroyed when the data are transmitted to the NCTSI evaluator. Respondents are asked to log in using an assigned ID and password provided in the formal invitation. After the respondent logs on to the survey, the identifier database is marked to indicate only that the respondent has completed the survey. If the individual does not have e-mail access, a packet will be sent by regular mail containing a cover letter, an informed consent form, a survey, and a return envelope. Contact information will be used to send incentives to respondents who complete the survey and to follow up with nonrespondents. All contact information will be kept on a secured server and will be accessible only to key study personnel.
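A minimal sketch of the two-database design described above for the NCTSI National Reach and ETSC surveys follows, assuming a simple SQLite layout; the table names, column names, and link-file format are hypothetical and are not the evaluator's actual implementation.

```python
# Illustrative only: identifying information and survey responses are kept in
# separate stores, and the only link between them is a mapping held by the
# system administrator that is destroyed once data are transmitted.

import csv
import os
import sqlite3
import uuid

ids_db = sqlite3.connect("identifiers.db")   # name, assigned user ID, completion flag
resp_db = sqlite3.connect("responses.db")    # survey answers keyed by an anonymous code

ids_db.execute("CREATE TABLE IF NOT EXISTS respondents "
               "(user_id TEXT PRIMARY KEY, name TEXT, completed INTEGER DEFAULT 0)")
resp_db.execute("CREATE TABLE IF NOT EXISTS answers "
                "(resp_code TEXT, item TEXT, value TEXT)")

LINK_FILE = "admin_link_key.csv"             # held only by the system administrator

def register_respondent(user_id, name):
    """Create the identifier record and an anonymous response code; the pairing
    is written only to the administrator's link file."""
    resp_code = uuid.uuid4().hex
    ids_db.execute("INSERT OR IGNORE INTO respondents (user_id, name) VALUES (?, ?)",
                   (user_id, name))
    with open(LINK_FILE, "a", newline="") as f:
        csv.writer(f).writerow([user_id, resp_code])
    ids_db.commit()
    return resp_code

def record_answer(user_id, resp_code, item, value):
    """Store the answer under the anonymous code and mark only that the
    respondent has completed the survey in the identifier database."""
    resp_db.execute("INSERT INTO answers VALUES (?, ?, ?)", (resp_code, item, value))
    ids_db.execute("UPDATE respondents SET completed = 1 WHERE user_id = ?", (user_id,))
    resp_db.commit()
    ids_db.commit()

def destroy_link_key():
    """Delete the administrator's link file when data are transmitted to the
    evaluator, leaving the two databases with no connection."""
    if os.path.exists(LINK_FILE):
        os.remove(LINK_FILE)

# Example usage with a hypothetical respondent
code = register_respondent("user001", "Jane Doe")
record_answer("user001", code, "item_1", "Yes")
destroy_link_key()
```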
The TSIS is simply an acknowledgment that the respondent is attending a training event. The form asks training participants to provide their name, the organization for which they work, their professional role, and their e-mail address, and to check a box if they are willing to be contacted and invited to participate in a future survey. The form does not ask participants to provide opinions, feedback, or other personal information.
To protect the rights and privacy of the respondents in the case of the Web-based Sustainability Survey—For Affiliate Centers, an active informed consent process occurs. An e-mail is sent to potential participants explaining the survey, including the voluntary nature of survey completion, privacy of responses, and the risks, benefits, and rights as respondents (Attachment N). The informed consent form (Attachment O) advises recipients that they will be asked to indicate, by checking a box on the Internet Web survey, that they agree to participate in the study before they complete the survey. Information about the study and participant rights is presented in the Web survey prior to the check box indicating consent to participate. The e-mail and the Web survey also provide contact information if the survey recipient has questions or desires clarification prior to participation. Once study activities are concluded, the database containing contact information for respondents is destroyed, in keeping with IRB requirements. IRB approval will be obtained for this study and renewed each year. Data from this study are used to assess characteristics and factors related to sustainability of infrastructure and service delivery and continuation of practices and programs during the life of the award and after the Federal funding cycle is completed.
In the case of the Sustainability Survey—For Funded Centers, data collection occurs using the OPMR. Center administrators will be able to access the OPMR electronically and can elect to participate in the Sustainability Survey through a Web link. A username and password will be provided to the respondent to access the OPMR.
11. Questions of a Sensitive Nature
Because this project concerns services to children who have experienced traumatic events and their families, it is necessary to ask questions that are potentially sensitive as part of the Core Data Set. However, only information that is central to the study is being sought. Questions address dimensions such as suicidality and other self-injurious behaviors, criminal activity, developmentally inappropriate sexual behaviors, negative feelings, and experience of specific types of traumatic events, such as physical/sexual/psychological maltreatment, natural disasters, or terrorism. The answers to these questions are used to understand who is being served by the NCTSI, to determine baseline status, and to measure changes in these areas experienced after receiving NCTSI services. The measures that contain the sensitive questions are from the Core Data Set and have been selected by, and used in, the Network prior to the evaluation or have been recommended by the Evaluation Steering Committee and/or NCTSI workgroups made up of center representatives including clinical and evaluation experts.
12. Estimates of Annualized Hour Burden
In accordance with the evaluation design, data collection for an estimated 62 NCTSI centers will span the 3 years covered by this revision. As described in Section A2.d, the number of centers for which burden is calculated is 62, which represents the number of currently active grantees (45 CTS centers and 17 TSA centers). This number is an estimate, as the number of active centers per year changes as older cohorts of grantees cycle out and new grants are awarded. For the first year of this approval, there will be 62 active centers. After the first year, in September 2011, the 15 grantees funded in 2007 will reach the end of their data collection. At that point, additional centers may be funded, or existing centers may be funded again. Because of the variability and uncertainty in the number of funded centers in each year, the estimate of 62 centers is used. In addition, based on the data collection experience during the first five years of this evaluation, we estimate that only 75% of centers are eligible for participation in the Core Data Set, based on variation in programmatic focus across centers. As a result, burden estimates for this data collection activity are based on an estimate of 47 participating centers.
Table 4 shows the burden associated with the NCTSI Evaluation for the 3 years of this revised evaluation, the period for which revisions to OMB clearance are being sought. Burden estimates presented in Table 4 are based on information supplied by various sources. For measures used as part of the previous evaluation (i.e., most of the CDS measures), average burden estimates are based on experience implementing the CDS as part of the previous evaluation. The new measures added to the CDS have also been used in the field, and information about the length of time required to complete them (e.g., developers’ reported burden) has been used to create average burden estimates. Measures that were newly developed for this evaluation were piloted by the NCTSI Evaluator to determine average burden estimates.
TABLE 4
Estimate of Respondent Burden
Note: Total burden is annualized over the 3-year clearance period.
(Bracketed numbers refer to the numbered notes following the table.)

Instrument | Number of Respondents | Average Number of Responses per Respondent per Year | Total Number of Responses | Hours per Response | Total Annual Burden Hours | Hourly Wage Rate ($) | Total Cost per Year ($)
--- | --- | --- | --- | --- | --- | --- | ---
Caregivers Served by NCTSI Centers | | | | | | |
Child Behavior Checklist 1.5-5/6-18 (CBCL 1.5-5/6-18) | 3,243 [1] | 3 [2] | 9,729 | 0.33 | 3,211 | 10.60 [3] | 34,032
Trauma Information/Detail Form | 3,243 | 3 | 9,729 | 0.22 | 2,140 | 10.60 | 22,688
Core Clinical Characteristics Form | 3,243 | 3 | 9,729 | 0.50 | 4,865 | 10.60 | 51,564
UCLA-PTSD Short Form (UCLA-PTSD) | 2,465 [4] | 3 | 7,394 | 0.17 | 1,257 | 10.60 | 13,324
Trauma Symptoms Checklist for Young Children (TSCYC) | 908 [5] | 3 | 2,724 | 0.33 | 899 | 10.60 | 9,529
Parenting Stress Index Short Form (PSI-SF) | 973 [6] | 3 | 2,919 | 0.08 | 234 | 10.60 | 2,475
Youth Served by NCTSI Centers | | | | | | |
Trauma Symptoms Checklist for Children-Abbreviated (TSCC-A) | 2,043 [7] | 3 | 6,129 | 0.33 | 2,023 | 7.25 [8] | 14,664
Children’s Depression Inventory-2 Short (CDI-2S) | 713 [9] | 3 | 2,140 | 0.08 | 171 | 7.25 | 1,241
Global Appraisal of Individual Needs Modified Short Screener (GAIN-MSS) | 1,330 [10] | 3 | 3,989 | 0.08 | 319 | 7.25 | 2,314
Funded NCTSI Center Project Directors or Other Administrators | | | | | | |
Online Performance Monitoring Report (OPMR) | 62 | 4 | 248 | 0.60 | 149 | 19.25 [11] | 2,864
Sustainability Survey for Currently Funded Centers | 62 | 1 | 62 | 0.28 | 17 | 19.25 | 334
NCTSI and Non-NCTSI Administrators | | | | | | |
Evidence-based Practice (EBP) and Trauma-informed Systems Change Survey (ETSC)—Administrator Version | 186 [12] | 1 | 186 | 0.30 | 56 | 19.25 | 1,074
NCTSI Trainers | | | | | | |
Training Summary Form (TSF) | 124 [13] | 2 | 248 | 0.20 | 50 | 19.25 | 955
Service Providers Trained by NCTSI Centers | | | | | | |
Evidence-based Practice (EBP) and Trauma-informed Systems Change Survey (ETSC)—Provider Version | 496 [14] | 1 | 496 | 0.30 | 149 | 19.25 | 2,864
Training Participants | | | | | | |
Training Sign-In Sheet (TSIS) | 4,960 [15] | 2 | 9,920 | 0.02 | 198 | 19.25 | 3,819
Mental Health and Non-Mental Health Professionals from State and National Child Serving Organizations | | | | | | |
NCTSI National Reach Survey | 4,000 | 1 | 4,000 | 0.50 | 2,000 | 19.25 | 38,500
Affiliate Center Administrators | | | | | | |
Sustainability Survey—Affiliate Centers | 45 | 1 | 45 | 0.28 | 12 | 19.25 | 226
Total annual summary | 23,937 | 40 | | | 16,261 | | 173,829
1. On average, 75 percent of centers participate in the Core Data Set (47 of 62 centers), with an average of 69 baseline visits per year.
2. On the basis of the children enrolled in the Core Data Set through December 31, 2010, the average number of follow-up assessments is 2, yielding an average of 3 assessments per child.
3. Assuming that most of the families participating in the evaluation sample fall at or below the 2010 HHS National Poverty Level (U.S. Department of Health and Human Services, 2008) of $22,050 (based on a family of four), the wage rate was estimated using the following formula: $22,050 (annual family income)/2,080 (hours worked per year)=$10.60 per hour.
4. On the basis of the children enrolled in the Core Data Set through September 30, 2010, approximately 76% of the children in the Core Data Set will be ages 7 and older.
5. On the basis of the children enrolled in the Core Data Set through September 30, 2010, approximately 28% of the children in the Core Data Set will be between the ages of 3 and 7.
6. On the basis of the children enrolled in the Core Data Set through September 30, 2010, approximately 60% of the children in the Core Data Set will be aged 12 and under. We estimate that approximately 50% of centers will use this optional instrument, leading to an estimate of 30% of children in the Core Data Set.
7. On the basis of the children enrolled in the Core Data Set through September 30, 2010, approximately 63% of the children in the Core Data Set will be between the ages of 8 and 16.
8. Based on the Federal minimum wage rate of $7.25 per hour.
9. On the basis of the children enrolled in the Core Data Set through September 30, 2010, approximately 44% of the children in the Core Data Set will be between the ages of 7 and 18 and will have depression indicated as a potential problem at baseline. We estimate that approximately 50% of centers will use this optional instrument, leading to an estimate of 22% of children in the Core Data Set.
10. On the basis of the children enrolled in the Core Data Set through September 30, 2010, approximately 41% of the children in the Core Data Set will be aged 12 and older.
11. Assuming the average annual income across all types of staff/service providers/administrators is $40,000, the wage rate was estimated using the following formula: $40,000 (annual income)/2,080 (hours worked per year) =$19.25 per hour.
12. Respondents will be administrators from 62 currently funded NCTSI centers and administrators from two child serving systems that each NCTSI center trains.
13. Respondents will be center trainers or evaluation staff. On average, 5 Training Summary Forms may be completed by 124 trainers over the three years.
14. Respondents are NCTSI center employed clinicians and center trained providers. It is estimated that on average from the 62 centers, four center-employed clinicians and four center trained providers will take the survey three times.
15. It is expected that at least two trainers per center will provide five trainings over three years and on an average there will be twenty participants per training.
As indicated in Table 4, the average total annual burden for data collection is estimated at 16,261 hours. This estimate was derived by calculating the burden for each measure, dividing those figures by 3, and summing across measures.
13. Estimates of Annualized Cost Burden to Respondents
There are no startup, capital, and maintenance costs associated with data collection for respondents. Grantees are collecting the data for the Core Data Set as part of their normal operations, and they maintain this information for their own service planning, quality improvement, and reporting purposes. In addition, each grantee has been funded, as part of the overall cooperative agreement award, to participate in the NCTSI Evaluation, with up to 20% of the grant award available for evaluation efforts and data collection. Therefore, no cost burden is imposed on the grantee by this information collection effort. Other costs related to this effort, such as the cost of data collection for studies other than the Core Data Set, data analyses, and materials, are costs to the Federal Government.
14. Estimates of Annualized Costs to the Government
SAMHSA has planned and allocated resources for the management, processing, and use of the collected information in a manner that enhances its utility to agencies and the public. Including the Federal contribution to local grantee evaluation efforts, the contract with the NCTSI Evaluator, and Government staff to oversee the evaluation, the annualized cost to the Government is estimated at $4,074,965. These costs are described below.
Each grantee is expected to participate in relevant aspects of the NCTSI evaluation; an estimated 75% of centers (i.e., those providing direct clinical mental health services) will participate in the Core Data Set. Assuming (1) that 75% of centers (47 centers) will participate in the Core Data Set, (2) that each of these 47 centers will have 1.5 full-time equivalents (FTEs) dedicated to evaluation, (3) an average annual salary of $30,000 for evaluation staff, and (4) that the average Federal contribution will be 100%, the annual cost for implementing the NCTSI Evaluation at the grantee level is estimated at $2,115,000. These monies are included in the cooperative agreement awards.
A Federal contract was awarded to ICF Macro to coordinate the design development for, and implementation of, the revised NCTSI Evaluation. The NCTSI Evaluation contract provides for 1 base year of $1,808,537, with an option to renew for 2 more years. The estimated average annual cost of the contract is $1,920,965. Included in these costs are expenses related to developing, implementing, and monitoring the evaluation, including, but not limited to, the following activities: coordinating the establishment of an evaluation steering committee to solicit stakeholder feedback on the evaluation design, supporting the steering committee during the design development process, developing an evaluation design and instrument package based on this feedback, providing intensive technical assistance and training to sites to support participation in the evaluation, travel to sites and to relevant meetings, and data analysis and dissemination activities.
It is estimated that SAMHSA will allocate 60% of an FTE each year for Government oversight of the evaluation. Assuming an annual salary of $65,000, these Government costs will be $39,000 per year.
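For reference, the component figures above combine as follows; this is simply a worked restatement of the amounts already given:

\[
\begin{aligned}
\text{Grantee-level cost} &= 47 \text{ centers} \times 1.5 \text{ FTE} \times \$30{,}000 = \$2{,}115{,}000 \\
\text{Annualized total} &= \$2{,}115{,}000 + \$1{,}920{,}965 + \$39{,}000 = \$4{,}074{,}965
\end{aligned}
\]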
15. Changes in Burden
The estimate of annual burden hours associated with the current 3-year OMB clearance period is 11,333. SAMHSA is requesting 16,261 annual hours for this submission, an increase of 4,928 annual burden hours. The increase in burden is due to major program changes that are described below:
The previous OMB statement included burden estimates for six CDS instruments and indicated that 33 NCTSI centers would participate in the CDS; in addition, the previous CDS methodology collected outcomes data from only a subset of the children (100 per center) receiving direct clinical services. As described in Section A.2.d, three instruments have been added to the Core Data Set to address existing gaps in knowledge. In addition, an estimated 75% of the currently funded 62 NCTSI centers (or 47 centers) will participate in the CDS, as compared with the 33 centers that participated previously, and outcomes data collection through the CDS will be expanded to include all children receiving direct clinical services. Together, these changes increase the projected burden by 7,984 annual hours.
The addition of four new data collection activities, including the ETSC (administrator and provider versions), the OPMR, the TSIS, and the Sustainability Survey (for funded centers and affiliate centers) resulted in an additional 581 annual hours.
The continuation of the TSF does not affect overall burden, as pilot testing suggests that the length of time needed to complete this form remains the same as previously estimated. However, the continuation of the National Impact Survey (as the NCTSI National Reach Survey) has resulted in an increase of 1,200 burden hours because the target respondent group is drawn from various child-serving systems.
As described in Section A.2.d, in an effort to consolidate and streamline data collection and reporting requirements for grantees, SAMHSA is proposing to discontinue ten currently OMB-approved data collection activities, nine of which contributed to the previous burden estimate. These include the YSS-F, TIS, GAAS, AIFI, PDDS, Network Survey, CTPT, Case Study Interviews, and Workgroup/Taskforce Coordinator Interviews. This has resulted in a decrease in burden hours of 4,837.
Thus, although the revised NCTSI Evaluation represents a significant effort to focus the evaluation on key priorities and eliminate outdated elements, enhancements in the evaluation (e.g., the CDS measures added to address existing gaps in knowledge) and the addition of 18 centers to the program result in an increase in annual burden hours for the current request.
16. Time Schedule, Publication, and Analysis Plans
Time Schedule
The time schedule for the evaluation is summarized in Table 5. A 3-year clearance is requested for this project.
TABLE 5
Time Schedule
Activity | Timeframe
--- | ---
Receive OMB approval for revised NCTSI evaluation | 6 months from the OMB submission date
Continue data collection for centers funded in 2007, 2008, 2009, and 2010 | Ongoing
Process and analyze data | Ongoing
Complete data collection for centers funded in 2007 | September 30, 2011
Complete data collection for centers funded in 2008 | September 30, 2012
Complete data collection for centers funded in 2009 | September 30, 2012
Complete data collection for centers funded in 2010 | September 30, 2013
Publication Plan
Annual and final reports will be submitted to SAMHSA, with anticipated subsequent dissemination to other interested parties, such as researchers, policymakers, and program administrators at the Federal, State, and local levels. Although not required under the contract, it is also anticipated that results from this data collection will be published and disseminated in peer-reviewed journals in the fields of child traumatic stress and children’s mental health services.
Data Analysis Plan
Analyses to be conducted for each of the forms and instruments proposed for inclusion in the revised NCTSI Evaluation are described below.
Evaluation Continuation
For the CDS, the data analysis plan for this study was described in detail in the currently approved OMB Renewal Supporting Statement in the Descriptive and Clinical Outcomes Study section. The data analysis methodology will not change based on the proposed revision to the number of clients being administered the CDS measures or the addition of new instrumentation.
For the TSF, data will be analyzed using descriptive statistics, allowing comparisons and generalization. Correlations will be examined for relationships between training topics, audiences, and specific interventions or assessments that were the focus of trainings.
Data from the NCTSI National Reach Survey initially will be analyzed using descriptive statistics. The key items measuring the dependent variable (i.e., extent to which agencies use trauma-informed policies and practices) are rated by the respondent as a dichotomous value (Yes/No). These dichotomous values are totaled for each agency respondent to produce an index of “Use of Trauma-informed Policies and Practices.” Similarly, items measuring the major independent variable (i.e., whether information/knowledge from or collaboration with NCTSI centers contributed to the agencies’ policies or practices) are also assessed with dichotomous items. These dichotomous responses also are totaled for each agency respondent to produce an index of “Total Exposure to the NCTSI.” Data are aggregated at the service sector level (i.e., mental health, child welfare, education, juvenile justice) and at the State level. Separate analyses will be conducted for the two sets of respondents (i.e., mental health organizations and other service sectors). Descriptive and inferential statistics are used to compare scores on the index of “Use of Trauma-informed Policies and Practices” over time and as a function of characteristics of the responding organizations (i.e., private or public, major functions of organizations), service sector, State, and exposure to the NCTSI.
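A minimal sketch of the index construction described above follows, assuming hypothetical item names and a simple list-of-dictionaries data layout; it is illustrative only and does not represent the evaluation team's actual analysis code.

```python
# Illustrative only: sum dichotomous (Yes/No) items into the two indices
# described above, then aggregate by service sector and state.

POLICY_ITEMS = ["tip_screening", "tip_training", "tip_treatment"]        # "Use of Trauma-informed Policies and Practices"
EXPOSURE_ITEMS = ["nctsi_info", "nctsi_training", "nctsi_collaboration"]  # "Total Exposure to the NCTSI"

def score_index(response, items):
    """Sum dichotomous items (Yes = 1, No = 0) into a simple index."""
    return sum(1 for item in items if response.get(item) == "Yes")

def aggregate(responses):
    """Average both indices within each (service sector, state) group."""
    groups = {}
    for r in responses:
        key = (r["sector"], r["state"])
        groups.setdefault(key, []).append(
            (score_index(r, POLICY_ITEMS), score_index(r, EXPOSURE_ITEMS))
        )
    return {
        key: {
            "mean_policy_index": sum(p for p, _ in scores) / len(scores),
            "mean_exposure_index": sum(e for _, e in scores) / len(scores),
            "n": len(scores),
        }
        for key, scores in groups.items()
    }

# Example usage with two hypothetical respondents
responses = [
    {"sector": "child welfare", "state": "VA", "tip_screening": "Yes",
     "tip_training": "No", "tip_treatment": "Yes", "nctsi_info": "Yes",
     "nctsi_training": "No", "nctsi_collaboration": "No"},
    {"sector": "child welfare", "state": "VA", "tip_screening": "Yes",
     "tip_training": "Yes", "tip_treatment": "Yes", "nctsi_info": "Yes",
     "nctsi_training": "Yes", "nctsi_collaboration": "No"},
]
print(aggregate(responses))
```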
Evaluation Expansion
The currently approved OMB Renewal Supporting statement outlined the data analysis plan for the currently OMB-approved Network Collaboration and Product Development and Dissemination studies. These analyses will now be conducted using data collected on the OPMR. In addition, the more comprehensive nature of the OPMR will allow for quantitative and qualitative reporting on the aggregate and for each center. To facilitate reporting and summarization, some of the information that was previously reported in a descriptive manner was changed to closed-ended response categories to provide useful reports in areas where centers identify other NCTSI centers with whom they partner on key activities, interventions adopted or trained at their center, and service systems with whom they partner. Standardizing the categories of interventions enables reporting on the number of service providers trained and the number of clients served with a specific practice or intervention during the quarter and cumulatively.
For the TSIS, no analysis is planned beyond compiling contact information for participants and their organizations.
Data gathered through the ETSC Survey will first be analyzed using descriptive statistics. To the extent possible, survey items will be tallied and scored. Assuming that survey items are scaled, internal consistency reliability analysis as well as analytic techniques to assess validity (i.e., confirmatory factor analysis) will be performed prior to further analysis. Descriptive and inferential statistics will then be used to compare survey responses as a function of child serving systems. The qualitative data from the survey will be transcribed into word documents and imported into ATLAS.ti, a qualitative software program that supports the coding process by facilitating the marking and subsequent search, retrieval, classification, and cross-classification of text. We will develop the initial list of coding categories based on the research questions and assign a set of deductive codes to each of the preliminary categories. Definitions, inclusion and exclusion criteria, and explicit guidance for applying codes will be developed. Once inter-rater reliability is established, codes will be applied to the interviews and data analysis will begin. Themes and responses that were posed repeatedly by respondents will be noted. In addition to the identification of themes, ATLAS.ti software also facilitates the comparison of themes and the identification of relationships between themes. Our team will use techniques from both theme and content analysis. This analytic process will allow us to determine thematic and content consistency, and variability within and across the child serving systems.
For the Sustainability Survey, the analysis plan includes both quantitative and qualitative components. Web survey data and OPMR data are aggregated and analyzed quantitatively and qualitatively. Analyses for the survey data will include content/thematic analysis of open-ended questions, and descriptive, univariate, and bivariate statistical analyses of quantitative data. The information provided on the OPMR will allow SAMHSA to monitor centers’ plans for sustainability and will be a source of data for sustainability as it pertains to evaluation activities and financial planning. The information collected from the survey will provide guidance to NCTSI centers about successful sustainability strategies and lessons learned.
17. Display of Expiration Date
All data collection instruments will display the expiration date of OMB approval.
18. Exceptions to Certification Statement
This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions. The certifications are included in this submission.
Statistical Methods
1. Respondent Universe and Sampling Methods
Below is a summary of the respondent universe and sampling methods for the NCTSI Evaluation, organized by evaluation activities that are proposed to continue and expanded evaluation activities.
Evaluation Continuation
Under the currently approved OMB clearance for the CDS, descriptive and clinical outcomes data are collected on all children who enter outpatient or inpatient trauma-related mental health services. A subset of these cases is targeted for follow-up assessments at 3-month intervals for up to 1 year, whether or not the client is still receiving services. The revision requests that centers administer the 3-month follow-up assessments to all clients served rather than a subset of clients and that follow-up data collection be limited to the period during which the client is receiving treatment. Of the estimated 62 active centers during any given year, we have found that approximately 75% are eligible to participate in the CDS based on their grant-funded activities.
For the TSF, a sampling plan is not necessary, as we are attempting to document every training event provided by funded NCTSI centers. The respondents are trainers who provide trainings for NCTSI centers.
The NCTSI National Reach Survey will be administered to members of professional associations representing the mental health, child welfare, education, juvenile justice, and health care sectors. Before administering the NCTSI National Reach Survey, the OPMR for each NCTSI center will be reviewed to identify State-level organizations with which centers partner, and a list of these organizations will be compiled. NCTSI centers and the NCCTS will also be asked to identify national-level organizations from the various child-serving sectors that should be included in the respondent group for this survey. The national- and State-level organizations selected will then be contacted and asked to identify potential respondents for this survey. An estimated 2,000 individuals will be surveyed. To maximize response rates for this Web-based survey, the NCTSI evaluation team will use a $10 incentive and a four-stage approach consisting of an advance invitation, a formal individualized invitation, and two follow-up reminders. Additional strategies include offering respondents alternative ways of responding (i.e., via hard copy or telephone interview) and follow-up telephone contact with nonrespondents.
Evaluation Expansion
The OPMR will be completed as part of centers’ quarterly progress and annual reports by project directors and staff from each center. Because the data are collected through the NCTSI’s current required progress reporting process, a 100% response rate is expected.
For the TSIS, a sampling plan is not necessary, as we are attempting to document every training event provided by funded NCTSI centers. All training participants will be invited to complete the TSIS.
The ETSC Survey will be administered to two broad respondent groups (administrators and human service providers) to assess the impact of NCTSI training and other dissemination activities on these groups, particularly the extent to which services have become evidence-based and trauma-informed as a result of the trainings or educational activities. These surveys will be conducted twice over the grant period of each NCTSI center. The inclusion criteria and recruitment methods vary for the two respondent groups:
Administrators: As part of the sampling process, the OPMR and other center data will be used to identify the activities undertaken and services provided by each center, including training activities and other collaborative activities involving child-serving agencies. The NCTSI centers will also be asked directly about such interactions and partnerships and will be asked to identify a contact person working within such agencies. This contact person will then be informed about the purpose of these surveys and asked to identify a suitable administrator. Data from the previous evaluation suggest that NCTSI centers usually work with at least two service systems. Assuming two administrators per NCTSI center (n=62), there will be 126 administrators overall for each administration. In addition, respondents will include administrators from the 62 currently funded NCTSI centers; thus, collectively, respondents will total 189. The survey will be administered in years 1 and 3 of an NCTSI center’s funding.
Human Service Providers: All professionals from child-serving systems who are trained by NCTSI centers (i.e., generally service providers of various types—mental health, child welfare workers, teachers, health care, etc.) will be administered the provider version of the ETSC Survey at the end of each training and at 12- and 24-month follow-ups to assess the self-perceived increase in knowledge and the impact on behaviors, supervision, consultation, and organizational supports for the effective delivery of evidence-based trauma treatment and trauma-informed practices. Currently, there is no consistently maintained data source tracking trainee contact information or the average number of individuals trained by provider type. To avoid the additional burden it would place on centers to collect trainee contact information, maintain records of trainees by provider type, and secure consent-to-contact forms from the trainees, this study component will use a self-identification process (i.e., the TSIS) to gather the information required to establish a sampling frame, if needed. Respondents are NCTSI center-employed clinicians and center-trained providers. It is estimated that, on average, for each of the 62 centers, four center-employed clinicians and four center-trained providers will participate in this survey, resulting in a total of 504 respondents.
Respondents for the Sustainability Survey consist of project directors and evaluators for currently funded centers and project directors for affiliate centers. All center administrators in these roles will be selected to participate in the studies. The inclusion criteria for respondents from currently funded centers will be all current evaluators and project directors from centers funded in 2008, 2009, and 2010. Affiliate participants will include project directors from all active NCTSI affiliate centers, as defined by SAMHSA, from the 2001, 2002, 2003, and 2005 cohorts. The potential number of respondents from currently funded centers will be 2 participants from 62 centers, or 126 respondents. The potential number of respondents from affiliate centers will be 1 respondent per 45 affiliate centers, or a maximum of 45 respondents. The number of respondents for both surveys will be sufficient to support descriptive, bivariate, and multivariate statistical analyses, as well as between- and within-group comparisons.
2. Information Collection Procedures
Evaluation Continuation
CDS data are collected by individuals at the center level who may include trained data collectors or clinicians. Each center receives intensive training from the NCTSI Evaluator to ensure standard collection of these data. Because respondents’ reading levels will vary depending on age and other factors, the instruments can be either self-administered or administered in interview format by center staff, depending on the needs of the client. For example:
The TSCC-A is administered to children between the ages of 8 and 16
The UCLA-PTSD is administered to children 7 years of age and older
The CDI-2S is administered to children ages 7-17
The rest of the measures for this study (the CBCL, the TSCYC, the PSI-SF, and the Core Clinical Characteristics Forms [Baseline Assessment Form, Follow-up Assessment Form, General Trauma Information Form, and Trauma Detail Form]) are administered to caregivers.
In the case of the TSF, when NCTSI center trainers conduct a training activity, they complete a TSF and submit the data electronically. If the training audience and training topics are appropriate for the NCTSI evaluation, the trainer will also invite the training participants to complete a TSIS (sign-in sheet), which is also submitted to the NCTSI Evaluator.
The NCTSI National Reach Survey will be administered by the NCTSI Evaluator through the NCTSI Evaluator’s online data collection system (see Section A.3 for more detail).
Evaluation Expansion
Similar to the NCTSI National Reach Survey, the OPMR, the ETSC Survey, and the Sustainability Survey will be administered electronically through the NCTSI Evaluator’s online data collection system (see Section A.3 for more detail). The OPMR can be accessed at any time by center administrators, and information is expected to be updated on a quarterly, annual, or one-time basis depending on the type of information being submitted. The ETSC Survey and the Sustainability Survey will be administered electronically by the NCTSI Evaluator on different data collection schedules (outlined in the section above). The Sustainability Survey for Funded Centers is accessible through a Web link that appears in the OPMR, while the Sustainability Survey for Affiliate Centers is a stand-alone Web-based survey. Respondents from funded centers will be invited to participate through the OPMR, while affiliate respondents will be sent an email invitation to participate. Respondents who prefer to submit a paper copy of any of the Web-based surveys will be provided the option of doing so.
Table 6 summarizes the information collection procedures for the forms and surveys included in the NCTSI Evaluation.
Table 6. Procedures for the Collection of Information

Measure | Indicators | Data Source(s) | Method | When Collected
Core Clinical Characteristics (Baseline Assessment Form) |  | Caregiver | Interview | At entry into services
CBCL 1.5-5 and CBCL 6-18 (Achenbach, 2001; Achenbach & Rescorla, 2000) |  | Caregiver | Interview/self-administered | At entry into services and every 3 months through end of treatment
TSCYC (Briere, 2005) |  | Caregiver to children aged 3 through 7 | Interview/self-administered | At entry into services and every 3 months through end of treatment
PSI-SF (Abidin, 1995) |  | Caregiver to children aged 12 and under | Interview/self-administered | At entry into services and every 3 months through end of treatment
TSCC-A (Briere, 1996)—abbreviated for NCTSI |  | Children aged 8-16 | Interview/self-administered | At entry into services and every 3 months through end of treatment
UCLA-PTSD (Rodriguez, Steinberg, et al., 1999) |  | Children aged 7 and older | Interview/self-administered | At entry into services and every 3 months through end of treatment
CDI-2S (Kovacs, 1992) |  | Children aged 7 through 17 | Interview/self-administered | At entry into services and every 3 months through end of treatment
GAIN-MSS (Dennis, Chan, & Funk, 2006) |  | Children aged 12 and older | Interview/self-administered | At entry into services and every 3 months through end of treatment
Core Clinical Characteristics (Baseline Assessment Form), Core Clinical Characteristics (Follow-up Assessment Form) |  | Caregiver | Interview | At entry into services and every 3 months through end of treatment
Core Clinical Characteristics (General Trauma Information Form), Core Clinical Characteristics (Trauma Detail Form) |  | Caregiver | Interview | At entry into services and every 3 months through end of treatment
EBP and Trauma-informed Systems Change Survey—Administrator Version |  | Administrators of NCTSI centers and other child-serving systems/agencies | Survey – online, by telephone, or pencil & paper | At baseline (year 1 of the NCTSI center funding) and follow-up (year 3 of the NCTSI center funding)
EBP and Trauma-informed Systems Change Survey—Provider Version |  | Providers at NCTSI centers and other child-serving systems/agencies | Survey – online, by telephone, or pencil & paper | At the end of each training and at 12- and 24-month follow-up
NCTSI National Reach Survey |  | Administrators of agency representatives in the mental health, child welfare, education, and juvenile justice sectors | Web-based survey | Alternating years of the NCTSI evaluation
Training Summary Form |  | Trainers | Paper & pencil | At completion of all training events
Training Sign-In Sheet | Participants provide: | Participants at NCTSI-sponsored trainings | Paper & pencil | At beginning of all training events
Sustainability Survey for Affiliate Centers |  | Project Director | Web-based survey | Annually
Sustainability Survey for Currently Funded Centers |  | Project Director, Evaluator | Web-based survey | Annually (OPMR form)
Online Performance Monitoring Report (OPMR) |  | Project director/staff | Web-based survey | Quarterly and as part of the combined fourth quarter/annual report
3. Methods to Maximize Response Rates
Local center staff members are responsible for collecting CDS data in their community. The NCTSI Evaluator provides resources and technical assistance to help local evaluators maximize response rates by providing the following: (1) a data collection procedures manual, (2) regional and individual site-level trainings, (3) evaluation workshops at annual national meetings, (4) one-on-one contact with NCTSI Evaluation liaisons, (5) regular teleconferences and site visits throughout the evaluation period, (6) forums for NCTSI Evaluator-facilitated discussions, (7) reading materials, and (8) additional guidance and information as questions arise. In addition, the NCTSI Evaluator offers support related to participant tracking to ensure that local data collectors are aware when an interview is due for completion. The table below presents the response rates anticipated for the CDS at each assessment. The number of initial assessments is an estimate that applies the average number of baseline assessments per center (69) to 47 centers. Youth who continue treatment after the initial assessment (59% in the most recent complete year) are used as the base for computing the response rates for each follow-up assessment.
Estimated Annual Response Rates for CDS

 | Response Rate¹ | Number of Respondents
Number of Baseline (Initial) Assessments² |  | 3,243
Number of Clients continuing treatment³ | 59% | 1,913
First Follow-up | 47% | 899
Second Follow-up | 29% | 555

¹ Derived from the most recent complete year of CDS data collection (2010).
² Assumes 69 baseline assessments per year for 47 centers.
³ Used as the base for computing response rates for follow-up visits.
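For transparency, the figures in the table above follow from straightforward arithmetic; the short sketch below simply reproduces them from the assumptions stated in the footnotes (69 baseline assessments per center, 47 centers, and the 2010 continuation and follow-up rates).

    # Reproduces the estimated annual CDS response figures from the stated assumptions.
    baseline_per_center = 69
    centers = 47

    initial_assessments = baseline_per_center * centers      # 3,243
    continuing_clients = round(initial_assessments * 0.59)   # 1,913
    first_followup = round(continuing_clients * 0.47)        # 899
    second_followup = round(continuing_clients * 0.29)       # 555

    print(initial_assessments, continuing_clients, first_followup, second_followup)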
The NCTSI Evaluator encourages centers to use the following strategies in their data collection process in order to increase response rates:
Administer the instruments to children and their caregivers at times of their choice, and administer multiple instruments at one time to reduce the number of interviews.
Develop a close working relationship between the data collection staff and providers at each center to facilitate tracking.
When available, administer instruments in English or Spanish to meet the needs of diverse communities and remove language barriers in completing the surveys.
Provide English- and Spanish-speaking interviewers to assist with administration of instruments; for other languages, when possible, link in an online interpreter after the interview has been initiated.
Conduct follow-up and informational mailings throughout the study period to maintain contact with study participants.
Employ proven tracking techniques (e.g., request address corrections from the post office for forwarded mail, use CD-ROM listings of names and addresses, employ locator services to search for respondents).
Provide families and center staff with useful feedback on data obtained through the evaluation activities that will provide insight into the progress and treatments of children in their center and assist them in planning and service delivery.
Data collection for the other Web-based surveys and forms implemented as part of the NCTSI Evaluation will be managed by the NCTSI Evaluator. The NCTSI Evaluator assists centers in maximizing response rates by:
Providing a modest incentive payment to non-NCTSI survey respondents based on research suggesting that modest noncontingent cash incentives significantly increase survey response rates among mental health professionals (Hawley, Cook, & Jensen-Doss, 2009).
Providing in-depth and ongoing technical assistance and guidance to NCTSI centers to support participation in the evaluation in general and build capacity to utilize the data center and online reporting system provided by the evaluation.
Sharing nonidentifying, site-specific data and preliminary evaluation results with center management and evaluators
Incorporating preliminary evaluation findings into technical assistance efforts with grantees
It is expected that the OPMR will have a 100% response rate because this data collection is integrated into the existing required quarterly and annual progress reporting system employed by the Network.
4. Tests of Procedures
Core Data Set
The CDS measures were selected through a participatory process involving two phases of development: 1) the original development phase in 2003-2004, which was coordinated by the NCCTS and involved input from funded centers through surveys, conferences, and other activities, as well as the piloting of instruments across the NCTSI, and 2) a more recent review in 2010, coordinated by the NCTSI Evaluator, in which the NCTSI Evaluation Steering Committee considered, in particular, additional measures relevant to specific subpopulations missed by the original CDS assessment. Many of these instruments have also been endorsed by NCTSI workgroups as important to include in the CDS. Substantial information supporting the reliability and validity of the CBCL, TSCC-A, TSCYC, UCLA-PTSD, PSI-SF, GAIN-MSS, and CDI-2S is already available from the developers of these tools. The Core Clinical Characteristics Forms (Baseline Assessment Form, Follow-up Assessment Form, General Trauma Information Form, and Trauma Detail Form) were created by the NCCTS to assist with the clinical evaluation of children. These forms are not structured to be amenable to formal psychometric testing. All of the measures for the CDS are available in Spanish. Additional details regarding each of the standardized measures follow.
Child Behavior Checklist for Ages 1.5–5
The CBCL 1.5-5 is designed to provide a standardized measure of symptomatology for children ages 1.5–5. The CBCL 1.5-5 has been widely used in mental health services research as well as for clinical purposes. The checklist is a caregivers’ report of their child’s problems, disabilities, and strengths, as well as parental concerns about their child. Caregivers report on 99 problem items by indicating if statements describing children are not true, somewhat/sometimes true, or very/often true for their child. Caregivers are also asked three questions that allow them to describe problems, concerns, and strengths for their child. Achenbach (1991) has reported a variety of information regarding internal consistency, test-retest reliability, construct validity, and criterion-related validity. Good internal consistency was found for the internalizing, externalizing, and total problems scales (α≥.82). The CBCL demonstrated good test-retest reliability after 7 days (Pearson’s r at or above .87 for all scales). Moderate to strong correlation with the Connor Parent Questionnaire and the Quay-Peterson scale (Pearson’s r coefficients ranged from .59 to .88) suggested the construct validity of the CBCL. The CBCL was, for most items and scales, capable of discriminating between children referred to clinics for needed mental health services and those youth not referred (Achenbach, 1991). A variety of other studies also have shown good criterion-related or discriminant validity (e.g., Barkley, 1988; McConaughy, 1993).
The instrument has been nationally normed on a proportionally representative sample of children across income and racial/ethnic groups. (Please note that the race variable from the CBCL instrument is not used to score the instrument for the NCTSI evaluation. The race variable from the Core Clinical Characteristics Form is used. Please see Attachment B for more information regarding this.) Racial/ethnic differences in total and subscale scores of the CBCL disappeared when controlling for socioeconomic status, suggesting a lack of instrument bias related to racial/ethnic differences.
The CBCL provides two broadband scores (i.e., internalizing, externalizing), seven narrow-band scores (e.g., emotionally reactive, withdrawn, aggressive behavior), and a total problems score. Scales are based on ratings of 1,728 children and are normed on a national sample of 700 children. Hand- and computer-scored profiles are available. The scoring programs developed by the authors should be used to generate the scores. All grantees will be provided with a copy of the scoring program and accompanying manual, if they do not already have them. Sites will be able to contact their NCTSI Evaluation liaisons for more information.
Child Behavior Checklist for Ages 6-18
The CBCL 6-18, formerly CBCL 4-18, is designed to provide a standardized measure of symptomatology for children ages 6–18. This new version of the checklist has been “updated to incorporate new normative data, include new DSM-oriented scales, and to complement the new preschool forms” (Achenbach System of Empirically Based Assessment, 2008b). The CBCL 6-18 has been widely used in mental health services research as well as for clinical purposes. The checklist is a caregiver report of social competence and behavior and emotional problems among children and adolescents. It consists of 20 social competence items and 120 behavior problem items, which include 118 specific problems and 2 open-ended items for reporting additional problems. The social competence section collects information related to the child’s activities, social relations, and school performance. The competence items had not been collected as a part of the CDS in the past, though many grantees had opted to collect the data for local use. Going forward, the CDS will include these competence items as a measure of resilience, while additional resilience measures are being explored. The behavior problem section documents the presence of symptoms (e.g., argumentativeness, withdrawal, aggression). The CBCL 6-18 scores on a number of empirically derived factors (Achenbach System of Empirically Based Assessment, 2008b). Although it does not yield diagnoses, the CBCL assesses children’s symptoms on a continuum and provides two broadband (i.e., internalizing and externalizing) syndrome scores, eight cross-informant syndrome scores (e.g., attention problems, depressive mood, conduct problems), six DSM-oriented scales, and percentiles for three competence scales (activities, social, and school). A total problems score can also be generated.
Achenbach (1991) has reported a variety of information regarding internal consistency, test-retest reliability, construct validity, and criterion-related validity. Good internal consistency was found for the internalizing, externalizing, and total problems scales (α≥.82). The CBCL demonstrated good test-retest reliability after 7 days (Pearson’s r at or above .87 for all scales). Moderate to strong correlation with the Connor Parent Questionnaire and the Quay-Peterson scale (Pearson’s r coefficients ranged from .59 to .88) suggested the construct validity of the CBCL. The CBCL was, for most items and scales, capable of discriminating between children referred to clinics for needed mental health services and those youth not referred (Achenbach, 1991). A variety of other studies also have shown good criterion-related or discriminant validity (e.g., Barkley, 1988; McConaughy, 1993).
The instrument has been nationally normed on a proportionally representative sample of children across income and racial/ethnic groups, region, and urban-rural residence. (Please note that the race variable from the CBCL instrument is not used to score the instrument. The race variable from the Core Clinical Characteristics Form is used. Please see Attachment B for more information regarding this.)
The CBCL 6-18 scoring profile provides raw scores, T scores, and percentiles for three competence scales, total competence, eight cross-informant syndromes, and internalizing, externalizing, and total problems. The cross-informant syndromes scored are (1) aggressive behavior, (2) anxious/depressed, (3) attention problems, (4) rule-breaking behavior, (5) social problems, (6) somatic complaints, (7) thought problems, and (8) withdrawn/depressed. There are also six DSM-oriented scales, including (1) affective problems, (2) anxiety problems, (3) somatic problems, (4) attention deficit/hyperactivity problems, (5) oppositional defiant problems, and (6) conduct problems. In constructing the DSM-oriented scales, child psychiatrists and psychologists from 16 cultures rated the consistency of checklist items with DSM-IV categories. Scales are derived from factor analyses of caregiver ratings of 4,994 clinically referred children and are normed on 1,753 children ages 6–18. The scoring programs developed by the authors should be used to generate the scores. All grantees will be provided with a copy of the scoring program and accompanying manual, if they do not already have them. Sites should contact their liaisons for more information.
UCLA PTSD Index for DSM-IV
The UCLA-PTSD screens for exposure to traumatic events and for all DSM-IV PTSD symptoms in children who report traumatic stress experiences. The measure yields preliminary PTSD diagnostic information and is keyed to DSM-IV criteria. The UCLA-PTSD can be administered to caregivers; a self-report version of the instrument also exists (Rodriguez et al., 1999). The self-report version is included in the Core Data Set. The instructions and questions should be read aloud to children under the age of 12 or to youth with known reading comprehension difficulties. Children under the age of 7 are not required to complete the form. The UCLA-PTSD is administered at intake and every 3 months, up to 12 months, to all children and adolescents ages 7–18 who are enrolled in the outcome study.
Trauma Symptom Checklist for Children—Abbreviated
The TSCC-A evaluates acute and chronic posttraumatic stress symptoms in children’s responses to unspecified traumatic events across several symptom domains. The TSCC-A is a 44-item self-report measure in which the child indicates how often he/she experiences various thoughts, feelings, and behaviors. The measure provides a means of assessing stress symptoms that do not rise to the level of PTSD diagnosis.
The TSCC-A has been standardized on racially and economically diverse children in urban and suburban environments and normed on age and sex. The instrument yields two validity scales, six clinical scales (anxiety, depression, anger, posttraumatic stress, and two dissociation subscales), and eight critical items. The 10 items related to sexual issues are not included in the abbreviated version of the TSCC (Briere, 1996). The TSCC-A is administered at intake and every 3 months, up to 12 months, to all children ages 8–16 who are enrolled in the outcome study.
Trauma Symptom Checklist for Young Children
The TSCYC (Briere, 2005) was developed to be the first fully standardized and normed broadband trauma measure for children as young as 3 years of age. Tested by clinicians and researchers throughout North America, the TSCYC is a 90-item caretaker-report instrument with separate norms for males and females in three age groups: 3-4 years, 5-9 years, and 10-12 years. Caretakers rate each symptom on a 4-point scale according to how often the symptom has occurred in the previous month. Unlike most other caretaker-report measures, the TSCYC contains specific scales to ascertain the validity of caretaker reports (Response Level and Atypical Response) and provides norm-referenced data on the number of waking hours the caretaker spends with the child in the average week (0-1 hours to Over 60 hours).
The TSCYC contains eight Clinical scales: Anxiety, Depression, Anger/Aggression, Posttraumatic Stress-Intrusion, Posttraumatic Stress-Avoidance, Posttraumatic Stress-Arousal, Dissociation, and Sexual Concerns, as well as a summary posttraumatic stress scale (Posttraumatic Stress-Total). These scales provide a detailed evaluation of posttraumatic stress, as well as information on other symptoms found in many traumatized children. The PTSD Diagnosis Worksheet incorporates information from the TSCYC to assist the user in evaluating PTSD criteria in younger children and provides a possible PTSD diagnosis in children 5 years of age or older (sensitivity = .72, specificity = .75). The TSCYC is appropriate for English-speaking caretakers, including those who have a relatively low reading level (Flesch-Kincaid score = 6.8).
Parenting Stress Index Short Form
The Parenting Stress Index (PSI) (Abidin, 1995) is designed for the early identification of parenting and family characteristics that fail to promote normal development and functioning in children, of children with behavioral and emotional problems, and of parents who are at risk for dysfunctional parenting. It can be used with parents of children as young as one month. Although its primary focus is on the preschool child, the PSI can be used with parents whose children are 12 years of age or younger. The PSI Short Form (PSI-SF) is a direct derivative of the PSI full-length test. All 36 items on the Short Form are contained on the Long Form with identical wording and are written at a 5th-grade reading level for parents of children 12 years and younger. The PSI-SF yields a Total Stress score from three scales: Parental Distress, Parent-Child Dysfunctional Interaction, and Difficult Child. In developing the Short Form, principal components factor analysis with a varimax rotation was conducted, and items were retained based on the criterion of having factor loadings >.4 on only one factor (although some exceptions to this criterion were made). The PSI-SF scales have been found to correlate strongly with the corresponding full-length scales: Total Stress with Total Stress = .94, Parental Distress with Parent Domain = .92, and Difficult Child with Child Domain = .87.
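Purely to illustrate the item-retention criterion described above (loadings greater than .4 on a single factor), the sketch below uses scikit-learn's FactorAnalysis with a varimax rotation as a stand-in for the developers' original principal components analysis; the data file, item layout, and scale labels are hypothetical and this is not the procedure used to construct the published instrument.

    # Illustrative sketch of a loadings-based item-retention rule
    # (hypothetical data file; varimax factor analysis as an approximation).
    import pandas as pd
    from sklearn.decomposition import FactorAnalysis

    items = pd.read_csv("psi_items.csv")  # hypothetical: one column per item

    fa = FactorAnalysis(n_components=3, rotation="varimax")
    fa.fit(items.values)

    # Loadings matrix: rows = items, columns = the three factors.
    loadings = pd.DataFrame(
        fa.components_.T,
        index=items.columns,
        columns=["Parental Distress",
                 "Parent-Child Dysfunctional Interaction",
                 "Difficult Child"],
    )

    # Retain items that load > .4 on exactly one factor.
    salient = loadings.abs().gt(0.4)
    retained = loadings[salient.sum(axis=1) == 1]
    print(retained)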
Children’s Depression Inventory-2 Short
Modeled on the Beck Depression Inventory and designed for school-aged children and adolescents (ages 7-17 years), the CDI (Kovacs, 1992) is a self-report, symptom-oriented depression scale with a 1st-grade reading level. It has 27 items, each of which consists of three choices. The child or adolescent is instructed to select one sentence for each item that best describes him/her for the past 2 weeks. The CDI provides a Total score, as well as five empirically developed factor scales that have been normed according to gender and age: Negative Mood, Interpersonal Problems, Ineffectiveness, Anhedonia, and Negative Self-Esteem. The CDI is appropriate to use when factor scale scores are desired, a more complete description of the child's depressive symptoms is needed, or more extensive clinical information is required. The CDI can be used for clinical and research purposes. Because it assesses various areas of functioning, the CDI facilitates a multifaceted evaluation of the child or adolescent. Follow-up administrations can help in the evaluation of remediation programs or in measuring treatment effectiveness. The normative sample used for scoring the CDI was divided into groups based on age (ages 7–11 and 12–17) and gender. The normative sample includes 1,266 public school students (592 boys, 674 girls), 23% of whom were African-American, American Indian, or Hispanic in origin. Twenty percent of the children came from single-parent homes. The internal consistency coefficients range from .71 to .89, and the test-retest coefficients range from .74 to .83 (over a 2- to 3-week interval).
For the Core Data Set, the CDI-2 Short Form will be used. The CDI-2S is an efficient screening measure that contains 12 items and takes about half the time of the full-length version to administer. The CDI-2S has excellent psychometric properties and yields a Total Score that is generally very comparable to the one produced by the full-length version.
Global Appraisal of Individual Needs (GAIN) Modified Short Screener – 5 minutes
The 5-minute GAIN-Short Screener (GAIN-SS) is designed primarily as a screener in general populations, ages 12 and older, to quickly and accurately identify clients who have 1 or more behavioral health disorders (e.g., internalizing or externalizing psychiatric disorders, substance use disorders, or crime/violence problems). It also serves as an easy-to-use quality assurance tool across diverse field-assessment systems for staff with minimal training or direct supervision, and serves as a periodic measure of change over time in behavioral health. For the Core Data Set, the substance abuse scale from the Short Screener will be used, in combination with several GAIN items on types of substances used, to make up the GAIN-MSS.
Dennis, Chan, and Funk (2006) found that, for both adolescents and adults, the 20-item total disorder screener (TDScr) and its four 5-item sub-screeners (internalizing disorders, externalizing disorders, substance disorders, and crime/violence) have good internal consistency (alpha of .96 on the total screener), were highly correlated (r = .84 to .94) with the 123-item scales in the full GAIN-I, had excellent sensitivity (90% or more) for identifying people with a disorder, and had excellent specificity (92% or more) for correctly ruling out people who did not have a disorder. A confirmatory factor analysis of the structure of the GAIN-SS showed that it is also consistent with the full GAIN model after allowing adolescent and adult path coefficients to vary and allowing cross-loading paths between conduct disorder items and crime/violence items.
Other NCTSI Evaluation Forms and Surveys
The NCTSI National Reach Survey and the TSF have been implemented as part of the NCTSI cross-site evaluation in the past; thus, information has been gathered regarding the utility of these resources, the quality of the data collected, and the need for revisions and reframing. With input and feedback from the NCTSI Evaluation Steering Committee, the survey and form were revised, pilot tested with NCTSI staff members, and revised slightly again. Feedback from the pilot testers was used to estimate the average length of time required to complete each data collection.
The OPMR, ETSC Survey, TSIS, and Sustainability Surveys were each newly developed for the revised NCTSI Evaluation based on the stakeholder feedback obtained through the steering committee consultation process. The two Sustainability surveys were developed with input from subcommittee members of the Evaluation Steering Committee (ESC). The subcommittee consisted of five members from currently funded and affiliate (formerly funded) NCTSN centers. The input from the committee included the following: 1) conceptualizing domains related to sustainability in the NCTSN and 2) developing questions related to the concept of sustainability for both funded and affiliate centers. Specifically, the ESC members provided guidance on the development of the questions related to the two primary domains of the survey: 1) infrastructure and 2) service delivery and the continuation of programs and practices. The ESC also contributed to reviewing items pertaining to each section of the survey to enhance the survey’s quality and ensure content validity. Specific recommendations included the following: 1) the separation of financial questions from the primary domains of the survey, 2) the development of separate questions pertaining to the influence of the NCTSI grant on the mission of the program or vice versa, and 3) a comprehensive analysis of the center’s background as it pertains to the center’s affiliation in the NCTSN. The evaluation team incorporated the committee feedback in the final versions of the surveys. Finally, as described in the statement and noted in the reviewer’s comments, the Sustainability surveys were pilot tested with center representatives to assess length of time needed to participate in the data collection and to conduct cognitive testing. This testing resulted in relatively minor modifications, such as adding some response categories to some items and simplification of instructions.
While the Sustainability Survey is entirely new and has been added in response to stakeholder requests, the ETSC incorporates prioritized elements of two currently OMB-approved data collection efforts (GAAS and AIFI). The OPMR incorporates elements of five currently OMB-approved cross-site evaluation instruments (PDDS, Network Survey, CTPT, GAAS, and AIFI). Highlights of such elements that have remained in the OPMR are described in Section A.2.d. These surveys represent a distillation of items that stakeholders identified as most important based on evaluation priorities, while outdated items from the previous evaluation have been eliminated. Following the development of these forms and surveys, each was pilot tested with center representatives to assess length of time needed to participate in the data collection and to conduct cognitive testing. This testing resulted in relatively minor modifications, such as adding some response categories to some items and simplification of instructions.
5. Statistical Consultants
The NCTSI Evaluator has full responsibility for the development of the overall statistical design and assumes oversight responsibility for data collection and analysis for the NCTSI Evaluation. Training, technical assistance, and monitoring of data collection will be provided by the NCTSI evaluator. The following individual is primarily responsible for overseeing data collection and analysis:
Christine Walrath, PhD
ICF Macro
116 John Street, Suite 800
New York, NY 10038
(212) 941-5555
The following individuals serve as statistical consultants to this project:
Megan Brooks, MA
ICF Macro
3 Corporate Square, Suite 370
Atlanta, GA 30329
(404) 321-3211
Donna S Condron, M.A.
ICF Macro
3 Corporate Square, Suite 370
Atlanta, GA 30329
(404) 321-3211
Yisong Geng, PhD
ICF Macro
3 Corporate Square, Suite 370
Atlanta, GA 30329
(404) 321-3211
Robert Stephens, MPH, PhD
ICF Macro
3 Corporate Square, Suite 370
Atlanta, GA 30329
(404) 321-3211
Bhuvana Sukumar, PhD
ICF Macro
3 Corporate Square, Suite 370
Atlanta, GA 30329
(404) 321-3211
(626) 457-6678
The following agency staff member is responsible for receiving and approving contract deliverables:
Maryann Robinson, R.N., M.S., M.A.
Project Officer
Center for Mental Health Services
Substance Abuse and Mental Health Services Administration
U.S. Department of Health and Human Services
1 Choke Cherry Road, Room 6-1148
Rockville, MD 20857
(240) 276-1883
maryann.robinson@samhsa.hhs.gov
Any questions related to the documents or the NCTSI evaluation should be directed to the following agency staff member:
Ken Curl, MSW, LCSW-C
Public Health Advisor
Center for Mental Health Services
Substance Abuse & Mental Health Services Administration
1 Choke Cherry Road, #6-1148
Rockville, MD 20857
(240) 276-1779
kenneth.curl@samhsa.hhs.gov
LIST OF ATTACHMENTS
Attachment A NCTSI Evaluation: Overview of Components and Instruments
Attachment B NICON Screen Shots
Attachment C Core Data Set
Attachment D Evidence-based Practice and Trauma-informed Services Change (ETSC) Survey
Attachment E Online Performance Monitoring Report (OPMR)
Attachment F NCTSI National Reach Survey
Attachment G Training Summary Form
Attachment H Training Sign-in Form
Attachment I Sustainability Survey
Attachment J NCTSI National Reach Survey: Informed consent form
Attachment K NCTSI National Reach Survey: Email invitation
Attachment L Evidence-based Practice and Trauma-informed Services Change (ETSC) Survey: Email invitation
Attachment M Evidence-based Practice and Trauma-informed Services Change (ETSC) Survey: Informed consent form
Attachment N Sustainability Survey: Email invitation
Attachment O Sustainability Survey: Informed consent form