Program Evaluation for Assertive Adolescent & Family Treatment (AAFT) Program
Data Collection Plan
Submitted to:
Karl S. Maxwell, Task Order Officer
Center for Substance Abuse Treatment, SAMHSA
1 Choke Cherry Road Room 5-1055
Rockville, MD 20857
Telephone: 240-276-2824
Fax: 240-276-2800
Karl.Maxwell@samhsa.hhs.gov
By:
Terri Tobin, Ph.D.
AAFT Evaluation Director
Advocates for Human Potential, Inc.
490-B Boston Post Road
Sudbury, MA 01776
Telephone: 978-261-1418
Fax: 978-443-4722
Task Order No. HHSS283200700038I/HHSS28342002T
Reference No. 283-07-3805
Table of Contents
I. Introduction
II. Evaluation Questions and Data Collection Overview
III. Data Collection and Processing
IV. Limitations of Data Collection and Evaluation Design
V. Quality Assurance
VI. Data Confidentiality and Protection of Human Subjects
Exhibits
A. Annual Program Survey
1. For Principal Investigators/Project Directors
2. For Clinical Supervisors/Clinicians
3. For Evaluators/Data Managers
B. Key Informant Interview
1. Protocol Guide
2. Informed Consent
3. Planned Respondents
C. Case Study Visits
1. Protocol Guide
2. Informed Consent
--Focus Group
--Community Stakeholders
--Staff
D. Direct Observation
1. Observation Guide
E. Document Review
1. Collection Coding Elements
I. Introduction

The overarching objective of the multi-site Assertive Adolescent and Family Treatment (AAFT) process and outcome evaluation is to assess and document the process of implementation in the 2009 cohort of AAFT grantees and to explore the role that implementation supports play in how these programs evolve.
While discussed in more detail below, in this evaluation, we aim to address the following process-related goals:
Describe the process of implementing AAFT.
Describe the implementation supports provided by CHS.
Identify barriers and supports to successful implementation.
Describe modifications to grantees’ plans.
The outcome-focused goals of the evaluation are to:
Document changes in agency-level structures, processes, and services across the grantees.
Determine the impact of the intervention supports on grantees.
Determine the impact of local implementation approaches on agencies.
Explore relationships between implementation and client-level outcomes.
To achieve these goals, our evaluation approach includes collecting both quantitative and qualitative data using a combination of primary and secondary methods—1) web surveys; 2) key informant interviews; 3) case study visits; 4) observation; 5) document review; and, 6) secondary data compilation and analysis. This Data Collection Plan specifies the methods to be used and includes details on their connection to the evaluation goals and questions, modes of collection, domains to be measured and sources, types of respondent groups and numbers, and time points for both collection and analysis. Copies of the related instruments/measures for implementing each are included as Attachments. The Plan concludes with a discussion of data processing, the limitations of our overall approach, procedures for quality assurance, and processes for ensuring confidentiality and participant protection.
II. Evaluation Questions and Data Collection Overview

The evaluation goals and primary questions are listed in Exhibit 1, at the end of this Plan. We also indicate which data collection methods will be used to address each of the goals and discuss those in detail. The section below and Exhibit 2 summarize key details about each of the data collection methods, including the domains, measures, and sources of data; the respondent groups and numbers; and the schedule for collection, analysis, and reporting. Copies of the related instruments/measures are included as attachments.
Annual Program Survey
Purpose: Gather longitudinal data (end of each of 3 project years) from a range of grantee personnel concerning their implementation of AAFT.
Respondents: All 14 sites; 1) Principal Investigators/Program Directors; 2) Clinical Supervisors/Clinicians; and 3) Evaluators/Data Managers. The total number will vary based on staffing at each site.
Mode: Web (also available to respondents as Word form, if desired).
Domains:
Background/experience.
Degree of implementation of AAFT components; reactions to components.
Adaptations/modifications to the model.
Use/helpfulness of Chestnut/other implementation support services.
Readiness and perceived changes in clinical practice/behavior.
Perceived barriers (e.g., turnover) encountered in implementation and compensatory strategies.
Use of outcome and other data.
Efforts to plan for sustainability.
Schedule: Once approved by the Office of Management and Budget (OMB), the Surveys will be administered in October/November of each project year. Analyzed and reported on within 3 months of submission deadline (by end of February).
Because implementation is, at root, a change process that must ultimately take place in the attitudes, skills, practices, and behaviors of service providers, and because these elements cannot be assessed through document review or secondary data, we must measure them directly. We propose to use a Web-based survey (Attachment A) as the primary method for this data collection across all AAFT grantees, and to repeat this survey at the end of each project year to provide a developing portrait of sites’ implementation status and activities.
This tool has three versions, tailored to address the respondents’ roles in the grant (Principal Investigators/Program Directors, Clinical Supervisors/Clinicians, and Evaluators/Data Managers), and measures a range of domains (noted above and in Exhibit 2) using mostly closed-ended questions, with some open-ended responses. While some data elements are from existing instruments (e.g., Organizational Readiness for Change, ORC), original question sets were also developed to capture other constructs, such as reactions to Chestnut supports and perceived implementation of AAFT components.
Starting in November of each project year, we will ask respondents from all 14 sites to complete the appropriate survey version by November 30th, within two months of the end of each project year. Survey data will be analyzed and reported on within three months of the submission deadline (by the end of February). The surveys will be available to grantees in three formats—Web-based, electronic Word form, and paper. Once the instrument is finalized and approved by OMB, the IT Team will design and manage an interactive, 508-compliant Web site that will allow grantees to submit survey information online.
Key Informant Interviews
Purpose: Provide assessments of implementation status, process, and progress from the perspective of the TA purveyor, Chestnut Health Systems (CHS), and related personnel.
Respondents: Interviewees include those providing technical assistance as part of the AAFT initiative. These include respondents who represent different project teams (e.g., A-CRA/ACC implementation, GAIN Support, EBTx Support). Attachment B3 includes a list of planned respondents (n=13).
Mode: Telephone.
Domains:
Background/experience of the respondents and the AAFT initiative.
Core TA activities and supports.
Site characteristics and indicators of successful implementation.
Sustainability and implementing EBPs.
Schedule: Administered in March through April of each project year. Analyzed and reported on within 2 months of interview deadline (by end of June).
To document the provision of implementation supports from the provider, rather than the receiver side, the plan includes collecting data directly from CHS project leadership and front-line TA providers. Using a written protocol (Attachment B1, Key Informant Interview Protocol Guide), these interviews will gather respondents’ views on what it takes to provide implementation supports for an adolescent EBP, what aspects of their efforts are more or less successful, what barriers they encounter, and how they work with grantees to overcome them.
We will conduct these phone interviews annually, administering them in March through April of each project year and analyzing and reporting on the findings within 2 months of the interview deadline (by the end of each June). Respondents include project leaders and support providers representing a range of TA services—a list of approximately thirteen planned respondents is included in Attachment B3. To the extent possible, our plan includes interviewing the same set of personnel across project years to gain a richer understanding of how the project is unfolding over time and how respondents’ attitudes and views have changed, with less noise introduced by having different respondents at different time points.
Interviews will be audio-recorded, transcribed, and analyzed using the ATLAS.ti qualitative analysis software. Responses will be coded for the type of personnel responding, the year of the interview, and the specific protocol question being answered. Further substantive codes will emerge from the data and will be developed once the full corpus of information from a given year has been reviewed.
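For illustration, a minimal sketch of how one transcript segment might carry these attribute codes, using hypothetical code names and values (the actual code scheme will be defined in ATLAS.ti):

```python
# Illustrative only: hypothetical attribute codes attached to one
# transcript segment; substantive (thematic) codes are developed in
# ATLAS.ti once the full year's corpus has been reviewed.
segment = {
    "text": "We begin every site with a readiness call...",  # excerpt (hypothetical)
    "respondent_type": "TA provider",    # type of personnel responding
    "interview_year": 2,                 # project year of the interview
    "protocol_question": "B1-Q4",        # protocol item being answered
    "substantive_codes": [],             # emergent themes, added later
}
```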
Case Study Visits
Purpose: Provide more detailed and finely nuanced data concerning all evaluation questions than can be acquired through the other, less cost-intensive techniques.
Respondents: Six sites will be selected for one case study visit; at each visit, individual and group meetings will be held with three classes of stakeholders: grantee agency staff, AAFT clients/families, and representatives from the community/network agencies.
Mode: In-person; individual and group interviews.
Domains:
Agency characteristics and project context.
Impact of AAFT grant.
Supports/challenges to implementation.
Intervention and implementation activities.
Use of and reactions to implementation supports.
Local evaluation activities and use of data.
Lessons learned and plans for sustainability.
Schedule: A 2-day visit, conducted once, in a 4-month window at the end of the 2nd project year/start of 3rd. Analyzed and reported on within 2 months of each visit.
To obtain detailed, first-hand information from a subset of the Grantees, we plan to conduct Case Study Visits at the end of year 2/beginning of year 3. These visits will allow us to interview individuals (e.g., program directors) and groups (e.g., adolescent clients and family members), observe specific activities at the site, and obtain reports and examples of products or services made available at the site.
Visits will be organized using a written Site Visit Protocol. Tailored versions by project role are included in Attachment C1. This document includes scripts for introducing sessions, and semi-structured questions and probes for the individual and group interviews. Informed Consent Forms are also included in Attachment C2.
Based on a review of secondary data and responses to the Program Survey described above, all grantee sites will be categorized by Fixsen’s stage of implementation of A-CRA/ACC/GAIN. Within this framework, six sites will be selected as representative of various stages, and will also be selected based on site-specific characteristics (e.g., urban/rural, outpatient/residential). Two sites from each subset will be asked to participate in a case study site visit to provide additional information about challenges/motivators to implementation, adaptations to the model, organizational changes, and experiences with data reporting requirements and support provided for the AAFT grant from Chestnut Health Systems.
Case Study Visit participants will include grantee agency staff (administrators, supervisors, clinicians, data managers); AAFT participants/families; community stakeholders and partners; site advisory committee members; local evaluators; and other specialty programs or key individuals involved or interested in improved services for adolescents with substance use disorders.
Each Visit will include two experienced site visitors, including our Expert Consultants, who will spend approximately 2 days at each grantee site. Visitors will review existing data before the visit to understand each site’s goals, activities, and challenges; this review will help to identify the agencies and individuals who should be included and the specific areas of inquiry. All individual and group interviews will be audio-recorded, transcribed, and analyzed using the ATLAS.ti qualitative analysis software. Analysis and reporting will occur within 2 months of each visit.
Direct Observation
Purpose: Observe key components of the initiative as they are occurring.
Events/Activities to be Observed: Key project components will be included and scheduled for observation as the project period progresses; includes such events as A-CRA/ACC Training, GAIN Training, Grantee Meetings, Implementation Calls, and Coaching Calls.
Mode: Combination; in-person (e.g., trainings) and by telephone (e.g., coaching calls).
Domains:
Characteristics of participants (individually and as a group).
Interactions.
Nonverbal behavior (learners, presenters).
Program leader(s), presenters.
Physical surroundings.
Products of the event.
Reactions/feedback to the activity (observers’ and participants’).
Schedule: Quarterly, with one key event (e.g., A-CRA/ACC training, GAIN training, Grantee Meeting) observed each quarter. Coded within one month of the observation; analyzed and reported on annually.
Collecting data by direct observation allows us to learn in detail how various project components work and the context within which they exist. It offers the opportunity to directly see what is done, hear what is said, and experience the social interactions that occur. It is an especially useful method for describing the key activities of this initiative, identifying the significant features, exploring the possible consequences of the activities, developing hypotheses about how and why it works, and identifying "side effects," or unintended consequences.
A range of events will be selected and scheduled across the project period with one event being observed during each quarter of the evaluation period. Events/activities will be observed in-person or by telephone and include A-CRA/ACC trainings, GAIN trainings, Grantee Meetings, coaching and implementation calls, and other AAFT activities.
We will collect data using an Observation Guide (Cloutier et al., 1987) (Attachment D) that lists the interactions, processes, and behaviors to be observed with space to record open-ended narrative data. The studied events will include multiple observers who are trained to keep detailed and concrete field notes, consisting of setting, people present, activities, and direct quotations. Observers will also record their own reactions to the experience and reflections about personal meaning and significance as well as insights, interpretations, beginning analyses, and working hypotheses about what is happening in the setting.
Field notes will be openly recorded during the activities being observed, finalized within one week, and entered and coded in ATLAS.ti within one month of the observation. These will be analyzed and reported on annually.
Secondary Data Compilation & Analysis
Purpose: Gather client-level outcomes and key fidelity data in a cost-effective manner.
Respondents: Four datasets will be used; all include data collected at the client-level that will be analyzed at the grantee-level.
▪ GPRA data, submitted by grantees to the SAIS Web site.
▪ GAIN data, submitted by grantees to Chestnut.
▪ Treatment data, submitted by grantees to Chestnut via EBTx.org.
▪ Treatment Satisfaction Index data, submitted by grantees to Chestnut.
Mode: Originally collected by grantees in face-to-face interviews; retrieved, analyzed, and reported on as secondary data.
Domains:
▪ GPRA/GAIN: psycho-social status, outcomes, General Continuing Care Adherence data.
▪ EBTx: A-CRA and ACC fidelity ratings; measures of usage of implementation supports.
▪ TxSI: early therapeutic alliance; adolescents’ satisfaction with services.
Schedule: Collection schedule is specified below. Retrieved from the dedicated FTP site quarterly, analyzed and reported on bi-annually.
▪ GPRA and GAIN: baseline, 3, 6, 12 months; also GPRA at discharge
▪ EBTx: ongoing throughout treatment
▪ TxSI: 2nd-5th session; 3, 6 and 12 months
The AAFT initiative is unique in that it provides a wealth of high-quality data. Our plan includes making the maximum possible use of these data sources, proposing new data collection only where we believe existing resources cannot be used to address the evaluation questions. We have identified four main secondary data sources: GPRA data, GAIN data, TxSI data, and EBTx data. We plan to use GPRA and GAIN data to understand the characteristics of adolescents at intake and changes in their psycho-social status over time. These datasets will provide demographic/background characteristics and standardized measures of outcomes such as substance use, risk behaviors, mental health status, and housing stability. We do not propose to analyze these data at the individual level, but rather to aggregate them to the grantee level. Aggregated baseline measures will characterize each grantee’s population served, and aggregated change scores from baseline to follow-up points will provide broad measures of individual-level change.

TxSI data, which are collected in concert with GAIN data, will provide important measures of adolescents’ satisfaction with services that, again, will be aggregated to the grantee level. Finally, the EBTx data will provide especially important measures for this evaluation, since they contain session-level fidelity ratings by Chestnut staff and indicators of the extent to which clinicians are availing themselves of the provided implementation supports.
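For illustration, a minimal sketch of this aggregation logic in Python/pandas, using hypothetical file, wave, and variable names (the actual GAIN/GPRA field names and follow-up waves will differ):

```python
import pandas as pd

# Client-level records, one row per client per wave (hypothetical extract).
gain = pd.read_csv("gain_extract.csv")

# Aggregated baseline measures characterize each grantee's population served.
baseline = gain[gain["wave"] == "intake"]
site_baseline = baseline.groupby("site_id")[["substance_freq", "mh_symptoms"]].mean()

# Baseline-to-6-month change scores, aggregated to the grantee level.
followup = gain[gain["wave"] == "6mo"]
merged = baseline.merge(followup, on=["site_id", "client_id"], suffixes=("_0", "_6"))
merged["substance_change"] = merged["substance_freq_6"] - merged["substance_freq_0"]
site_change = merged.groupby("site_id")["substance_change"].mean()
```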
Project leaders from CSAT and Chestnut Health Systems have agreed to share these data, create an FTP site for the transmission, and post the data quarterly. Retrieved from this dedicated FTP site quarterly, the data will be collection-coded (Exhibit 4) in our SharePoint-based Data Repository and imported into SPSS for aggregation and analysis (bi-annually).
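A minimal sketch of this quarterly retrieve-and-tag step follows; the FTP host, account, directory layout, and metadata fields shown are all hypothetical:

```python
import ftplib
import json
import pathlib
from datetime import date

REPO = pathlib.Path("data_repository")
REPO.mkdir(exist_ok=True)

with ftplib.FTP("ftp.example.org") as ftp:       # hypothetical host
    ftp.login("aaft_eval", "********")           # hypothetical account
    for name in ftp.nlst("quarterly"):           # e.g., GAIN/GPRA/EBTx extracts
        local = REPO / pathlib.Path(name).name
        with open(local, "wb") as fh:
            ftp.retrbinary(f"RETR {name}", fh.write)
        # "Collection code" the file: tag it with identifying metadata.
        meta = {
            "file": local.name,
            "source": "CHS/CSAT FTP site",
            "data_type": "quantitative",
            "date_retrieved": date.today().isoformat(),
            "status": "awaiting SPSS import",
        }
        (REPO / (local.name + ".meta.json")).write_text(json.dumps(meta, indent=2))
```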
Document Review
Purpose: Use existing documents, produced as part of the program, to answer evaluation questions in a highly cost-effective manner.
Document Types: Listserv information, training materials, reports, meeting/call minutes, presentations, articles, program logs, local evaluation materials, and grant applications (assessment of types is ongoing throughout the project period and may be expanded).
Domains:
▪ Grantee agency background and contextual characteristics (size, structure, service sector).
▪ Implementation supports (e.g., frequency, participants, content, response).
▪ Training process.
▪ Certification/ratings.
▪ Compliance.
Schedule: On-going—coded monthly & analyzed bi-annually.
In the course of applying for funding and developing their projects, AAFT grantees will produce documents containing a wealth of information on their sites. Similarly, in delivering the technical assistance and implementation supports that are a part of this initiative, Chestnut Health Systems and CSAT will generate a range of documents throughout the project period. Such materials include training documents (e.g., manuals, agendas, participant feedback), reports (e.g., progress reports, compliance), agendas/minutes (e.g., monthly CSAT call minutes), presentations, local evaluation materials, and grant applications (e.g., initial proposals, continuation applications). Reviewing these documents, coding their contents, and synthesizing the information they contain will provide researchers an extremely cost-effective method for describing implementation and tracking progress over time.
CSAT and Chestnut Health Systems have agreed to share the range of documents produced as part of this effort. Retrieved on an on-going basis (mostly via email; some as hard copy), each document will be “collection-coded” in the SharePoint Data Repository using a common set of elements (Exhibit 4). Each item will be tagged according to these codes and imported into ATLAS.ti for content coding, if appropriate. To ensure reliability, we will train on coding, recode a subsample of documents using different coders, cross-check the results (as sketched below), resolve ambiguities, maintain a log of coding issues encountered and their resolutions, and amend the protocols as necessary.
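One simple way to cross-check double-coded documents is percent agreement; the sketch below uses hypothetical document IDs and code assignments, and the actual reliability procedure and statistics may differ:

```python
# Hypothetical code assignments from two coders on a recoded subsample.
coder_a = {"doc01": "training", "doc02": "compliance", "doc03": "training"}
coder_b = {"doc01": "training", "doc02": "training",   "doc03": "training"}

# Percent agreement over the documents both coders rated.
shared = coder_a.keys() & coder_b.keys()
agreement = sum(coder_a[d] == coder_b[d] for d in shared) / len(shared)
print(f"Agreement on recoded subsample: {agreement:.0%}")  # 67% here
```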
III. Data Collection and Processing

As detailed above, the evaluation plan calls for collecting and analyzing a wide array of primary and secondary data. Some of the data is quantitative, such as quarterly uploads of GAIN assessments and extracts from the EBTx system, while some is qualitative, such as case study visit notes and compliance reports generated by CHS. Exhibit 3 below shows the broad architecture for how we will handle both types of data and synthesize them for analysis. The incoming data streams are shown on the left side of the figure. All incoming data, of whatever electronic form, will be “collection coded,” or tagged with a small set of identifying metadata, and stored in one central data repository.

Quantitative data will flow out of the repository through a series of SPSS programs that clean, code, and aggregate it into useful analysis datasets. Qualitative data will flow out of the repository for coding and analysis using ATLAS.ti software. Qualitative coding will result in site-level characteristics and values that will be imported and merged with aggregated participant data to create a unified site-level database. Qualitative and quantitative findings will be catalogued and merged in a single “knowledgebase,” allowing efficient querying, reporting, and examination of the full corpus of data.
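A minimal sketch of the final merge step, assuming hypothetical export file names and a common site_id key:

```python
import pandas as pd

# Site-level values derived from qualitative coding (exported from ATLAS.ti).
qual = pd.read_csv("site_qual_codes.csv")    # e.g., implementation_stage per site
# Aggregated participant outcomes (exported from the SPSS pipeline).
quant = pd.read_csv("site_outcomes.csv")

# Unified site-level database, keyed on a common site identifier.
site_db = qual.merge(quant, on="site_id", how="outer")
site_db.to_csv("unified_site_db.csv", index=False)
```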
Collection Coding and Document Management
All incoming data, of whatever type, will be catalogued and tagged with a set of fields that will allow for organized and efficient tracking of all the thousands of electronic documents we will generate and receive across the three years of the project. Furthermore, this metadata system will allow us to track the workflow associated with data so that we can instantly query, for example, how many CHS site reports need to be imported into ATLAS.ti, or whether there is a backlog of EBTx data awaiting processing in SPSS.
This system will be implemented in Microsoft SharePoint, which is ideally suited to manage large document collections in a shared work environment. All incoming data will be stored in a SharePoint “document repository.” As items are stored in the repository, they are associated with a record in a SharePoint list, which is, in effect, a simple web-based database allowing data entry, sorting, grouping, and filtering. The key fields we will use to track data are shown in Exhibit 4.
With the incoming materials classified by these metadata fields, we will be able to organize and track them and be sure that all pieces of information are available for appropriate analysis. Additional metadata fields will capture the status of each piece of information, for example the date a qualitative document was coded in ATLAS.ti or the date a batch of GAIN data was appended to the aggregated GAIN dataset in SPSS. It is important to understand that this “collection coding” is distinct from the “content coding” of qualitative materials that will be conducted using ATLAS.ti software. Content coding will feature a much richer set of codes to identify themes relevant to the evaluation questions; collection coding serves only to catalogue, track, and organize the information.
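A minimal sketch of such a tracking record and workflow query, with hypothetical field names (the actual fields are those in Exhibit 4, implemented as SharePoint list columns):

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class CollectionRecord:
    title: str
    source: str                        # e.g., "CHS", "grantee", "CSAT"
    data_type: str                     # "qualitative" or "quantitative"
    date_received: date
    date_coded: Optional[date] = None  # set once coded in ATLAS.ti

def atlas_backlog(records: List[CollectionRecord]) -> List[CollectionRecord]:
    """CHS qualitative items still awaiting import into ATLAS.ti."""
    return [r for r in records
            if r.source == "CHS"
            and r.data_type == "qualitative"
            and r.date_coded is None]
```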
IV. Limitations of Data Collection and Evaluation Design

While there are many strengths to the proposed Data Collection Plan and Evaluation Design (e.g., a strong conceptual framework, a multi-level design, a mixed-methods approach, a strategy for using the wealth of high-quality data available), it is not without its limitations.
Each data collection method proposed above has its advantages and disadvantages, outlined below. Limitations notwithstanding, we believe that the proposed data collection plan and evaluation design is strengthened by combining different kinds of data and multiple methods.
Data Collection Method: Annual Program Survey
Advantages: Relative ease and low cost of collecting data from large numbers of respondents in a short time.
Disadvantages: Respondents vary in their computer literacy; screen configurations may appear significantly different from one respondent to another.

Data Collection Method: Key Informant Interviews
Advantages: Opportunity to get insiders’ views; can provide in-depth information; allows ideas and information to be clarified on a regular basis; allows information to be obtained from many different people and viewpoints.
Disadvantages: Relationship with the informant may influence the information shared; informants give their own impressions and biases.

Data Collection Method: Case Study Visits
Advantages: Provides in-depth information on sites.
Disadvantages: Difficult to make definite cause-effect conclusions or to generalize from a single case; possible biases in data collection and interpretation.

Data Collection Method: Direct Observation
Advantages: Collects data where and when the activity is occurring; does not rely on people’s willingness or ability to provide information; allows evaluators to directly see what people do.
Disadvantages: Susceptible to observer bias and the “Hawthorne effect”; can be expensive and time-consuming compared to other methods; does not, by itself, increase understanding of why events occur.

Data Collection Method: Secondary Data Analysis
Advantages: No cost of collection; fast; breadth of data available; benefits from the expertise and professionalism of top scholars in the field and the quality of the data being collected.
Disadvantages: Analyses are limited by the data available—while the proposed datasets have a wide range of variables of interest, questions related to adolescents’/families’ perceptions of barriers to treatment before/after this project, for example, are not included; data may be incomplete (e.g., EBTx data entered by clinicians).

Data Collection Method: Document Review
Advantages: Relatively inexpensive method for collecting a range of data; good source of background information; unobtrusive; provides a behind-the-scenes look at elements that are not directly observable.
Disadvantages: Could be biased because of selective survival of information; information may be incomplete or inaccurate; can be time-consuming to collect, review, and analyze many documents.
V. Quality Assurance

The Evaluation Director and Corporate Monitors have been chosen for their technical expertise and decision-making roles. They have demonstrated the ability to monitor staff work, ensuring it is done correctly and according to contract specifications. Dr. Tobin and her Team Leads will ensure project quality and adherence to the request for task order proposal’s (RFTOP’s) Quality Assurance Surveillance Plan. The project will be required to meet all AHP policies and procedures specific to data management, confidentiality, and project tracking. More specifically, our quality assurance procedures, as they relate to the evaluation design and data collection methods, include the following:
Annual Program Survey: the Web-based system is User ID/password-protected and has features that reduce response time and errors (e.g., coded skip patterns, prohibition of out-of-range values, defined and coded missing values); a sketch of these checks follows this list.
Key Informant Interviews: interviewers will participate in a training that will include a description of the AAFT Evaluation and the purpose of the key informant interviews, a review of interviewer responsibilities and skills, and specific issues such as establishing rapport, standardization, neutrality, confidentiality, and informed consent.
Case Study Visits: case study visitors will participate in a training that outlines key details about AAFT, the Evaluation, and the case study visits; preparing for the visits, using the case study protocol, and obtaining informed consent.
Direct Observation: multiple observers/coders will be used when possible; observers will be trained on taking field notes and coding them in ATLAS.ti; field notes will be completed and submitted within one week of each observed event.
Team meetings will include status updates and discussions about interviewing, observation, document review and related difficulties.
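For illustration, a minimal sketch of the survey checks noted in the first item above (skip patterns, range limits, missing-value codes), with hypothetical item names and response ranges:

```python
# Hypothetical missing-value codes defined for all items.
MISSING_CODES = {-9: "refused", -8: "don't know"}

def validate_response(item: str, value: int, answers: dict) -> list:
    """Return a list of error messages for one submitted answer."""
    errors = []
    # Range check: hypothetical 1-5 Likert item.
    if item == "readiness_1" and value not in range(1, 6) and value not in MISSING_CODES:
        errors.append("readiness_1: value out of range")
    # Skip pattern: follow-up applies only if the screener item was 'yes' (1).
    if item == "adaptation_detail" and answers.get("adapted_model") != 1:
        errors.append("adaptation_detail: item should have been skipped")
    return errors
```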
As part of our quality assurance process, we document all aspects of project operations in our project tracking tools maintained on the SharePoint server.
VI. Data Confidentiality and Protection of Human Subjects

While much of this project is likely to be exempt, it is still necessary to apply to an Institutional Review Board (IRB) for that determination and to obtain permission to proceed with data collection using appropriate procedures and forms for voluntary consent. AHP has an IRB, established under Office of Human Research Protections regulations, that will be responsible for the relevant human subjects review of this project’s proposed methods, procedures, and protocols.
Training and ongoing supervision of the evaluation staff will cover data collection and storage procedures, with attention paid not only to accuracy and completeness in data collection but also to methods used to protect evaluation participants’ safety, privacy, and confidentiality. These procedures include:
All evaluation records will be stored in locked file cabinets.
All members of the evaluation team have completed the federal training on confidentiality and protection of human subjects and each will sign an AHP document pledging to protect the confidentiality of evaluation participants. They will not disclose either the identities or identifying characteristics of individual interviewees or any information disclosed during the interview, except to the supervising Evaluation Director.
All individual names or other identifying information will be removed from the interview and other records, and a single identifying code will be substituted (a sketch of this scheme follows this list). A separate file linking this code, the individual name, and other identifying information will be maintained by the Evaluation Director. At the conclusion of the study, all copies of this file will be destroyed.
All physical records including the list of subjects’ names and code numbers, as well as the consent documents will be stored in a locked file with access restricted to the Evaluation Director.
Once the records are in electronic form, the paper copies will be destroyed, unless there are parts that are not entered into an automated file. Physical and electronic copies of interview data (including audio tapes) will be destroyed upon completion of the study.
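A minimal sketch of the identifying-code substitution described above, with a hypothetical code format and file name; actual storage would follow the restricted-access procedures listed here:

```python
import csv
import secrets

def assign_code(name: str, link_file: str = "link_file.csv") -> str:
    """Substitute a random code for a participant name, recording the
    pairing in a separate linking file (access restricted to the
    Evaluation Director; destroyed at the end of the study)."""
    code = f"R-{secrets.token_hex(3)}"            # e.g., R-a1b2c3
    with open(link_file, "a", newline="") as fh:
        csv.writer(fh).writerow([code, name])
    return code
```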
Primary data collection participants will be asked to sign an informed consent form that stresses that participation is voluntary, that they do not have to respond to any questions they do not want to, and that answers will be reported in such a way that individuals cannot be identified. More specifically, the Informed Consent Forms (Attachments B2 and C2) include the following kinds of information:
Goal of the evaluation and purpose of participation;
Type of participation (e.g., length of interview, types of questions);
Explanation of participants’ rights;
Risks and benefits of participation; and,
Confidentiality protections.
Exhibit 1. Evaluation Goals and Selected Questions by Data Collection Method

Data collection methods: Program Survey, Key Informant Interviews, Site Visits, and Observation (primary data); Secondary Data Compilation & Analysis and Document Review (secondary data). Each goal below is addressed by one or more of these methods.

Process Evaluation Goals

Goal 1: Describe the process of implementing AAFT. What activities are grantee agencies undertaking to install A-CRA/ACC/GAIN? What is agency readiness level at startup? What are grantee leadership attitudes/expectations toward implementation?

Goal 2: Describe implementation supports from CHS. What is the range of supports provided? Do grantees vary in how they use supports?

Goal 3: Identify barriers to implementation and strategies. What organizational strategies are being used to overcome them?

Goal 4: Describe modifications to sites’ plans. How developed are grantees’ implementation plans? How have plans changed? Were changes internally/externally driven?

Outcome Evaluation Goals

Goal 5: Document changes in agencies’ structure and processes. To what extent have agencies institutionalized components of AAFT? What is the level of fidelity of AAFT components, and how has it changed? To what extent are sites using data to inform practice, and how has this changed?

Goal 6: Determine impact of intervention supports on agencies. How do grantees view supports provided by CHS? Do grantees that more fully engage with supports achieve better implementation?

Goal 7: Determine impact of local implementation approaches on agencies. Are certain implementation activities associated with better implementation? What would grantees recommend to others undertaking similar work? What were the barriers to treatment for adolescents/families, and how did they change?

Goal 8: Explore relationships between implementation and client-level outcomes. Do sites that successfully implement and use data have better client outcomes? Does improved implementation move in concert with client outcomes?
Exhibit 2. Data Collection Methods

Primary data collection

Annual Program Survey
Respondents: PIs/Program Directors, Clinical Supervisors/Clinicians, and Evaluators/Data Managers at all 14 sites; N=56.
Mode of Collection: Web (also offered as Word form or paper).
Domains/Measures/Data Sources: Background/experience; implementation of AAFT components; adaptations; use of implementation supports; readiness and changes in practice; barriers; use of data; sustainability planning.
Schedule—Collection/Analysis: Administered in Oct/Nov of each project year; analyzed/reported on within 3 months of submission deadline.

Key Informant Interviews
Respondents: TA project leadership and providers; N=13.
Mode of Collection: Telephone.
Domains/Measures/Data Sources: Background/experience; core TA activities and supports; site characteristics and indicators of successful implementation; sustainability and implementing EBPs.
Schedule—Collection/Analysis: Administered March-April of each project year; coded/summarized within 2 weeks of each interview; analyzed/reported on within 2 months of interview deadline.

Case Study Visits
Respondents: Grantee agency staff, AAFT clients/families, and representatives from the community/other agencies.
Mode of Collection: In-person.
Domains/Measures/Data Sources: Agency characteristics and context; impact of the AAFT grant; supports/challenges; intervention and implementation activities; reactions to implementation supports; local evaluation and data use; sustainability.
Schedule—Collection/Analysis: Once, in a 4-month window at end of 2nd project year/start of 3rd; analyzed and reported on within 2 months of each visit.

Direct Observation
Respondents: Not applicable.
Mode of Collection: Combination of in-person and telephone.
Domains/Measures/Data Sources: Participant characteristics; interactions; nonverbal behavior; leaders/presenters; physical surroundings; products; reactions/feedback.
Schedule—Collection/Analysis: Quarterly, with one key component observed each quarter; coded within one month of each observation; analyzed and reported on annually.

Secondary data collection (Respondents shown as Collection-level/Analysis-level)

GPRA
Respondents: Client/Site.
Domains/Measures/Data Sources: Psycho-social status and outcomes.
Schedule—Collection/Analysis: Intake, 3, 6, 12 months (plus discharge)—retrieved quarterly; analyzed bi-annually.

GAIN
Respondents: Client/Site.
Domains/Measures/Data Sources: Psycho-social status, outcomes, General Continuing Care Adherence data.
Schedule—Collection/Analysis: Intake, 3, 6, 12 months—retrieved quarterly; analyzed bi-annually.

EBTx
Respondents: Clinician/Site.
Domains/Measures/Data Sources: A-CRA and ACC fidelity ratings; usage of implementation supports.
Schedule—Collection/Analysis: On-going—analyzed bi-annually.

TxSI
Respondents: Client/Site.
Domains/Measures/Data Sources: Early therapeutic alliance; adolescents’ satisfaction with services.
Schedule—Collection/Analysis: 2nd-5th session; 3, 6, and 12 months—retrieved quarterly; analyzed bi-annually.

Document Review
Respondents: Not applicable.
Domains/Measures/Data Sources: Agency background and context; implementation supports; training process; certification/ratings; compliance.
Schedule—Collection/Analysis: On-going—coded monthly and analyzed bi-annually.
Exhibit 3. Data Processing and Analysis Architecture (figure).