Office on Women’s Health
Evaluation Methodology
National Community Centers of Excellence in Women’s Health Program
National Evaluation: Round II
Office on Women’s Health Program Support Center
Contract No. HHS P23320045008XI, Task Order 020388
August 30, 2006
This report is confidential and intended solely for the use and information of the organization to which it is addressed.
TABLE OF CONTENTS
1.1 Purpose and Organization of Document
1.2 Background of the CCOE Program
1.3 Background of the CCOE Program Evaluation Initiative
2.3 Roles and Responsibilities
Section 3—Evaluation Methods
3.1 Data Collection Approach
3.3 Phase I: CCOE Director and Program Coordinator Survey
3.4 Phase II: Partner Survey
3.5 Phase III: CCOE Site Visits
3.6 Phase IV: CCOE Client Survey
3.7 Phase V: Final Report and Documentation Manual
Tables
Table 1.1 Relationship Between CCOE Goals and Core Components
Table 2.1 CCOE Program Evaluation Research Questions
Table 2.2 Roles and Responsibilities Matrix
Table A: Research Questions/Sub-Questions, Data Needed, and Data Source
Figures
1.1 Purpose and Organization of Document
The purpose of this document is to present the evaluation methodology for the Office on Women’s Health (OWH) National Community Centers of Excellence in Women’s Health (CCOE) Program National Evaluation: Round II. The National CCOE Evaluation: Round II mirrors the evaluation processes and utilizes the same data collection instruments developed by Booz Allen in conjunction with OWH for the initial round of the National CCOE Evaluation (Round I), which was conducted in 2003. The evaluation methodology provides a comprehensive scheme for evaluating the CCOE program against the eight CCOE program goals established at the program’s inception and for assessing CCOE grantee progress towards those goals since the baseline established during Round I.
The evaluation methodology takes into account Federal Program Assessment Rating Tool (PART) requirements and OWH strategic planning data collection needs, where both of these activities fit within the scope of the evaluation of the CCOE program as described in this document.
PART assessments focus on program effectiveness and on outcome or results-based metrics. For this evaluation, metrics will be collected that will provide evidence of the CCOE program’s ability to meet its program goals and achieve desired program outcomes, information which will directly inform PART requirements.
The OWH’s current strategic planning initiative includes proposed performance metrics against which OWH will be monitoring progress. Some of these performance metrics (e.g., women who are aware of the early warning signs and symptoms of a heart attack and the importance of accessing rapid emergency care by dialing 911) can be informed by data produced by this evaluation effort, and where possible, this information will be provided to OWH to support their strategic planning efforts.
Section 1 of this document provides the background and purpose of the CCOE program and describes the goals and objectives of the National Evaluation: Round II. Section 2 provides an overview of the conceptual evaluation models that form the basis of this evaluation, the evaluation criteria and research questions to be addressed, and the roles and responsibilities of the evaluation stakeholders. Section 3 focuses on the evaluation methodology, including the approach for conducting the Round II evaluation, a detailed discussion of each of the data collection phases, and a discussion of the ethical considerations that will be taken into account throughout the evaluation. Section 4 highlights the timeframe for completion of the Round II evaluation.
1.2 Background of the CCOE Program
The CCOE program was established by the Department of Health and Human Services (DHHS) OWH in September 2000 as part of a national effort to eliminate health disparities due to age, gender, race/ethnicity, education, income, disability, rural location, or sexual orientation—one of the national health goals set forth in Healthy People 2010. Modeled after OWH’s National Centers of Excellence (CoE) in Women’s Health Program—a program to establish a model health care system to improve the health status of women across their life span and housed at academic centers across the country—the overarching goal of the CCOE program is to develop an integrated, innovative, community-based, interdisciplinary, and comprehensive healthcare delivery system that extends quality healthcare services to women of all ages and racial/ethnic groups. Specifically, the CCOE program seeks to integrate existing community health resources that support and impact women's health in six core component areas:
Comprehensive and integrated women’s health care services delivery,
Training for lay, allied health, and professional health care providers,
Community-based research,
Public education and outreach,
Leadership development for women as health care consumers and providers, and
Technical assistance (TA) to support the replication of successful models and strategies that coordinate and integrate women’s health services and improve health outcomes for underserved women.
The intent of the technical assistance component is to provide guidance to help other communities strengthen their efforts to improve women's health.
In the first year of the program, OWH selected three organizations as CCOEs—Mariposa Community Health Center in Nogales, Arizona; Northeast Missouri Health Council, Inc. in Kirksville, Missouri; and, St. Barnabas Hospital and Healthcare System in New York City (Bronx), New York. In 2001, the second year of the program, OWH selected four more organizations as CCOEs: Northeastern Vermont Area Health Education Center in St. Johnsbury, Vermont; NorthEast Ohio Neighborhood Health Services, Inc. in Cleveland, Ohio; Hennepin County Department of Primary Care in Minneapolis, Minnesota; and, Women’s Health Services in Santa Fe, New Mexico. OWH designated five organizations as CCOEs during Year 3 of the program: Christiana Care Health System in Wilmington, Delaware; Griffin Health Services Corporation in Derby, Connecticut; Jefferson Health System in Birmingham, Alabama; Kokua Kalihi Valley Comprehensive Family Services in Honolulu, Hawaii; and, Morton Plant Mease Health Care in Clearwater, Florida. After the fourth announcement of the program, OWH designated two organizations as CCOEs: Great Plains of Greeley County in Tribune, Kansas; and, Oakhurst Medical Centers in Stone Mountain, Georgia. A total of 14 CCOE programs have been funded to date. The CCOEs are awarded $150,000 annually over a project period not to exceed five years.
OWH created the National Centers of Excellence in Women’s Health (CoE)/National Community Centers of Excellence in Women’s Health (CCOE) – Ambassadors for Change (AFC) Program for the pioneer CCOEs that “graduated” from the CCOE program (i.e., completed their five years as a CCOE). The primary purpose of the Ambassadors for Change Program is to serve as a mechanism for these organizations to retain their designations and continue offering services using the “one-stop shopping” CCOE model for women’s health care services that the CCOEs have implemented. In addition, the CoE and CCOE – AFC programs provide technical assistance to organizations interested in establishing women’s health programs.
The CCOE – AFC programs currently include St. Barnabas Hospital and Healthcare System, Northeast Missouri Health Council, Mariposa Community Health Center, Hennepin County Department of Primary Care, NorthEast Ohio Neighborhood Health Services, Northeastern Vermont Area Health Education Center, and Women’s Health Services. These seven organizations receive less funding for their AFC designation and participation in the AFC program than the CCOEs, and accordingly fewer requirements are placed on them. The AFC responsibilities include having a clinical care center that has:
a women’s health clinical intake form,
a referral and tracking system,
procedures for identifying and counting the women served by the program,
procedures for tracking the cost of services provided to women who receive interdisciplinary care through the program, and
procedures for differentiating the services provided to women counted as CoE or CCOE patients from those provided to other patients.
In addition to fulfilling these requirements, the CCOE – AFCs also provide services in many of the CCOE core component areas. Due to their reduced funding and the resulting budgetary constraints, some of the services they were providing prior to their change in status may have been eliminated or may be performed at a reduced level of effort. This is a study limitation and is further explained in Section 3.9: Evaluation Limitations.
In 2001, OWH contracted with Booz Allen Hamilton to develop a comprehensive evaluation plan, methodology, and data collection instruments to use in conducting an evaluation that would measure the impact of the CCOE program on the delivery of healthcare to women in the CCOE communities. The primary purpose of the CCOE program evaluation was to assess to what extent the CCOE program was meeting the eight goals1 initially set forth by OWH when the CCOE program was first developed. The list below describes these eight initial program goals. Since the program’s inception, these goals have evolved and the language has shifted so that Goal Two and Goal Seven have been consolidated into the remaining six goals; however, the focus areas for the program as a whole remain the same.
Reduce the fragmentation of services and access barriers that women encounter using a framework that coordinates and integrates comprehensive health services with research, training, education, and leadership activities in the community to advance women’s health
Create healthier communities with a more integrated and coordinated women’s health delivery system targeted to underserved women2
Empower underserved women as health care consumers and decision-makers
Increase the women’s health knowledge base using community-based research that involves the community in identifying research areas that address the health needs of, and respond to issues of concern to, underserved women
Increase the number of health professionals trained to work with underserved communities and increase their leadership and advocacy skills
Increase the number of young women who pursue health careers and also increase the leadership skills and opportunities for women in the community
Spread the successes, through technical assistance, of model women’s health program strategies and new innovations to communities across the country that may be interested in replicating the model3
Eliminate health disparities for women who are underserved due to age, gender, race/ethnicity, education, income, disability, living in rural localities, or sexual orientation.
The CCOE program is segmented into six core components that serve as the means for accomplishing the eight program goals. These six core components include:
Comprehensive and integrated women’s health care services delivery
Training for lay and professional health care providers
Community-based research
Public education and outreach
Leadership development for women as health care consumers and providers
Technical assistance to support the replication of successful models and strategies.
The six core components were “linked” to each of the eight program goals and served as a basis to measure the effectiveness of the CCOE program during Round I of the National Evaluation. Data were collected around each of these core components as a means to measure progress towards goals. Particular emphasis was placed on assessing the level of integration among the six components and within the health care services delivery component. Integrating community health resources is a cornerstone of the philosophy behind the creation of the CCOE program. The initial round of the National Evaluation also measured the extent to which other overarching program requirements were fulfilled, such as the requirement of having a physical clinical care center or facility.
Table 1.1 depicts the relationship between the goals and the components of the program. One or several of the core components of the CCOE program address each goal. Accordingly, by focusing evaluation efforts on the program components, Round I of the National Evaluation, which Booz Allen conducted in the Fall of 2003, assessed how well the CCOE program was meeting its goals.
Table 1.1 Relationship Between Original CCOE Goals and Core Components
CCOE Goals | CCOE Components
Reduce fragmentation and barriers, integrate comprehensive health services with other key components |
Create healthier communities |
Empower women as health care consumers and decision-makers |
Increase women’s health knowledge base through community-based research |
Train health professionals and increase their leadership and advocacy skills |
Increase health care career selection and leadership skills for women |
Spread success through technical assistance |
Eliminate health disparities for underserved women |
The National Evaluation: Round II will focus on assessing how well the CCOE program is meeting its goals relative to the baseline established during Round I. The intent of the initial evaluation design was to allow OWH to collect data at later points in time so that program outcomes and progress towards goals could be assessed longitudinally. By using the same methodology and similar data collection instruments as the previous evaluation, Round II will enable OWH to gauge the success and progress of the CCOEs.
The CCOE National Evaluation: Round II will utilize a quasi-experimental design, the same approach utilized during Round I. This design will include a time series component, which means that results will be assessed over time (i.e., Round I vs. Round II data). This method enables CCOE program implementation and outcome trends and patterns over time to be identified (e.g., increased integration, increased program participation, improved referrals, etc.). Round II will provide each CCOE with a current view of their performance towards program goals relative to their baseline status.
Round I of the National Evaluation measured the extent to which the CCOE program goals had been achieved by evaluating the extent to which the six core CCOE program components had been accomplished.
These components also provided a mechanism to measure how well the CCOE program was integrating existing community health resources that supported and affected women's health, a fundamental element of the CCOE model. Research questions were developed around each of the six core components. These research questions are included in Table 2.1.
Table 2.1 CCOE Program Evaluation Research Questions
Six Core Components | (Core Component) Research Question
Integrated Delivery of Women’s Health Care Services | Has the CCOE program improved comprehensive health service delivery within the targeted communities?
Training for Lay, Allied Health and Professional Health Care Providers | How has the CCOE impacted the training of lay, allied health and professional health care providers within the targeted community?
Community-Based Research | What is the impact of the CCOE program on community-based research?
Public Education and Outreach | What is the impact of the CCOE program on public education and outreach?
Leadership Development for Women as Health Care Consumers/Providers | What is the impact of the CCOE program on leadership development among women?
Technical Assistance/Replication of CCOE Model | Has the CCOE program replicated successful models and strategies?
Program Requirements4 | Overarching Program Requirements
Next, research subquestions were developed to define how each core component would be measured. The research subquestions were developed based on the requirements associated with each core component and areas of interest to OWH, such as integration of the core components. For example, the level of integration among the core components was measured by developing a research subquestion for each core component.
Figure 2.1 CCOE Evaluation Framework
A scoring process was used to assign a numerical score to each of the core component areas and research questions. These scores provide a way to quantify the extent to which the CCOE program is fulfilling each of the core components and, thus, program goals. In addition to calculating the quantitative scores, the evaluation team used a visually coded scheme to make results easy to interpret. Each core component received a rating of three, two, or one circle. Each of these ratings is associated with a score range with a total possible score of 100 points. The following guidelines are used to interpret the core component scores:
A ●●● rating indicates that the CCOE exceeded the goals or requirements for the core component. This rating corresponds to a score of 75 to 100.
A ●● rating indicates that the CCOE met the goals or requirements for the core component. This rating corresponds to a score of 51 to 74.
A ● rating indicates the CCOE only partially met or did not meet goals or requirements for the core component. This rating corresponds to a score of 50 or below.
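The mapping from component scores to circle ratings is simple arithmetic, and the sketch below illustrates it. This is a minimal illustration only, assuming the score thresholds listed above; the function name and structure are not part of the evaluation tooling.

# Illustrative sketch only: maps a core-component score (0-100) to the
# circle rating described above. The thresholds (75 and above, 51-74,
# 50 and below) come from the rating guidelines; everything else is an
# assumption made for illustration.

def circle_rating(score: float) -> str:
    """Return the visual rating for a core-component score out of 100."""
    if not 0 <= score <= 100:
        raise ValueError("Component scores range from 0 to 100.")
    if score >= 75:
        return "●●●"   # exceeded goals or requirements
    if score >= 51:
        return "●●"    # met goals or requirements
    return "●"         # partially met or did not meet goals or requirements


if __name__ == "__main__":
    for s in (82, 60, 43):
        print(s, circle_rating(s))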
The CCOE National Evaluation: Round II will utilize the same analytic framework and scoring methodology to assess the extent to which the CCOE program has progressed towards program goals, and improved outcomes have been achieved, relative to the baseline established during Round I. As with Round I, particular emphasis will be placed upon assessing the integration among the six components, as well as assessing the integration of health care services.
As mentioned in Section 1.2, the CCOE – AFCs may not be providing services in each of the core component areas due to their reduced funding and requirements. They will be assessed only in the core component areas in which they are providing services. Their scores in each of those component areas will be based on the same scoring mechanism described above.
Round I of the National Evaluation explored the option of weighting the scoring for the components based on their importance or on the emphasis OWH directed the CCOEs to place on their development. The Booz Allen Evaluation Team has decided against incorporating a weighting scheme for Round II. The rationale behind this decision is twofold:
The Evaluation Team wants to ensure that the data collected is comparable and is given the same weight (i.e., equal across all of the components) as was used for Round I.
The program guidance for the CCOE program has evolved over the years, sometimes emphasizing certain components over others, or changing the language around requirements and focus of the program components. Because this evolution has occurred over a number of years, and impacted each CCOE’s development and growth, attributing a weighting scheme now may give an unfair portrayal of some CCOEs over others, and may skew the assessment of progress over time in each component area.
In some cases, a research sub-question, such as whether the CCOE established a clinical care center or not, will have been fully addressed during Round I. Round II will validate these findings and utilize them in scoring the progress of the CCOEs towards goals to ensure that the CCOE scores from Round II can be compared against Round I scores.
Results from Round I will be used as a baseline and compared to Round II data. This comparison will enable OWH and the CCOEs to see growth towards program goals and the impact of the CCOE program designation and funding on program-related outcomes over time, relative to the baseline data collection of 2003. For the CCOE – AFCs, comparisons will only be made for the core components areas in which they are currently providing services. Round II will also enable OWH to collect baseline data for the two new CCOEs (Great Plains of Greeley County and Oakhurst Medical Centers), and will validate findings from Round I of the National Evaluation.
Each stakeholder involved in the CCOE National Evaluation Round II process has an important role. Defining these roles and the associated responsibilities ensures all stakeholders are aware of their obligations and commitments for making Round II a success.
A list of the key CCOE evaluation stakeholders and their roles and responsibilities for Round II is included in Table 2.2. These stakeholders will be referenced throughout Section 3— Evaluation Methods.
Table 2.2 Roles and Responsibilities Matrix
Stakeholder | Roles & Responsibilities
Office on Women’s Health |
CCOE Directors and Staff |
CCOE Community Partners |
CCOE Clients |
Evaluation Team |
Booz Allen is committed to providing high-quality, value-added service to its clients. Quality checks, such as spell checking and formatting, will be completed during each phase of the evaluation. As part of this commitment, the project director for the evaluation team will serve as the quality assurance (QA) advisor, inspecting all project deliverables to ensure that they meet both the client’s and Booz Allen’s quality standards. Our internal subject matter experts, including health care specialists, operations researchers, and program evaluation experts, will provide technical advice and quality reviews of work activities and products. Additionally, maintaining an active relationship with the OWH and the CCOEs will help identify potential problems that could affect the quality of the evaluation results.
The methodology for Round II is divided into five phases that include four main data collection initiatives and the final report development phase. This approach follows the same process that was undertaken for Round I of the National Evaluation. A different CCOE stakeholder group will provide information during each of the data collection phases. Each phase is essential to providing a comprehensive and accurate snapshot of CCOE activities and CCOE progress in meeting programmatic goals and demonstrating growth over time. In addition to the five phases, data gathered from the OWH Quarterly Progress Reports will be reviewed and integrated into the final evaluation results. A detailed description of each of the phases is included in Sections 3.2 through 3.6.
Figure 3.1 CCOE High-Level Evaluation Methodology
As the first data collection phase, Phase I consists of gathering updated descriptive data on the CCOE structure, partners, operations, and core activities directly from each CCOE. Data collected in this phase will be used to develop a basic understanding of each CCOE that the remaining phases will build upon. It will also be used to select respondents and determine sample sizes for the subsequent data collection efforts.
Phase II focuses on gathering information from partner organizations to update the comprehensive picture of the CCOE structure and activities. Surveying partner organizations will allow OWH to gain a better understanding of the community resources that are currently linked together to create the CCOE’s network of services. This data will also assist in understanding whether services and activities have become better integrated within each CCOE. The CCOE partners will provide additional information on the utilization of services and activities they provide on behalf of the CCOE, so that this information can also be compared against the baseline established in Round I.
Phase III consists of site visits to each of the fourteen CCOEs. Site visits will allow an independent and objective assessment of the CCOEs that will provide additional detail and a thorough appraisal of the CCOEs. The Booz Allen Evaluation Team will obtain copies of distributed literature, visit and interview a sample of partners, and observe day-to-day CCOE activities. Information gathered during the site visits will allow for further refinement, validation, and understanding of previously gathered survey data, and will be compared against the baseline established in Round I.
Phase IV focuses on gathering feedback from CCOE clients regarding CCOE services and activities. Data gathered from the client survey will provide critical information on clients’ perspectives on the benefit and accessibility of services and activities offered through the CCOEs, information that can be assessed against Round I data to show client-centric improvements.
Finally, the evaluation culminates in Phase V, when the findings from the data collection phases along with information from the CCOE Quarterly Progress Reports will be assimilated into a comprehensive report that will be presented to OWH by the Booz Allen Evaluation Team.
Each phase of the evaluation is described in further detail in Sections 3.3 through 3.7.
Any data collection instruments that will be administered to more than nine non-governmental employees require approval from the Office of Management and Budget (OMB). Federal Register notices and OMB clearance packages for each of the data collection instruments used in Phase I through Phase IV will be delivered to OWH for submittal to OMB. The Federal Register notices include a summary of the revised project and will allow public comment on the methodology and data collection instruments used. The OMB clearance package includes all forms, questionnaires, scripts, letters, and other materials that will be used in the data collection. Following a review of the OMB clearance package by OWH, the Booz Allen Team will meet with OWH to collect and discuss feedback and determine the appropriate steps for developing the final OMB clearance package.
The Round II evaluation will last six months and will begin once OMB approves the data collection instruments and full funding has been appropriated. Because Round II is a revision of the initial CCOE National Evaluation study, we anticipate that OMB approval will take less than the standard time frame of 60 to 90 days.
During Round I of the CCOE National Evaluation, the Booz Allen Evaluation Team administered an internet-based survey to each CCOE’s Director and Program Coordinator. The internet-based survey provided many benefits for the evaluation effort. Internet-based surveys are a cost-effective mechanism for gathering information. Because responses were automatically entered into an electronic database, the internet-based surveys reduced the time and level of effort needed for the Booz Allen Evaluation Team to analyze responses and produce results. Round II will utilize updated versions of the internet-based surveys from the initial study and follow the same methodology used in Round I, described below.
The Booz Allen Evaluation Team will send each CCOE an email invitation asking them to complete the survey. The email will include instructions on what data are needed and will contain a link to the Internet survey. The CCOEs will have approximately one month to gather the requested information, fill out, and submit the completed survey. Throughout this time, the Booz Allen Evaluation Team will be actively involved by providing survey-related support and monitoring survey response rates (or number of completed surveys).
For Round II, the CCOE Director and Program Coordinator survey will focus on gathering updated data on CCOE operations, structure, and partners. The data requested will be both qualitative and quantitative. Round II data will be compared to data from Round I, again to assess growth and progress towards program goals. Examples of topics to be included are:
Description of CCOE structure
Description of CCOE activities and services
Information related to the integration of women’s health care delivery
Training activities
Community-based research
Public education and outreach
Leadership development activities
List of partners, including description of type of services offered and contact information
Client and/or community demographics.
Please refer to the CCOE Director and Program Coordinator survey for more detailed information on the questions and data to be requested during this phase of the CCOE evaluation.
Once the CCOEs complete the survey, the Booz Allen Evaluation Team will analyze the survey data. Where appropriate, data gathered from the OWH Quarterly Progress Reports will be used to supplement survey data during data analysis. The data quality will be assessed first (i.e., checking for missing data, extreme outliers, and/or data that does not fit expectations). Next, key descriptive statistics and information will be compiled. Last, these statistics and data points will be assessed against information collected during the baseline from Round I. The results will be shown in trending tables and other charts or graphs that best display results. Qualitative data, such as descriptions of service offerings and activities, and changes in these activities over time, will be aggregated and displayed based on key themes. The reported findings will include aggregate CCOE results, as well as individual CCOE summary results. The data analysis and preliminary report preparation will take approximately one month to complete.
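As a rough sketch of the data-quality screening and baseline comparison steps described above, the example below flags missing values and extreme outliers and builds a simple Round I versus Round II trending table. It assumes the survey responses can be exported to a tabular structure; the column names (ccoe_id, round, clients_served) and sample values are hypothetical and are not drawn from the actual instruments.

# Illustrative sketch, not the Evaluation Team's actual tooling.
import pandas as pd

def quality_check(df: pd.DataFrame, value_col: str) -> pd.DataFrame:
    """Flag missing values and values more than 3 standard deviations from the mean."""
    out = df.copy()
    out["missing"] = out[value_col].isna()
    mean, std = out[value_col].mean(), out[value_col].std()
    out["outlier"] = (out[value_col] - mean).abs() > 3 * std
    return out

def trending_table(df: pd.DataFrame, value_col: str) -> pd.DataFrame:
    """Pivot responses into a CCOE-by-round table for baseline comparison."""
    return df.pivot_table(index="ccoe_id", columns="round", values=value_col, aggfunc="sum")

if __name__ == "__main__":
    responses = pd.DataFrame({
        "ccoe_id": ["Site A", "Site A", "Site B", "Site B"],
        "round": ["Round I", "Round II", "Round I", "Round II"],
        "clients_served": [1200, 1450, 800, None],
    })
    print(quality_check(responses, "clients_served"))
    print(trending_table(responses, "clients_served"))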
During Phase II, the Booz Allen Evaluation Team will administer a survey to the CCOE partners via electronic mail. In Phase I, the Evaluation Team will collect up-to-date contact information for each of the CCOEs’ current partners, and a description of their service offerings.
Due to the wide variety in the number and type of services offered by partners at each CCOE, the sampling strategy for the partner survey will be developed in detail after a review of Phase I data. The sampling strategy chosen will be based on the CCOEs’ current number and mix of partners to ensure that a consistent, comprehensive, and accurate picture of partner activities is developed. One potential sampling strategy consists of sampling all partners for each CCOE. This methodology was used during Round I after a review of the volume and service mix of CCOE partners. If the number or mix of partners is considerably larger than during Round I, the evaluation team will consider alternative sampling strategies. For example, the Evaluation Team could survey a stratified random sample of the partners that provide services in each of the six core CCOE component areas; a simple version of this approach is sketched below. Alternatively, another potential strategy is to sample one partner organization for each service type offered.
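A minimal sketch of the stratified sampling option mentioned above, assuming a simple in-memory partner list grouped by core component. The partner names, component labels, and one-per-component sample size are hypothetical; the actual strategy will be chosen only after the Phase I data are reviewed.

# Illustrative sketch of one possible partner-sampling strategy:
# a stratified random sample drawing a fixed number of partners
# within each core component.
import random

partners = [
    {"name": "Partner 1", "component": "Health Care Services Delivery"},
    {"name": "Partner 2", "component": "Health Care Services Delivery"},
    {"name": "Partner 3", "component": "Training"},
    {"name": "Partner 4", "component": "Public Education and Outreach"},
    {"name": "Partner 5", "component": "Community-Based Research"},
]

def stratified_sample(partner_list, per_component=1, seed=0):
    """Randomly select up to `per_component` partners within each core component."""
    random.seed(seed)
    by_component = {}
    for p in partner_list:
        by_component.setdefault(p["component"], []).append(p)
    sample = []
    for component, group in by_component.items():
        sample.extend(random.sample(group, min(per_component, len(group))))
    return sample

if __name__ == "__main__":
    for p in stratified_sample(partners):
        print(p["component"], "->", p["name"])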
As with the Phase I CCOE Director and Program Coordinator Survey, the Booz Allen Evaluation Team will email a survey link to each partner chosen to participate (based on the sampling strategy). Detailed instructions on how to fill out the survey will be included. Prior to the survey administration, each CCOE will be responsible for communicating the importance of this effort to its partners. The partners will have approximately one month to gather the requested information, fill out, and submit the completed survey. Throughout this time, the Booz Allen Evaluation Team will be available to answer questions as needed. The Booz Allen Evaluation Team will enlist CCOE leadership to be actively involved in encouraging partner participation in the evaluation and to follow up with their partners to ensure timely completion of the survey.
The partner survey will provide additional updated information on CCOE services and activities. The data requested will be both qualitative and quantitative and will include descriptions of partner activities and services, including client service utilization/participation statistics, as well as a description of communication channels and the level of integration with the CCOE and other partners, and perception of involvement in CCOE activities. Please refer to the CCOE Partner Survey for more detailed information on the questions and data to be requested during this phase of the CCOE evaluation.
The Booz Allen Evaluation Team will begin data analysis when all surveys are received or the pre-determined cut-off date is reached, whichever event occurs first. The data quality will first be assessed (i.e., checking for missing data, extreme outliers, and/or data that does not fit expectations). Next, key descriptive statistics and information will be compiled. Last, these statistics and data points will be assessed against information collected during the baseline from Round I. Where appropriate, results will be shown in trending tables (for baseline and current data) and other charts or graphs. Qualitative data, such as descriptions of partner services and activities, will be reviewed and aggregated based on key themes. If necessary, qualitative analysis software will be used. The reported findings will include: aggregate partner results, results by service type and/or by volume of CCOE clients (as appropriate and useful for understanding CCOE development), progress towards goals, and outcomes. The data analysis, interpretation, and preliminary report preparation will take approximately three weeks to one month to complete.
A site visit will be conducted at each CCOE. The Booz Allen Evaluation team will provide an independent and objective assessment of the CCOEs during the site visits. Information gathered during the site visits will offer valuable insight into the day-to-day operations of each CCOE and growth of the organizations since the Round I data collection.
Each site visit will take approximately two to three days depending on the location of the CCOE and its size. The Booz Allen Evaluation Team will send a two-person team to conduct each site visit. This ensures that there is always one individual available to record information during all interviews, discussions, and demonstrations that take place during the site visit. One of the team members conducting the site visit will be a certified clinician, public health expert, or other subject matter expert (SME) in the medical and/or public health fields. This expertise will allow Booz Allen to gain valuable insights into CCOE operations and bring a clinical or public health perspective to the analysis. Additionally, utilizing an expert in data collection and interviewing will ensure that all topics of interest are addressed thoroughly during the site visit.
A site interview protocol will be used to conduct the site visits. Booz Allen Evaluation Team members will interview staff and gather documentation to gain clarification, validate, and obtain more information on data gathered from the CCOE Director and Program Coordinator and CCOE partner surveys as needed. They will obtain copies of distributed literature, visit and interview a sample of CCOE partners, attend and/or participate in any CCOE activities taking place, and gather informal feedback from clients. Key items that the Booz Allen Evaluation Team will review during the site visit include:
Record-keeping systems
Information technology infrastructure
Accessibility of CCOE facilities
Staff perspectives on quality and type of services offered, success factors, barriers, and key lessons learned
Changes and improvements over time.
Because most data gathered during site visits will be qualitative, they will be analyzed using a content theme analysis: similar themes identified across the CCOEs will be aggregated. The data analysis and preliminary report preparation will take approximately three weeks to one month to complete. Please refer to the Site Visit Interview Protocol for more detailed information on the questions and data to be requested during this phase of the CCOE evaluation.
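As a rough illustration of the content theme analysis described above, the sketch below counts how often coded themes recur across CCOE site-visit notes. The theme labels and note structure are hypothetical; the actual coding will be performed by the Evaluation Team, with qualitative analysis software if necessary.

# Illustrative sketch only: aggregates hypothetical theme codes assigned
# to site-visit notes so that themes recurring across CCOEs stand out.
from collections import Counter

# Hypothetical coded excerpts: (CCOE, theme assigned by a reviewer).
coded_notes = [
    ("Site A", "strong partner referral network"),
    ("Site A", "limited IT infrastructure"),
    ("Site B", "strong partner referral network"),
    ("Site B", "bilingual outreach materials"),
    ("Site C", "strong partner referral network"),
    ("Site C", "limited IT infrastructure"),
]

theme_counts = Counter(theme for _, theme in coded_notes)
sites_per_theme = {
    theme: sorted({site for site, t in coded_notes if t == theme})
    for theme in theme_counts
}

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} mentions across {sites_per_theme[theme]}")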
The CCOE evaluation includes a client survey that will be administered in person to CCOE clients at each CCOE’s clinical care facility. Only clients who have used a CCOE service will be invited to participate, and each client may take the survey only once. A trained staff member at each CCOE will administer the client survey over a time frame that may last up to five to six months, depending on the sampling scheme developed for each site. The Booz Allen Evaluation Team will work with OWH and CCOE leadership to identify one individual per CCOE to train as the client survey administrator. The individual should work in the CCOE’s clinical care facility and should be fluent in any primary languages other than English spoken within the CCOE community. Client surveys will be provided in English and potentially in Spanish. Offering a translated survey will help increase the representation of Spanish-speaking clients’ perspectives in the evaluation results.
Collecting data in person or on site is an optimal data collection method when gathering feedback from transient or low-income populations. This is because response rates are maximized compared with other data collection methods. Despite this benefit, there is a drawback in that respondents may be prone to respond in a certain way based on their interaction with the survey administrator or the environment in which they are taking the survey. Providing training to the survey administrator will minimize this effect. The trained survey administrator will guide the client through the questionnaire and answer any questions as needed. This will be particularly useful for clients with low-literacy levels and in communities where languages other than English are commonly used. This methodology was effective during Round I of the National Evaluation, and will be utilized for Round II as well.
The survey administrator training will occur at the beginning of the evaluation to ensure that client surveying can begin as early as possible and continue for up to five to six months. The Booz Allen Evaluation Team will hold a one-hour conference call training session with the Survey Administrator at each CCOE. The training will provide instruction on:
Selecting a sample of clients to participate in the survey
Administering the survey, typical questions asked, and ensuring confidentiality
Maintaining records of survey respondents
Storing (in a secure locked location) and sending completed surveys to the Booz Allen Evaluation Team
The Booz Allen Evaluation Team will work with the CCOE and designated staff member to develop a random sampling strategy for selecting clients based on the CCOE’s estimated client volume (e.g., sample two clients twice a week for six months). Initial conversations to discuss how CCOE clients are tracked and the relationship that clients have with the CCOE will begin prior to the start of Round II. A random sampling strategy will then be finalized in the month prior to the start of the evaluation. The sampling methodology used at each CCOE will vary due to fluctuations in client volume and types of services used. Approximately 400 client survey responses per CCOE are recommended to analyze client findings and be able to make significant conclusions for each CCOE. However, aggregate findings for the CCOE program as a whole can be based on a smaller cohort if 400 clients per CCOE cannot be surveyed within the estimated five- to six-month maximum time frame.
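To illustrate the arithmetic behind a per-site sampling plan of the kind described above, the sketch below derives a weekly sampling rate from the recommended target of roughly 400 responses and a five- to six-month window. The assumed 24-week window and 80 percent expected response rate are hypothetical planning figures, not program requirements.

# Illustrative sketch only: back-of-the-envelope sampling plan for one site.

def clients_per_week(target_responses=400, weeks=24, expected_response_rate=0.8):
    """Number of clients to approach each week to reach the response target."""
    needed = target_responses / expected_response_rate
    return int(round(needed / weeks))

def feasible(weekly_client_volume, target_responses=400, weeks=24, expected_response_rate=0.8):
    """Can a site reach the target within the window given its client volume?"""
    return weekly_client_volume >= clients_per_week(target_responses, weeks, expected_response_rate)

if __name__ == "__main__":
    rate = clients_per_week()
    print(f"Approach roughly {rate} clients per week.")
    print("Feasible for a site seeing 30 clients/week:", feasible(30))
    print("Feasible for a site seeing 15 clients/week:", feasible(15))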
The client survey will gather client feedback on their experiences with CCOE services and activities. This information will be compared to the baseline responses gathered in the initial evaluation. The survey is designed with an appropriate reading level and will take approximately fifteen minutes to complete. The survey includes mostly quantitative data with one to two open-ended (qualitative) questions. The data will help OWH and the CCOE Directors better understand current client perceptions of CCOE services provided. It will also provide the OWH with key data on the impact of the CCOE program in the community. Please refer to the CCOE Client Survey for more detailed information on the questions and data to be requested during this phase of the CCOE evaluation.
The Booz Allen Evaluation Team will maintain regular contact with the survey administrators at each CCOE to ensure that any questions and/or problems are addressed appropriately and in a timely manner, and to monitor survey response rates at each CCOE. The survey administrator will send completed client surveys to the Booz Allen Evaluation Team on a monthly basis so that the surveys can be reviewed for proper completion and the administration technique/guidance can be adjusted as necessary. A data entry subcontractor will enter the data into a database.
An analysis of the Round II survey data will begin after data collection is complete. Survey results will be analyzed using the statistical software SAS and compared to Round I data. Statistical analysis will be conducted for all quantitative data, including frequencies, descriptive statistics, cross-tabulations, and, where appropriate, significance testing (e.g., Analysis of Variance (ANOVA) or t-tests). Qualitative data, or open-ended questions, will be reviewed and aggregated based on key themes. The reported findings will include aggregate CCOE results, as well as individual CCOE summary results. The data analysis and preliminary report preparation will take approximately three weeks to one month to complete.
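A small sketch of the type of round-over-round comparison described above, using hypothetical client ratings. The data, column structure, and choice of an independent-samples t-test are assumptions for illustration only; the actual analysis will be run in SAS against the real survey responses.

# Illustrative sketch only: descriptive statistics, frequencies, and a
# significance test comparing hypothetical Round I and Round II ratings.
import pandas as pd
from scipy import stats

# Hypothetical 1-5 satisfaction ratings for one CCOE, by evaluation round.
round1 = pd.Series([3, 4, 3, 2, 4, 3, 3, 4])
round2 = pd.Series([4, 4, 5, 3, 4, 5, 4, 4])

# Descriptive statistics and frequencies.
print(round1.describe(), round2.describe(), sep="\n\n")
print(round2.value_counts().sort_index())

# Significance test for a change in mean rating between rounds.
t_stat, p_value = stats.ttest_ind(round1, round2, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")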
During Phase V, the results of the evaluation will be aggregated into a final report and submitted to OWH. OWH Quarterly Progress Reports will also be used in the data analysis and will contribute to the findings in the final report. Quarterly Progress Reports will be linked with data gathered in each of the four data collection instruments and analyzed. Both individual CCOE findings and aggregate findings on the CCOE program as a whole will be reported and interpreted based on the analysis plan. Findings related to both the current activities of the CCOE, as well as improved outcomes, growth, and progress over time, will also be documented. This analysis will provide OWH with a picture of the evolution of the CCOEs and the CCOE model over time.
The Booz Allen Evaluation Team will work with OWH to develop the structure of the final report to help OWH meet their reporting needs to their internal leadership, Congress, the press, and other external entities. Additionally, the report will take into account OMB PART requirements and OWH strategic planning data needs. The Booz Allen Evaluation Team will also work with OWH to ensure that the final report will facilitate the development of best practices and lessons learned that the CCOEs can utilize to improve their implementation of the CCOE model in the years to come. It will also address strategies for sustainability and effective transition to CCOE – AFC status.
An important facet of data collection is ensuring the confidentiality of the data collected. The Booz Allen Evaluation Team will maintain secure files of all collected information, notes, and analysis results (both electronic and hardcopy) to make certain no client, provider, or CCOE-specific content is externally available. All CCOE survey administrators will be trained in proper record keeping and confidentiality procedures.
Each data collection instrument includes a confidentiality statement to ensure that the CCOEs, their partners, and clients understand that the information they are reporting will remain confidential. Each instrument assures respondents that the information they provide will be anonymous and that no names will be associated with the survey or the responses. This message will be reinforced in all communications with the CCOEs as appropriate.
While individual CCOE-specific content will be available to OWH staff to facilitate improvement of the CCOE program, only program-wide findings will be reported externally. Quantitative data will be aggregated and qualitative data will be “cleaned” so as to prevent the disclosure of information that could identify individuals or personal information. However, the name of a CCOE program may be identified in the final report so that best practices can be attributed to them—only positive attributions will be made.
As with any study, there are several limitations to this evaluation that may affect the quality of the results. Each CCOE is working from limited resources and many of them do not have substantive infrastructure upon which to develop their own internal data collection efforts. To mitigate this issue, the Evaluation Team will work with OWH to designate a staff member at the CCOE to respond to or facilitate our data collection requests. Every attempt will be made to establish this relationship early on and organize our efforts without unduly burdening this point of contact.
Another limitation of this evaluation will be the use of client data. While this information provides an external (non-CCOE or partner) perspective on CCOE services, the results may be limited by clients’ understanding of what they are rating. For example, a client may rate a non-CCOE service because she does not realize that the service is provided outside the CCOE. A CCOE client must have a solid understanding of which services are provided by the CCOE and its partners in order to provide meaningful feedback. This issue will be mitigated through the survey administrator training, which will incorporate lessons learned from Round I and address how administrators should assist clients who have questions while filling out the survey without influencing their responses.
The study will also be limited by the use of self-reported survey data. There may be a tendency to over-report results, such as utilization or participation rates, which could misrepresent performance as stronger than it actually is. For this reason, it is critical to include both the client survey and the site visits in the evaluation. Both of these activities will provide external viewpoints regarding CCOE activities. The feedback from CCOE clients will provide additional perspectives on how integrated the CCOE program is, how easy it is to access, and whether clients perceive the CCOE to be impacting their community. Feedback gathered by the Booz Allen Evaluation Team during the site visits will also provide an additional independent and complete assessment of CCOE activities.
Another limitation of this study is the change in status of three of the CCOEs (St. Barnabas Hospital and Healthcare System, Northeast Missouri Health Council and Mariposa Community Health Center) to CCOE – AFCs. Because these organizations may not provide a full range of services in each of the core component areas compared to when they had CCOE status, this may limit the Evaluation Team’s ability to draw conclusions regarding these components. To the extent possible, the Evaluation Team will work with the three CCOE – AFCs to understand their service offerings and leverage their lessons learned in implementing services for these core components, sustaining them, and transitioning to external funding sources.
Lastly, another potential limitation is that OWH guidance and guidelines for designation as a CCOE may have evolved and changed over the course of the program’s five-year existence. Newly designated CCOEs may have received guidance that emphasized certain program components or goals over others. This evolution of the CCOE program’s requirements, a natural facet of the implementation of any service offering, will require the Evaluation Team to be cognizant of the reasons why data collected during Round II may display greater progress in some component areas than in others, and to include this consideration in its discussion of the progress the CCOEs have made since Round I.
The CCOE program evaluation methodology is founded on the assumptions listed below, which are based on information provided by OWH, the CCOE Program Evaluation Statement of Work, and the Federal Register Notice of the CCOE program. Additionally, OWH articulated several requirements for the CCOE program evaluation. They are included here as part of our list of assumptions upon which the program evaluation will be founded.
The implementation of the CCOE National Evaluation, Round II is contingent upon the availability of funds and the receipt of OMB clearance.
The CCOE evaluation methodology must examine the CCOE program as a whole, as well as independently examine each of the six program components. Additionally, it must measure the integration of the six core program components:
Comprehensive and integrated women’s health services delivery (emphasis will be placed on assessing the level of integration within this component)
Training for lay and professional health care providers
Community-based research
Public education and outreach
Leadership development for women as health care consumers and providers
Technical assistance to support the replication of successful models and strategies.
The CCOE program evaluation must be both qualitative and quantitative in design.
The CCOE evaluation must include one client survey and one partner survey.
CCOE partners are required to report data needed to support the program evaluation.
The CCOE program evaluation must safeguard confidentiality of CCOE data, with specific regard to external reports.
OWH will share copies of CCOE Quarterly Progress Reports and final reports with the Booz Allen Team for the purpose of assessing data sources and data collection methodologies.
The deliverable schedule will be updated and finalized pending funding appropriation and OMB clearance.
The Booz Allen Evaluation Team assembled a list of research questions during Round I of the National Evaluation in order to direct evaluation efforts and drive the development of the data collection instruments to be used in the CCOE evaluation. As described in Section 2, each of these research questions is framed around one of the core program components. Sub-questions for each of the main research questions address topic areas of interest to OWH that will provide useful information regarding the CCOEs, and in some cases, address requirements that had been developed for the core component. The table below lists the research questions and sub-questions to be addressed, a description of the data needed to answer these questions, as well as the data collection approach and data sources needed to collect that data. This relationship and mapping will stay the same for Round II so that information can be compared across both data collection efforts.
Table A: Research Questions/Sub-Questions, Data Needed, and Data Source
Research Question: Has the CCOE program improved comprehensive health service delivery within the targeted communities?
Sub Questions | Data Needed | Data Collection Approach/Data Sources
Does the CCOE program offer a full range of care including, but not limited to: acute, chronic, and preventive care, both primary and specialty services (including mental and dental health services, client education, health promotion, enabling, and ancillary services)? | |
Does the service network successfully integrate and coordinate care with community partners? | |
Is a sustainable framework in place for providing CCOE services? | |
Does the service delivery network demonstrate improvement in access to health care services for the targeted community? | |
Does the service delivery network provide continuous care through the range of services offered and the referral system? | |
Is a physically identifiable clinical care center with permanent signage and the appropriate space and operational hour allocation available? | |
Does the clinical care center have a schedule and procedures for identifying and counting women served by the CCOEs and tracking the cost of services delivered through the program? | |
How is health care service delivery integrated with the other components of the CCOE program? | |
Research Question: How has the CCOE impacted the training of lay and professional health care providers within the targeted community?
Sub Questions | Data Needed | Data Collection Approach/Data Sources
Are training activities provided on topics aimed at improving health services for women? | |
Are training activities targeted towards a spectrum of lay and professional health care providers, including any special provider groups? | |
Do training activities leverage existing community resources? | |
How are training activities integrated with the other components of the CCOE program? | |
Research Question: What is the impact of the CCOE program on community-based research?
Sub Questions | Data Needed | Data Collection Approach/Data Sources
Are community resources (e.g., partners, advisory boards, other organizations) involved in the research development process? | |
Are the research activities focused on improving women’s health? | |
Are these activities expansions of previous efforts or new activities? | |
How are research results used to improve women’s health? | |
How are research activities integrated with the other components of the CCOE program (used in improving client care, program operations, referrals, etc.)? | |
Research Question: What is the impact of the CCOE program on public education and outreach?
Sub Questions | Data Needed | Data Collection Approach/Data Sources
Are the educational materials and activities appropriate to the age of the targeted audience? | |
Are the educational materials and activities appropriate to the culture/ethnicity of the targeted audience? | |
Are the educational materials and activities appropriate to the gender of the targeted audience? | |
Are the educational materials and activities appropriate for the literacy level of the targeted audience? | |
Do the educational materials and activities address issues that are relevant to the community? | |
Do the educational materials and activities address issues that are amenable to recipient behavioral modification? | |
Is the selection of educational topics based on community input or feedback? | |
Does the CCOE use non-CCOE community resources in the production or dissemination of educational materials? | |
How are public education and outreach activities integrated with the other components of the CCOE program? | |
Research Question: What is the impact of the CCOE program on leadership development among women?
Sub Questions | Data Needed | Data Collection Approach/Data Sources
Is a structured, comprehensive, long-term approach utilized for conducting leadership activities for young girls/women in the community? | |
Do CCOE activities support promotion of women and minorities into positions of leadership? | |
Are CCOE activities supporting promotion and retention of women and minorities in the health professions? | |
Are mentoring initiatives in place to interest young women in careers in health care? | |
Are leadership training and skills development opportunities available for women in the community? | |
Is the CCOE providing opportunities for women to take leadership roles? | |
Are leadership activities targeting young women/girls provided? | |
Are leadership development activities integrated with other components of the CCOE program? | |
Research Question: Has the CCOE program replicated successful models and strategies?
Sub Questions | Data Needed | Data Collection Approach/Data Sources
Is the CCOE maintaining a sustained interaction with another community? | |
Has the CCOE contributed materials for a “How To” Manual? | |
Has the CCOE intervened with a professional organization to further the cause of women’s health? | |
Has the CCOE site-visited another organization to provide “How To” training? | |
Has the CCOE hosted an on-site training session? | |
Has the CCOE participated in any regional meetings where the program or program elements were showcased? | |
Has the CCOE participated in any national meetings where the program or program elements were showcased? | |
Has the CCOE developed and/or disseminated any technical assistance materials that could improve the CCOE program? | |
Has the CCOE showcased any lessons learned at a non-OWH meeting? | |
How are technical assistance activities integrated with the other components of the CCOE program? | |
Research Question: None – Overarching Program Requirements
Sub Questions | Data Needed | Data Sources
Are services women-centered and culturally and linguistically appropriate? | |
Is a CCOE advisory board that includes representatives from the community partners in place? | |
Are mechanisms in place to create an awareness of the CCOE’s existence and services in the community? | |
Does the distribution of funds among CCOE staff contribute to a positive outcome for the program? | |
1 As cited in the Federal Register (January 22, 2001), Vol. 66, No.14
2 No longer a CCOE program goal, consolidated into remaining goals
3 No longer a CCOE program goal, consolidated into remaining goals
4 Program requirements are not considered a core component, but are included as a study research question to address non-component related requirements for the CCOE program.