OMB: 0990-0443

10-21-2015



Supporting Statement for

Evaluation of the Office on Women’s Health (OWH)

Coalition for a Healthier Community (CHC) Initiative





A. Justification



  1. Circumstances Making the Collection of Information Necessary

This collection will provide data for the national evaluation of the U.S. Department of Health and Human Services (HHS), Office on Women’s Health (OWH) Coalition for a Healthier Community (CHC) Initiative. The initiative provides grants to 10 communities to support coalitions in implementing gender-based public health systems approaches, evidence-based health interventions, and outreach and education activities that reduce barriers to, and enhance facilitators of, improvements in women and girls’ health. Each grantee has implemented an IRB-approved local evaluation; however, OWH seeks to collect core data across grantees to examine the extent to which the Government’s investment has achieved OWH-related Healthy People 2020 priorities and to yield lessons learned on which to plan future initiatives related to its mission. The proposed collection includes interviews and surveys with staff, volunteers, and program participants in the 10 communities where the grants were funded, and review of secondary data sources such as progress and annual reports. Because human participants are involved in this evaluation, the legal and administrative requirements that necessitate the collection are detailed in Section 301 of the Public Health Service Act (42 U.S.C. 241). This law authorizes the HHS Secretary to ensure “protection of privacy of individuals who are research subjects” and provides that the Secretary “may authorize persons engaged in biomedical, behavioral, clinical, or other research (including research on mental health…) to protect the privacy of individuals who are the subject of such research by withholding from all persons not connected with the conduct of such research the names or other identifying characteristics of such individuals. Persons so authorized to protect the privacy of such individuals may not be compelled in any Federal, State, or local civil, criminal, administrative, legislative, or other proceedings to identify such individuals.” A copy of the legislation is attached (Attachment 1).





  2. Purpose and Use of Information Collection

Purposes of this information. Information will be collected for the national evaluation of the OWH CHC initiative and will come from various data sources to examine the evaluation questions posed by OWH. These sources include: the grantees’ Project Directors, Project Coordinators, Local Evaluators, and Coalition Members; selected Community Leaders familiar with the local coalition; a subset of Coalition Participants (who participated in evidence-based health interventions); and other Community Members who were indirectly exposed to the initiative’s activities through coalitions’ outreach and education activities. Information will be collected from these sources about the gender-based and public health systems approaches that were used to implement the initiative in their communities; how well these approaches were received in the communities; barriers to and facilitators of successful implementation; perceptions of the coalitions’ structure and functioning; perceptions of the coalitions’ impact on individual-level and community-level health indicators related to Healthy People 2020 Objectives; the cost effectiveness of the approaches; and the extent to which the coalitions, in whole or in part, are sustainable beyond the grant period. This information will be used by OWH to determine the extent to which gender-based, public health systems approaches are cost-effective strategies to reduce gender disparities and improve women and girls’ health.

Users of the information include the OWH, its federal interagency partners concerned with reducing gender disparities and improving women and girls’ health, the grantee organizations and their community partners, and the scientific community interested in cost-effective, public health systems approaches and evidence-based strategies for reducing gender disparities and improving women and girls’ health.

For example, the OWH uses information from quarterly progress reports and end-of-year reports to monitor the progress of the grantees. Many federal programs have relied on this approach to evaluation, yet by the end of the grant period little evidence is available to document outcomes related to the Government’s interests, such as progress on Healthy People 2020 Objectives, or to guide future public investment in such initiatives. Summary analyses of cost-effectiveness data (resulting from review of the economic evaluations conducted under the IRB-approved local evaluations) can complement qualitative interviews of grantee staff and participants to inform OWH’s and other agencies’ decisions about continued or expanded support for such initiatives. The planned collection of data from multiple sources (grantee staff as well as community members) through multiple methods (interviews, surveys, document reviews) in the same site will give the Government and other stakeholders greater confidence in the results through triangulation of the data, rather than reliance on a sole source, a single method, or exclusive use of grantee progress and annual reports. Collecting data only from grantee staff through available grantee documents risks leaving OWH without credible evidence to justify its conclusions about the initiative’s effectiveness. Reports of satisfaction by participants and partners, or “feel-good” reports of grantees’ success in implementing plans, are not sufficient data upon which to make decisions about the public’s limited resources. The data collected for each grantee site will also be shared with the grantee organization as a basis for sustainability planning, such as in applications for funding from other sources, forging or strengthening partnerships to continue beyond OWH CHC funding, and improving programs made possible by the initiative.

Thus, data collection will yield several products for multiple users: a cost-effectiveness summary analysis across sites based on the local evaluation reports to inform OWH’s decision making on future investments; an assessment of grantees’ sustainability planning, utilizing a standard survey based on previous federal initiatives, to assist grantees and funders in understanding factors that predict sustainability; an inventory of policy changes and other systems-level changes that positively influence women and girls’ health; and an assessment, using coalition member surveys, of the factors that contribute to coalitions’ effectiveness in improving health outcomes in the community, particularly for women and girls. For the scientific and practitioner communities, potential uses include information about evidence-based interventions (EBIs) that can be used to reduce gender disparities and the EBIs’ effectiveness for subgroups in communities of diverse gender, racial, ethnic, linguistic, socioeconomic, and geographic backgrounds. The factors that contribute to coalition effectiveness and sustainability should also be valuable to practitioners and funders interested in health equity for women and girls as well as for men and boys. This OWH CHC national evaluation is also expected to reveal gaps in knowledge about how to implement gender-based, public health systems approaches that are cost-effective and improve women and girls’ health.



  3. Use of Improved Information Technology and Burden Reduction

To reduce burden, HHS/OWH has worked with its national evaluation contractor and the grantees to identify opportunities to implement Web-based surveys. For the surveys with the largest numbers of respondents (the Key Persons, Coalition Members, and Community Leaders Survey; and the Coalition Participants and Community Members Survey), the contractor will use an online survey vendor, such as SurveyMonkey®, to improve this information collection. This will occur for the two core instruments: the one for key persons (200 persons) and the other for coalition participants and community members (510 persons). In addition to reducing respondent burden by incorporating skip patterns so respondents do not have to read items that are not applicable, the online tool collates open-ended responses for each item; generates descriptive data for all items in real time; and auto-generates quantitative datasets, exporting them to an Excel file that can be read and manipulated by analytic software packages such as SPSS, SAS, or Stata for later quantitative analyses. These electronic features reduce the time between the data collection and analysis phases and eliminate extra data management steps. Electronic surveys also ensure more accurate data, in that evaluation staff members do not need to decipher handwriting, code from paper-and-pencil instruments, or undertake dual-entry or interrater reliability procedures for data entry. The Grantee Annual Report on Coalition Functioning, Cost-effectiveness, and Sustainability comprises items from standard instruments in the field and will be completed only by project directors, who report the ratings for their site. Because only a small number of respondents (n=10) is involved, these tools are available online and as word-processed documents that the project directors can use at their sites.
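For illustration only, the sketch below (Python) shows the kind of post-export processing described above; the file and column contents are hypothetical, and the actual vendor export format may differ.

```python
# Illustrative sketch only: file names are hypothetical, and the actual
# survey vendor's export format may differ.
import pandas as pd

# Read the Excel file auto-generated by the online survey vendor.
responses = pd.read_excel("chc_survey_export.xlsx")

# The vendor assigns a sequential numeric code to each submitted form,
# so no personally identifying fields are needed here.
print(f"{len(responses)} submitted forms")

# Descriptive statistics for all numeric (e.g., rating-scale) items.
print(responses.describe())

# Write a clean CSV that SPSS, SAS, or Stata can read directly.
responses.to_csv("chc_survey_clean.csv", index=False)
```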

Electronic surveys are not used to collect information on the discussion topics for the individual interviews with project directors, project coordinators, local evaluators, and coalition co-chairs, because these items are primarily open-ended and would unduly burden the respondents. Instead, the contractor will conduct the discussions by telephone and, with interviewees’ permission, digitally (audio) record the interviews. This reduces respondent burden in that respondents do not have to enter large amounts of text into a survey (hard copy or electronic) or fax, mail, or email their responses; it also facilitates the contractor’s transcription and coding of large amounts of qualitative data. Qualitative software coupled with manual review of notes can support more accurate reporting, because the recording serves as a backup, and further reduces respondent burden in that contractors will not have to conduct follow-up calls with respondents to clarify notes. All data, including notes, will be stored electronically as digital, word-processed, or data files to reduce paperwork. Coding is also managed on-screen using Microsoft Word and Excel indexing tools.



  4. Efforts to Identify Duplication and Use of Similar Information

To avoid duplication and use of similar information, OWH and its contractor worked with the grantees, using participatory research methods, to determine the extent to which data needed to answer the evaluation questions were already available in quarterly progress reports, end-of-year reports, local evaluation reports, OWH/contractor site visit reports, grantees’ cost-effectiveness analyses, and grantees’ reports to OWH on GPRA data. Discussions with the grantees were held at two in-person grantee meetings, during grantee site visits (one to each site), and on quarterly grantee conference calls for evaluation planning and technical assistance. After review of two years of grantees’ quarterly reports, annual reports, evaluation reports, site visits, and supporting documents, OWH and the contractor developed or identified existing instruments for this data collection. These include two self-administered online surveys (one for key persons and the other for coalition participants and community members), a telephone interview discussion guide for key persons, and one self-administered tool completed by project directors to report on their site’s coalition functioning, cost-effectiveness, and sustainability planning. These instruments were developed to fill any gaps where data were not already being collected to address the national evaluation questions. The national evaluation contract lagged the award of the grants (and, thus, national evaluation planning and implementation lagged local program implementation and evaluation). We have included the Discussion Guide for the Telephone Interviews with key persons as a means to capture retrospectively what was occurring in the sites prior to funding and during the initial years of implementation, and will implement that discussion guide again at the end of the program to capture changes and impacts. During site visits, we learned that the grantees were not systematically capturing data on coalition functioning or effectiveness, nor were they documenting their policy changes, other system-level changes, or sustainability plans. Such information is key to OWH’s planning for future initiatives. Therefore, the Grantee Annual Report on Coalition Functioning, Cost-effectiveness, and Sustainability and the Discussion Guide for Telephone Interviews will allow us to capture these much-needed data. All other data not captured in the instruments for which we are requesting this review can be abstracted or extracted from grantees’ progress reports, annual reports, and GPRA reports to OWH.

  5. Impact on Small Businesses or Other Small Entities

Some members of the coalitions represent small businesses, such as the two grassroots organizations (community-based nonprofits) that were required to be included as coalition members and service delivery partners under the terms of each grant. The information being requested has been held to the absolute minimum required for the intended use of these data. All respondents have the option to refuse to answer any items or to decline participation without penalty of losing any services or connections to the funded coalition or agency. These coalition members will be included in the one-hour interview discussions, which will be scheduled at their convenience, and will complete the 20-minute online survey at their leisure.



  6. Consequences of Collecting the Information in a Less Frequent Collection

The proposed plan calls for at least two collections of the interview data within a six-month period in order to document changes over time, i.e., to capture perspectives on what was happening prior to the initiative through the time of the first data collection, as well as perspectives on changes that have occurred as a result of the initiative by the time of the second collection. This comparative analysis is essential to assess the Government’s investment. Although the retrospective look at the initiative could be captured in a one-time assessment along with the end-of-project data, doing so would impose additional recall burden on the respondents as well as additional time to complete the added discussion items. With a less frequent collection, the Government would not have sufficient information upon which to base decisions about what to request in future budgets or how to allocate authorized funds. There are no legal obstacles to reducing the burden.



  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

All guidelines are met and this request fully complies with the regulation.



  8. Comments in Response to the Federal Register Notice/Outside Consultation

A 60-day notice was published in the Federal Register on August 18, 2015 (Vol. 80, No. 159, pp. 50015-16); a PDF of the notice is attached. There were/were no public comments.

Efforts to consult with persons outside the agency are described below.



1. Discussions with the grantees were held at two in-person grantee meetings (October 2012 and April 2014), during grantee site visits (one to each site during 2012-2013), and on quarterly grantee conference calls (December 2012 to April 2014) for evaluation planning and technical assistance. After a systematic literature review, the contractor developed a draft logic model to guide these discussions and presented operational definitions for the initiative’s key systems-level concepts. Grantees used these opportunities to share how they were approaching their local evaluations and what information they were collecting that could be provided in aggregated (de-identified) form for secondary analyses by the national evaluation contractor. These individuals included the following project directors, project coordinators, and local evaluators for the grantee sites:

Individuals Consulted Outside of the Agency: Outside Individuals from Grantee Organizations (during grantee meetings, site visits, and evaluation conference calls)



Brandywine Counseling
D.Personti@brandywinecounseling.org
Sara Steber (sasteber@gmail.com)

Domestic Violence Action Center/Stop the Violence, Inc.
Nancy Kriedman (nancik@stoptheviolence.org)
Cindy Spencer (cindys@stoptheviolence.org)
Christopher Yanuaria (chrisy@stoptheviolence.org)
Jan Shoultz (shoultz@hawaii.edu)

Drexel University School of Medicine
PSC Ana Nunez
Serita Reels (sreels@DrexelMed.edu)
Candace Roberts drexelmed.edu

Family League of Baltimore, Inc.
Robin Truiett-Theodorson (rtruiett@flbcinc.org)
Stacey.tuck@baltimorecity.gov
Rebecca.Dineen@baltimorecity.gov
JBowie@jhsph.edu

University of Illinois at Chicago
PSC sgeller@uic.edu
Kris Zimmermann (kzimme3@uic.edu)
PMoehring@s7hd.org
HJrisser@uic.edu
Ellen Palsey (grad student)

National Kidney Foundation of Michigan
Arthur Franke (afranke@nkfm.org)
Jodi Burke (jburke@nkfm.org)
Sandy Waddell
Patrick Kelly (rpkel@umich.edu)
Lauriel@umich.edu

St. Vincent Healthcare
Tracy Neary (tracy.neary@svh-mt.org)
April Keippel (april.keippel@svh-mt.org)
Shawn.Hinz@riverstone.org
ECiemins@billingsclinic.org

Thurston County Department of Health
Hawkinc@co.thurston.wa.us
Kateri Wimsett (wimsetk@co.thurston.wa.us)
Mary Ann O'Garro (ogarrom@co.thurston.wa.us)

University of Utah
PSC Kathleen Digre
Patricia.Murphy@nurs.utah.edu
PSC Leanne Johnston
Sara Simonsen (sara.simonsen@utah.edu)
Brenda Ralls (BRALLS@utah.gov)

Yale University
Kia.Levey@yale.edu
Megan Smith (megan.smith@yale.edu)
Heather.Howell@yale.edu
MDamiani@newhavenct.net
KHarris@cfgnh.org





2. The contractor retained the services of three gender experts to review the evaluation plan, the performance measurement plan, and the logic model. These experts also provided input on gender-specific measures and reviewed quarterly progress reports and end-of-year (annual) reports to determine the extent to which data in these reports would address the evaluation questions. The gender experts also participated in discussions with OWH and the national evaluation contractor regarding gender-based public health systems approaches; reviewed and provided input on the annotated bibliography for the systematic literature review; and participated, as needed, in the quarterly evaluation conference calls with grantees and other evaluation-related webinars involving the grantees and OWH that required gender analysis expertise. Finally, these experts reviewed the draft data collection instruments and provided input on survey items used to collect gender-specific data. These experts included:



  1. Miguelina León, MSW (miguelina.leon2@verizon.net), Independent Consultant, former OWH consultant on gender-based programming

  2. The Iris Group, Inc., c/o Mary Mulhern Kincaid, MS, DrPH (mkincaid@irisgroupinternational.com); an independent, woman-owned nonprofit specializing in health policy and strategies for gender equality and gender-based programming; former OWH consultant

  3. Elaine Walker, Ph.D., (Elaine.walker@shu.edu), Professor at Seton Hall University, Education Research, Assessment & Program Evaluation Track; Independent Consultant; former OWH consultant on gender-based programming and evaluation; editor of OWH’s staff, grantees’ and contractor’s articles in special issue of the peer-reviewed journal Evaluation and Program Planning on the OWH CHC initiative.



3. A subject matter expert in evaluating community-based partnerships and coalitions was retained as a consultant by the contractor regarding approaches to capture coalition functioning and effectiveness. This consultant was:



Robert M. Goodman, PhD, MPH, MA (rmg@indiana.edu), Professor of Allied Health Science and former Dean, School of Health, Physical Education and Recreation, Indiana University-Bloomington; expert in evaluating public health initiatives that rely on partnerships, coalitions, and collaborations.





4. Two health economists, one on the contractor’s team (Dr. Patrick Richard) and another steeped in the economic evaluation of health interventions who also serves as the local health economist for one of the CHC grantees (Dr. David Hutton), provided a webinar (February 12, 2014) on possible health economic indicators, from which we developed the tool used to summarize the parameters of the grantees’ cost-effectiveness analyses.



  1. Patrick Richard, PhD (pat042407@gmail.com), health economist specializing in cost-effectiveness analysis, The MayaTech Corporation

  2. David Hutton, PhD (dwhutton@umich.edu), health economist specializing in cost-effectiveness analysis, Assistant Professor, Health Management and Policy, University of Michigan, Ann Arbor; consultant to National Kidney Foundation of Michigan, OWH CHC grantee.



5. Other public comment was received at professional meetings where OWH and contractor staff presented lessons learned from the site visits. These included the following annual meetings:



  1. CityMatCH Annual Meeting, September 2013, Phoenix, AZ (Presenters: Stephanie Alexander, MPH, OWH; Shelly Kowalczyk, MSPH, CHES, MayaTech).

Audience feedback from maternal and child health practitioners and epidemiologists



  2. American Public Health Association, November 2014, New Orleans, LA

Community Health Planning and Policy Development section with grantee presenters

OWH co-presenters/co-authors: Stephanie Alexander, MS; Adrienne Smith, PhD, MS, CHES; MayaTech: Suzanne M. Randolph, Ph.D., Shelly Kowalczyk, MSPH, CHES; Veronica Thomas, PhD; and two grantees

Audience feedback from: community health planners, policy development specialists, and community members of community-based participatory research projects.



  3. American Evaluation Association, October 2014, Denver, CO

Community-based approaches to integrating gender analysis into evaluation of public health systems approaches to eliminate gender disparities: Lessons from a national evaluation

OWH co-presenters/co-authors: Stephanie Alexander, MS; Adrienne Smith, PhD, MS, CHES; MayaTech: Suzanne M. Randolph, Ph.D., Shelly Kowalczyk, MSPH, CHES; Veronica Thomas, PhD

Audience feedback from program evaluators who specialize in systems-level evaluations and evaluation of public health initiatives


  9. Explanation of Any Payment/Gift to Respondents

There are no payments or gifts to respondents.



  10. Assurance of Confidentiality Provided to Respondents

Data will be kept private to the extent allowed by law. Except for the key persons who will be interviewed and who hold specific roles (project director, project coordinator, chair or co-chair of the funded coalition, or local evaluator), no identifiable information will be collected from any individual. Moreover, for the other categories of participants, more than one coalition member, community leader, participant, and non-participant community member will be included in the subsamples, and no names or initials are requested on any of the instruments. For those key persons who might be identifiable because of their roles, names will not be used in evaluation reports, and each will be assigned a unique identifier used to label interview transcripts and digital recordings. Access to the list linking the unique alphanumeric code to individuals’ names will be limited to the contractor’s project director, deputy project director, and one other individual responsible for data quality assurance. The password-protected file will be maintained electronically in an encrypted folder on the contractor’s local area network (LAN). None of the online surveys will require unique identifiers; the online survey software automatically assigns a sequential numeric code to submitted forms.
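A minimal sketch of this unique-identifier procedure follows; the code format ("KP-xxxxxx") and file names are illustrative assumptions, not the contractor’s actual system.

```python
# Illustrative sketch: the code format and file names are assumptions,
# not the contractor's actual system.
import csv
import secrets

key_persons = ["Project Director, Site A", "Local Evaluator, Site A"]

# Assign each key person a random alphanumeric code; transcripts and
# recordings are labeled with the code, never the name.
linking_list = [(f"KP-{secrets.token_hex(3).upper()}", name) for name in key_persons]

# The linking list is kept separate from the data, in a password-protected,
# encrypted, access-restricted location.
with open("linking_list.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["code", "name"])
    writer.writerows(linking_list)
```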



Respondents’ rights are explained in the written informed consent form for the key persons who will be interviewed (some of whom might be identifiable by role in a particular site but will not be identified by name); for the online surveys, consent information and participants’ rights are included in the preamble of each survey. After reading the preamble (which includes information about whom to contact with questions or clarifications), participants are electronically guided to answer whether they read the consent information, understood it (and, if needed, had it explained to them), and agree to participate in the survey. This procedure is a more active form of consent than the passive consent inferred from a participant simply proceeding to the survey items after linking to the survey. For both consent procedures (the signed written consent for interviews and the active preamble consent for the online surveys), participants are informed that their information will be kept private to the extent allowed by law. The Institutional Review Board of the contractor (The MayaTech Corporation of Silver Spring, MD) has reviewed and approved the evaluation project (MayaTech IRB #2015-01, dated August 14, 2015). MayaTech’s Federal-wide Assurance of Compliance (No. FWA00012366, which expires on February 28, 2018) is on file with the Office for Human Research Protections (OHRP).



  11. Justification for Sensitive Questions

Respondents’ social security numbers (SSNs) are neither needed nor requested.

We understand that OMB considers questions about a respondent’s race/ethnicity to be sensitive. For the online surveys of community participants, this question is asked to document the extent to which the participant sample reflects the subgroup that was served by the coalition and the community in which participants live. Aggregated race and ethnicity data are reported by grantees in the quarterly progress and annual reports, as HHS requires in all of its data collection instruments. Knowing the distribution of race/ethnicity in the survey sample will help OWH better understand patterns of response and how these responses may or may not reflect the perspectives of those for whom the initiative was intended. There are no other sensitive questions or requests for sensitive information.



  12. Estimates of Annualized Hour and Cost Burden

The table below shows the type of respondents, number of respondents, frequency of response, and annual hour burden. This request for approval covers more than one form, so we provide separate hour-burden estimates for each form in addition to the aggregate. The burden for the key persons’ interviews was estimated from one administration with a mock participant whose background is similar to that of a project-director-level individual in this initiative. Some participants will have more knowledge of and familiarity with the initiative; we therefore estimated one hour on average for these key persons, although those less familiar with the initiative might take less time. The estimate for the key persons, other coalition members, and community leaders online survey was derived with the same person used to time the interview. The estimate for the coalition participants and other community members online survey was derived from the published literature from which some survey items were drawn and from two undergraduate students’ completion of the draft survey using a word-processed version of the form. For forms that require less than one hour to complete, the burden is displayed as the number of minutes over 60, with no decimals (e.g., 20 minutes is displayed as 20/60). If a form requires an exact number of hours, the burden is displayed as the number of hours. The hour-burden estimate for the Grantee Annual Report on Coalition Functioning, Cost-effectiveness, and Sustainability form includes the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information.



Table 12A. Total Estimated Annualized Burden - Hours





Form Name | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in hrs) | Total Burden Hours
1--Key Persons Discussion Guide for Telephone Interviews | 90 | 2 | 1 | 180
2--Key Persons, Coalition Members, and Community Leaders Online Survey | 200 | 1 | 20/60 | 67
3--Coalition Participants and Other Community Members Online Survey | 510 | 1 | 20/60 | 170
4--Grantee Annual Report on Coalition Functioning, Cost-Effectiveness, and Sustainability Planning | 10 | 2 | 2 | 40
TOTAL | | | | 457
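The arithmetic behind Table 12A can be verified directly; a quick check in Python (form labels abbreviated here):

```python
# Total hours per form = respondents x responses per respondent x average
# burden per response (20/60 denotes a 20-minute form).
forms = {
    "Form 1 (interviews)":           (90, 2, 1.0),
    "Form 2 (key persons survey)":   (200, 1, 20 / 60),
    "Form 3 (participants survey)":  (510, 1, 20 / 60),
    "Form 4 (grantee annual report)": (10, 2, 2.0),
}

total = 0
for name, (respondents, responses, hours_per_response) in forms.items():
    burden = round(respondents * responses * hours_per_response)
    total += burden
    print(f"{name}: {burden} hours")

print(f"Total burden: {total} hours")  # 180 + 67 + 170 + 40 = 457
```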



12B. Estimates of annualized cost to respondents for the hour burdens for collection

The cost to the federal government for contracting out for information collection activities is not included here. Instead, this cost is included in Item 14. The Department of Labor website was used to determine appropriate wage rates for respondents.





Table 12B. Estimated Annualized Cost to Respondents for the Hour Burdens





Form Name | Total Burden Hours | Hourly Wage Rate* | Total Respondent Costs
1--Key Persons Discussion Guide for Telephone Interviews | 180 | $50.99 | $9,178
2--Key Persons, Coalition Members, and Community Leaders Online Survey | 67 | $50.99 | $3,416
3--Coalition Participants and Other Community Members Online Survey | 170 | $15.00 | $2,550
4--Grantee Annual Report on Coalition Functioning, Cost-Effectiveness, and Sustainability Planning | 40 | $58.68 | $2,347
Total | 457 | | $17,491

*Source: U.S. Department of Labor (DOL), Bureau of Labor Statistics. Based on median hourly rates for occupations closest to each category of respondent. Available at: http://www.bls.gov/oes/current/oes_nat.htm



Basis of estimates for hourly wage: For the Key Persons: Project Directors category, we used the median hourly wage for the labor category “Top Executive,” estimated at $58.68 from the DOL website. For the Key Persons: Others category, we averaged the median hourly wages for the occupations closest to the respondents in the respective groups: local evaluators (health economists), estimated at $50.62; project coordinators and coalition members (certified health education specialists, because several grantees have program coordinators with public health and health education degrees), estimated at $24.24; and coalition chairs/co-chairs and community leaders, estimated at the top executive median hourly rate of $58.68, as many of these are executive directors or deputy directors of local organizations. For the Coalition Participants and Other Community Members category, we estimated $15.00 per hour, which reflects a range of occupations across various states and regions, including retail, healthcare support workers, teachers, and clerical staff (even though some participants and community members will have retired from management occupations).
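These wage rates drive the cost figures in Table 12B; a quick arithmetic check in Python (form labels abbreviated):

```python
# Respondent cost per form = total burden hours x median hourly wage for
# the closest DOL occupational category.
rows = [
    ("Form 1 (interviews)",          180, 50.99),
    ("Form 2 (key persons survey)",   67, 50.99),
    ("Form 3 (participants survey)", 170, 15.00),
    ("Form 4 (grantee report)",       40, 58.68),
]

total = 0
for name, hours, wage in rows:
    cost = round(hours * wage)
    total += cost
    print(f"{name}: ${cost:,}")

print(f"Total respondent cost: ${total:,}")  # $17,491
```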



  13. Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers/Capital Costs

None. All equipment, software, and services were part of customary and usual business or private practices for the grantees, contractor, and OWH.



  14. Annualized Cost to Federal Government

The total cost to the Federal Government for this information collection includes the contract costs for the national evaluator as well as the personnel cost for federal employees involved in oversight and analysis. This cost also includes operational and other expenses that would not have been incurred without this collection of information.



Estimated government costs for contracted data collection. This cost includes the total contract costs for the national evaluator over five years for the data collection and analysis task, as well as the average annual (annualized) cost. For this collection of information (Task 7), the national evaluator has been contracted for five years at a total cost of $586,198, an annualized cost of $117,240. The total contract cost is the sum of each year’s costs for labor, other direct costs (reproduction, supplies, travel, consultants, telephone calls, and survey vendor fees), indirect costs, general and administrative costs, and fees. The national evaluator has been contracted for 5,161 labor hours across the five-year contract, an average of 1,032 labor hours per year. The following table summarizes the total and annualized costs and labor hours.

Annualized Cost to Federal Government:

Contracted Data Collection Hours and Costs by Year and Average Annual Cost

(Contractor: The MayaTech Corporation Task 7: Data Collection & Analysis)



Contract Year | Annual Cost | Annual Labor Hours
Year 1: Sept 2012-Sept 2013 | $206,164 | 1,691
Year 2: Sept 2013-Sept 2014 | $89,929 | 846
Year 3: Sept 2014-Sept 2015 | $88,760 | 847
Year 4: Sept 2015-Sept 2016 | $106,577 | 951
Year 5: Sept 2016-Sept 2017 | $94,768 | 826
Total | $586,198 | 5,161
Average Annual | $117,240 | 1,032
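The totals and annualized figures above follow directly from the yearly values; a quick check in Python:

```python
# Contract-year costs and labor hours from the table above.
annual_costs = [206_164, 89_929, 88_760, 106_577, 94_768]
annual_hours = [1_691, 846, 847, 951, 826]

print(f"Total cost: ${sum(annual_costs):,}")                  # $586,198
print(f"Total labor hours: {sum(annual_hours):,}")            # 5,161
print(f"Average annual cost: ${sum(annual_costs) / 5:,.0f}")  # $117,240
print(f"Average annual hours: {sum(annual_hours) / 5:,.0f}")  # 1,032
```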





  15. Explanation for Program Changes or Adjustments

This is a new data collection.



  16. Plans for Tabulation and Publication and Project Time Schedule

Time schedule. Within one month of OMB approval, we expect to begin the first collection for Form 1—Discussion Guide with Key Persons; and complete these collections within two months of approval. Within two months of OMB approval, we expect to begin the online surveys with key persons, coalition members, and community leaders (Form 2); and complete these collections within three months of approval. Within three months of OMB approval, we expect to begin the online surveys for coalition participants and community members (Form 3); and complete these surveys within four months of approval. Within three to four months of OMB approval (July to August 2016), we will begin and complete the second key persons’ discussion interviews (Form 1—second collection) and project directors’ annual reports on coalition functioning, cost-effectiveness, and sustainability (Form 4). Within four months of OMB approval, we will also review OWH quarterly and end-of-year reports to summarize process evaluation data. The final report will be submitted by September 26, 2017. The following table summarizes the projected time schedule.



Projected Time Schedule





Instrument | Month 1 | Month 2 | Month 3 | Month 4 | Month 5
1--Key Persons Discussion Guide for Grantee Site Interviews | x | x | x | x |
2--Key Persons, Coalition Members, and Community Leaders Survey | | x | x | |
3--Coalition Participants and Other Community Members Survey | | | x | x |
4--Grantee Annual Report on Coalition Functioning, Cost-effectiveness, and Sustainability | | | x | x |



Thus, approximately four months of data collection are needed to implement the first discussion interview with project directors and other key persons, and to implement the second interview, the key persons’ online survey, and the sustainability assessment prior to the end of the grants in August 2016. A maximum three-year clearance is requested to include reporting and dissemination.

Plan for tabulation. No complex analytical techniques are planned. Our analysis plan includes descriptive statistics for all project variables and, as appropriate, chi-square and independent-samples t-tests for quantitative data, comparing each grantee’s first collection of the key persons’, coalition members’, and community leaders’ interview and survey data with the same data from the second collection. In some instances, we will also compare data for the same or similar items across data collection instruments by respondent type (project director, coalition member, etc.). For quantitative data with appropriate sample sizes, we will use an alpha of .05 as the level of significance. We will also subject qualitative data to analysis for recurring themes within grantee sites and across sites/grantees. We will generate tables showing these themes within and across grantee sites (see Attachment 2 for examples).
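For illustration, a minimal sketch of the planned comparisons using SciPy; the data shown are invented placeholders, not project data.

```python
# Illustrative sketch of the planned pre/post comparisons; all values
# below are placeholders, not project data.
from scipy import stats

ALPHA = 0.05

# Independent-samples t-test comparing a survey scale score between the
# first (retrospective) and second (end-of-project) collections.
first_wave = [3.2, 3.8, 2.9, 3.5, 4.0, 3.1]
second_wave = [3.9, 4.2, 3.6, 4.4, 4.1, 3.8]
t_stat, p_value = stats.ttest_ind(first_wave, second_wave)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant: {p_value < ALPHA}")

# Chi-square test on a categorical item (e.g., endorsed / did not endorse
# a coalition-effectiveness statement) across the two collections.
observed = [[42, 18],   # wave 1: endorsed, not endorsed
            [51, 9]]    # wave 2: endorsed, not endorsed
chi2, p_value, dof, _ = stats.chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```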

A critical date by which data collection must begin is April 15, 2016, in order to collect data twice from the grantees. Two data collections will reduce the burden on the grantees in that the retrospective data can be collected in the first collection and the data about the impact of the initiative can be collected in the second collection, using the discussion guide (Form 1) and the online survey of coalition members and community leaders (Form 2). Because the evaluation contract lagged the implementation of the grants, and because the participatory approach required input from the various stakeholders, an accelerated data collection using the retrospective approach will permit OWH to collect data that can be used to assess the overall impact of the initiative across its ten grantees before their grants end. Qualitative data will be displayed with thematic labels for each grantee and by role of respondent, as well as across sites by grantee. Themes will be coded using operational definitions of the system-level change constructs (e.g., the communities’ increased awareness of gender norms that influence programming; receptivity to, policies that facilitate, and capacity to deliver gender-based programs; and improved coordination, collaboration, leveraging, and sustainability planning). Attachment 2 includes illustrative table shells showing how selected quantitative and qualitative data will be tabulated.

Plan for publication. OWH has contracted with the national evaluator to produce three manuscripts for peer review across the five-year contract period. The national evaluator has identified outlets such as the American Journal of Public Health, Journal of Community Psychology, Women’s Health, and other journals as possible candidates to which to send manuscripts. By September 26, 2016, the national evaluator will prepare one manuscript for publication based on the lessons learned in implementing the data collection; and by September 2017, submit two additional manuscripts reporting on the individual-, coalition-, and community-level impacts of the CHC initiative based on the proposed data collection and document review of secondary sources.



  17. Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable



  18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.


