Project LAUNCH Cross-Site Evaluation
OMB Information Collection Request
0970-0373
Supporting Statement
Part A
5.21.13
Submitted By:
Office of Planning, Research and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
7th Floor, West Aerospace Building
370 L’Enfant Promenade, SW
Washington, D.C. 20447
Project Officer:
Laura Hoard
A1. Necessity for the Data Collection
The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval to collect data from an additional, fourth cohort of 11 cooperative agreements, funded in September 2012, for the cross-site evaluation of Project LAUNCH (Linking Actions for Unmet Needs in Children’s Health). This request is in addition to OMB-approved data collection at 24 cooperative agreements funded by the Substance Abuse and Mental Health Services Administration (SAMHSA) under Project LAUNCH (OMB Control# 0970-0373): 6 cooperative agreements funded in September 2008 (Cohort 1), 12 in September 2009 (Cohort 2), and 6 in September 2010 (Cohort 3). Data collection for Cohort 4 grantees (funded September 2012) will follow the same schedule as the other cohorts.
The purpose of Project LAUNCH is to promote healthy development and wellness in children from birth to eight years of age. Project LAUNCH is intended to address issues in the child service system by enhancing systems coordination, integrating child behavioral health services with other health services, and incorporating evidence-based programs that address children’s healthy development. Project LAUNCH grantees in Cohorts 1, 2, and 4 focus on systems-level development at the state or tribal level and in one designated community. Cohort 3 grantees focus on systems-level development in the designated community and are expected to coordinate their efforts with activities at the state level. Additionally, within the designated or “local” community, all grantees are required to implement promotion and prevention activities in each of five categories: 1) mental health consultation in early care and education settings; 2) developmental assessments across service settings; 3) integration of behavioral health into primary care; 4) family strengthening and parenting skills training; and 5) home visitation.
Project LAUNCH is authorized under Section 520A of the Public Health Service Act (42 U.S.C. 290bb–32) and addresses Healthy People 2020 topic area 18 (Mental Health and Mental Disorders).
This submission is a revision to OMB Control #0970-0373, approved on January 8, 2010, and revised July 11, 2011. Project LAUNCH has added a fourth cohort of 11 grantees, and data collection for these grantees will follow the same schedule as the other cohorts.
There are no legal or administrative requirements that necessitate the data collection activities. ACF is undertaking the collection at the discretion of the agency.
A2. Purpose of Survey and Data Collection Procedures
Overview of Purpose and Approach
The cross-site evaluation collects information from Project LAUNCH grantees related to state, tribal, and community systems development; implementation of evidence-based services in local communities; and service system outcomes for children, families, and providers.
The cross-site evaluation will describe how Project LAUNCH efforts have affected state and tribal policies pertaining to children and families, led to the expansion of existing services or the implementation of new services, and increased service coordination. Although this is not an impact evaluation, findings will also document outcomes for children, families, and providers reached by Project LAUNCH.
Research Questions
The cross-site evaluation of Project LAUNCH is intended to address the following four evaluation questions:
Q1: What are the system-level changes at the state/tribal level? Are there:
Improved coordination and collaboration across agencies serving young children and families;
Sustained implementation of a coordinated, family-centered, culturally competent child-serving system;
Improved infrastructure, legislation, and other policies;
Increased public education outreach and awareness; and
Sustained funding and maintenance of child-serving systems?
Q2: What are the system-level changes at the community/local level? Are there:
Improved coordination and collaboration across agencies serving young children and families;
Sustained implementation of a coordinated, family-centered, culturally competent child-serving system;
Improved infrastructure, legislation, and other policies;
Increased public education outreach and awareness; and
Sustained funding and maintenance of child-serving systems?
Q3: How have the child and family services in the community been enhanced? Have enhancements occurred through:
Workforce development;
Increased number of providers trained in the evidence-based prevention and wellness-promotion practices;
Increased provider knowledge about appropriate referrals; and
Increased provider knowledge of child development and behavioral health?
Changes in provider practices:
Increased implementation of developmental screening and assessment in a range of primary care and early childhood settings;
Implementation/expansion of integration of mental health into primary care;
Implementation/expansion of mental health consultation for providers in a variety of settings;
Implementation/expansion of evidence-based prevention and wellness-promotion practices, including home visiting and family strengthening and parent training programs; and
Implementation of culturally-relevant, family-centered practices in a range of primary care and early childhood settings?
Increased number of children and families receiving high-quality services that meet their needs?
Q4: What are the health and well-being outcomes for children in Project LAUNCH communities? Are there:
Increased number of children reaching physical, social, emotional, behavioral, and cognitive developmental milestones; and
Increased number of children entering school ready to learn (including physical, social, emotional, behavioral, and cognitive readiness)?
Study Design
To address these questions, the cross-site evaluation will use three data collection components: site visits/telephone interviews, Web-based reporting on systems and services, and review of local grantees’ end-of-year evaluation reports to SAMHSA.
Site visits and telephone interviews: Evaluation staff will interview state, tribal, and local staff to obtain information on the community context, service coordination, and the systems infrastructure that supports services.
Web-based reporting on systems and services: One component will focus on state, tribal, and community systems-level development. A second component will track aggregate information on children and families served under different service models (e.g., home visitation), including demographic information. Grantees will also report aggregate information on providers who participate in training or other activities offered by Project LAUNCH, and on changes in provider practices and provider settings.
Review of outcome data from grantees’ end-of-year evaluation reports: The evaluation team will analyze this information to assess the effects of LAUNCH-supported services on three samples in the LAUNCH communities: (a) the provider workforce in child and family services, (b) parents/families, and (c) children from birth to 8 years old. Outcome data will be culled from grantees’ annual reports, which analyze outcomes annually and cumulatively since the start of the grant period. The cross-site evaluation will conduct descriptive analyses that summarize the results generated and reported by the 35 local evaluations from their outcome studies. Because evaluation measures and designs vary across grantees (e.g., some use pre- and post-measures, others use a comparison group), the results from the local evaluations will vary in the strength of the evidence they provide. The cross-site evaluation will therefore categorize grantee outcomes in terms of “strength of evidence.”1 “Strength of evidence” refers to the internal validity of the estimate of a difference, or how strongly we believe the evidence indicates a program effect. Outcomes generated from quasi-experimental designs will be rated using the What Works Clearinghouse Procedures and Handbook. Outcomes generated from pre-post studies will be rated using the R-SEED (Review of Studies with Emergent Evidence Designs) Procedures and Handbook, which the cross-site evaluation team is developing. Consideration will be given to whether the outcome measure (baseline and post-test) meets outcome standards, the type of design used in the local evaluation, and the number of time points used for baseline and post-test measurement. Local evaluators will be given tables to complete about their outcome studies, which will provide the information needed for categorization. A draft set of standards and the tables grantees will complete are included in Appendix H.
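To illustrate the categorization step described above, the following minimal Python sketch routes each local outcome study to a review track based on its design and the measurement information local evaluators will report. The design labels, track names, and decision rules are illustrative assumptions only; the actual criteria are those in the What Works Clearinghouse Procedures and Handbook and the R-SEED procedures under development, and the final standards are in Appendix H.

```python
# Illustrative sketch only: routes local evaluation outcome studies to a review
# track by design type. The labels and rules are hypothetical placeholders,
# not the WWC or R-SEED criteria themselves.
from dataclasses import dataclass

@dataclass
class OutcomeStudy:
    grantee: str
    design: str              # e.g., "quasi-experimental", "pre-post"
    n_baseline_points: int   # number of baseline measurement time points
    n_posttest_points: int   # number of post-test measurement time points
    measure_meets_standards: bool

def review_track(study: OutcomeStudy) -> str:
    """Assign a (hypothetical) strength-of-evidence review track."""
    if not study.measure_meets_standards:
        return "does not meet outcome measure standards"
    if study.design == "quasi-experimental":
        return "rate with WWC procedures (comparison-group design)"
    if study.design == "pre-post" and study.n_baseline_points >= 1 and study.n_posttest_points >= 1:
        return "rate with R-SEED procedures (emergent evidence design)"
    return "insufficient information for categorization"

# Example: a hypothetical grantee reporting a pre-post outcome study
example = OutcomeStudy("Grantee A", "pre-post", 1, 2, True)
print(review_track(example))
```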
Table A.1 shows which data collection components will be used to address each of the evaluation questions.
TABLE A.1
EVALUATION QUESTIONS AND DATA SOURCES

Evaluation Question | Site Visits/Telephone Interviews | Semi-annual Web-based Reporting | Grantee Local Evaluations
1. System-level changes at the state/tribal level | X | X | X
2. System-level changes at the community/local level | X | X | X
3. Changes in child and family services in the community | X | X |
4. Changes in the overall development and wellness of children in the Project LAUNCH community | | | X
Analyses will employ a variety of methods, including descriptive statistics (means, percentages) and simple tests of differences across subgroups and over time (t-tests, chi-square tests). Most of the evaluation questions call for descriptive analyses, which can be answered by calculating averages and percentages of families, children, and providers participating in services; average scores on service delivery outcomes; and comparisons of these averages across grantees and across time. Cross-tabulations of program characteristics and family characteristics will also provide important information about differences across programs in the kinds of families they serve and how different families are served.
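As a minimal sketch of the descriptive and comparative analyses described above, the example below computes means for hypothetical aggregate counts reported by two grantee subgroups and applies a t-test and a chi-square test. All values are invented for illustration and are not Project LAUNCH data.

```python
# Minimal analysis sketch with invented data; not actual Project LAUNCH results.
import numpy as np
from scipy import stats

# Hypothetical: number of families served per grantee in two subgroups
subgroup_a = np.array([120, 95, 143, 110, 88, 102])
subgroup_b = np.array([134, 150, 128, 141, 119, 155])

# Descriptive statistics (means)
print("Mean families served, subgroup A:", subgroup_a.mean())
print("Mean families served, subgroup B:", subgroup_b.mean())

# Simple test of differences between subgroups (t-test)
t_stat, p_value = stats.ttest_ind(subgroup_a, subgroup_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Cross-tabulation of a program characteristic by a family characteristic
# (hypothetical counts), tested with a chi-square test
crosstab = np.array([[40, 60],    # program type 1: families with / without a given characteristic
                     [55, 45]])   # program type 2
chi2, p, dof, expected = stats.chi2_contingency(crosstab)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```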
Universe of Data Collection Efforts
Data collection efforts will include the entire universe of Project LAUNCH grantees. The new request includes revisions to the site visit and telephone interview guides that are intended to reduce respondent burden, as shown in Table A.2 below. We have removed evaluators and ECCS Coordinators from the respondent categories, and we have customized each guide to make the interview process as seamless as possible for the respondent. The new request also includes a standardized table format for grantees to report their grantee-specific child and family outcomes in their evaluation reports.
The instruments to be used in the newly requested information collection include:
Revised: Site Visit and Telephone Interview Guides;
Previously approved: Electronic Data Reporting: Systems Measures;2
Previously approved: Electronic Data Reporting: Services Measures;2 and
New: Outcomes Data Tables in End of Year Reports.
TABLE A.2
COMPARISON OF NUMBER OF QUESTIONS BETWEEN PREVIOUSLY APPROVED AND REVISED SITE VISIT AND TELEPHONE INTERVIEW GUIDES

Key Informant | Original Number of Questions | Number of Questions in Revised Guide*
COMMUNITY/TRIBE | |
Child Wellness Coordinator | 86 | 66
Chair of Child Wellness Council | 85 | 35
Other Representative(s) from Child Wellness Council | 46 | 0
Providers/Local Stakeholder | 39 | 29
Evaluator | 47 | 0
STATE | |
Child Wellness Coordinator | 76 | 37
ECCS Coordinator | 33 | 0
Chair of Child Wellness Council (Cohort 4 Site Visit only) | 68 | 49
Evaluator (if different from evaluator, above) | 41 | 0
*There is a minimum number of questions for some respondents (e.g., Child Wellness Coordinator, Chair of Child Wellness Council). A small number of other questions may be asked, depending on responses, to gather additional data.
A3. Improved Information Technology to Reduce Burden
To reduce grantee burden, the proposed data collection uses a Web-based data entry system, previously developed for the Center for Mental Health Services (the Decision Support 2000+ platform), to collect information in a uniform manner across grantees. This is a .NET-based suite of online applications that can rapidly produce secure data capture and project collaboration portals tailored to the needs of SAMHSA evaluation and grantee monitoring. It is user-friendly and facilitates accurate and timely data collection.
To minimize burden in subsequent data reporting periods, information entered at the first data collection time point will be pre-loaded for each grantee. Grantees will designate individuals needing access to the system.
A4. Efforts to Identify Duplication
Because Project LAUNCH was a new initiative when the cross-site evaluation was funded, there have not been any similar cross-site data collection efforts. Although SAMHSA requires grantees to conduct local evaluations of their Project LAUNCH programs, the cross-site evaluation is designed to provide a more comprehensive examination of Project LAUNCH activities and outcomes across the communities served. To reduce the possibility of duplication of efforts between local evaluations and the national evaluation, the cross-site evaluation is designed to build upon and complement local data collection plans specified by Project LAUNCH grantees in their grant applications.
We will pre-populate some questions in the data reporting system using previously reported data after grantees have reported for the first time in the Web-based data reporting system. Examples of data that will be pre-populated include the program description, the target population, and the location in which services occur.
Following the telephone interview conducted in a cohort’s first year, we will also pre-populate data on service strands and sustainability for use in subsequent site visits and telephone interviews that cover these areas.
A5. Involvement of Small Organizations
No small businesses are impacted by the data collection for this project.
A6. Consequences of Less Frequent Data Collection
To address the study’s research objectives about how Project LAUNCH is implemented over time and how service provision is affected, telephone interviews or site visits will be conducted with grantees each year. However, because site visits are more time-intensive for grantees to coordinate and schedule, only one site visit will be conducted with each grantee, in the cohort’s second year of grant implementation, thus ensuring a full year of service implementation prior to the site visit. Telephone interviews will be conducted in all other years to minimize grantee burden. Less frequent data collection could introduce bias, because respondents may not be able to accurately recall and describe activities that took place more than a year prior to data collection.
Data about services provided; numbers of families served; system change efforts at the state, tribal, and local levels; and service system outcomes for children, families, and providers will be required semi-annually to correspond with SAMHSA’s semi-annual progress-reporting periods. Less than semi-annual data collection on services and systems change activities would be problematic for the cross-site evaluation. Delivery of services to children and families in Project LAUNCH communities is developmental, and semi-annual data points allow the cross-site evaluation to report on this development process and show cumulative counts of children and families served over time.
A7. Special Circumstances
There are no special circumstances for the proposed data collection efforts.
A8. Federal Register Notice and Consultation
Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity.
A revised Federal Register notice for the Cross-Site Evaluation of Project LAUNCH was published in the Federal Register Vol. 78, No. 36, Page 12328 on February 22, 2013. A copy of this notice is attached as Appendix I. During the notice and comment period, no comments were received.
Many individuals and organizations, including the Project LAUNCH Expert Consultant Group, were contacted for advice on aspects of the evaluation design and data collection instruments. Their feedback was obtained through an in-person meeting and telephone conversations. Members of the Project LAUNCH Expert Consultant Group are listed in Table A.3.
TABLE A.3
MEMBERSHIP OF PROJECT LAUNCH EXPERT CONSULTANT GROUP

Individual | Affiliation
Beulah Allen, MD | National Indian Youth Leadership Project
David M. Chavis, PhD | Community Sciences
Paul Florin, PhD | University of Rhode Island
Stephanie M. Jones, PhD | Harvard University
Milton Kotelchuck, PhD | Boston University
Deborah Leong, PhD | Metropolitan State College of Denver
Judith S. Palfrey, MD | Harvard Medical School
Michelle Christensen Sarche, PhD | American Indian Alaska Native Programs
Ruth E.K. Stein, MD | Albert Einstein College of Medicine
Joseph Trimble, PhD | Western Washington University
Abraham H. Wandersman, PhD | University of South Carolina
A9. Incentives for Respondents
No incentives for respondents are proposed for this information collection.
A10. Privacy of Respondents
For each of the data collection methods, procedures will be in place to provide assurance of privacy to respondents to the fullest extent of the law.
Site visits and telephone interviews
During the site visits and telephone interviews with grantees, we will speak with a variety of state, tribal, and local project staff. Each respondent will be asked to sign a consent form to participate in an interview, and the consent form will explain the study procedures for assuring that the answers provided by the respondent will remain private. The consent form will explain that (a) participation in the interviews is voluntary, and there are no penalties for refusing to participate at any time during the interviews; (b) the respondent can refuse to discuss any topic; (c) data will be stored in de-identified files; and (d) no names will be used in any evaluation reports. At the time of the interviews, respondents will be asked to sign two copies of the consent form. One will be retained by the cross-site evaluator, and the second copy will be given to the respondent for his/her files. A copy of the interview consent form is provided in Appendix J.
In reports, we will not indicate the names and titles of individuals at Project LAUNCH sites who provide information during the site visits or telephone interviews. We also will not identify grantees by name, although vignettes of activities at grantee sites may be included. We will inform all study participants about this during our consent process prior to initiating the interview.
Semi-annual Web-based data reporting
The evaluation will collect information using an online data collection system. This system has passed internal audits and tests conducted by the Abt Associates Inc. Security Officer and has been reviewed on multiple occasions by SAMHSA IT staff members. The data collection and analysis modules include HTTPS certificate-based authentication and transaction encryption processes approved for HIPAA data transactions on federal healthcare projects. Individuals who are not part of the research team will not be able to access these data; grantees may access their own data using passwords provided by the cross-site evaluation team.
Large portions of the data grantees will report through the Web-based reporting system are about service delivery and will involve data already being collected by grantees for their local evaluations. All information on services will be reported to the cross-site evaluation in the aggregate; thus no information collected can be linked to individual families or individual agencies. To the extent that the local evaluations involve collecting information on services provided to individual families and children, the cross-site evaluation team will work closely with local grantees and their regional and state Institutional Review Boards (IRBs) to ensure that the necessary IRB approvals are obtained, and that human subject protections are assured. No respondent identifiers will be contained in public use files made available from the cross-site evaluation.
The cross-site evaluation will request information in the aggregate about the demographics of children and families served. These data will be reported to the cross-site evaluation in the aggregate through the Web-based reporting system.
The cross-site evaluation will request information on providers in local sites who participated in Project LAUNCH-related activities, including trainings, mental health consultation services, and integration of behavioral health and primary care. Specific questions will ask how these activities have changed provider practices in caring for children and families. Although these data will be reported to the cross-site evaluation only in the aggregate through the Web-based reporting system, local program staff may collect them on an individual basis from providers as part of their local evaluations. Each grantee has obtained or will obtain IRB approval from its regional, local, and/or state IRB, as well as from the institutional IRB overseeing the local grantee evaluation, and therefore will be subject to the terms and guidelines of that IRB.
All data collected via the Web-based portal will be stored electronically on Abt Associates’ password-protected secure network. Project directories and databases at Abt Associates are protected by assigned group memberships, passwords, and other techniques (e.g., access control lists), which prohibit access by unauthorized users. In addition to protection of privacy, data security encompasses backup procedures and other file management techniques that ensure files are not inadvertently lost or damaged. Project data files are backed up to tape using fast dump/restore software (Backup Exec version 8.5) and DLT-4 tapes that hold up to 70 GB of compressed data. These procedures are currently used to ensure the privacy and security of many Abt Associates research databases.
A11. Sensitive Questions
To achieve the goal of describing the families participating in local Project LAUNCH services, local program staff will collect information from parents that will include some sensitive questions about family and child risk factors. Although this information is sensitive, it is necessary to accurately describe the families who receive Project LAUNCH services. The questions employed come from standardized measures or have been used extensively in prior studies with no evidence of harm (for example, in the Fragile Families Study and in the Early Head Start Research and Evaluation Project).
The data on individual families and children being collected by the local Project LAUNCH grantees and their local evaluators will be dictated by the guidelines set forth by their IRBs and HIPAA policies to ensure the privacy of sensitive information. As part of the consent process, participating parents will be informed that they might find some questions sensitive and will be asked to sign a consent form to participate, acknowledging that their participation is voluntary. All respondents will be informed that their identity will be kept private and that they do not have to answer questions that make them uncomfortable.
All information collected at the individual child/family level will be de-identified and reported in the aggregate to the cross-site evaluation.
A12. Estimation of Information Collection Burden
Previously Approved Information Collections (OMB Control# 0970-0373)
Total Burden Previously Approved
The approved data collection does not impose a financial burden on respondents, nor will respondents incur any expense other than the time spent completing the interviews and entering electronic data for systems and services outcomes.
The previously approved annual burden for study respondents—state, tribal, and local Project LAUNCH program staff and stakeholders—is identified in Table A.4.
To compute the total estimated annual cost, the total burden hours were multiplied by the average hourly wage for each adult participant, according to the Bureau of Labor Statistics, Current Population Survey, 2009. We assumed that most respondents to the site visit and telephone interviews and the Web-based data reports (state and local systems data; local services data) would be program directors and program evaluators, and accordingly used the mean hourly rate for Social Scientists and Related Workers, All Other ($35.31 per hour) to estimate the cost of providing this information.
TABLE A.4
ESTIMATED ANNUAL RESPONSE BURDEN AND ANNUAL COST FOR PREVIOUSLY APPROVED DATA COLLECTION

Instrument | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Estimated Annual Burden Hours | Average Hourly Wage | Total Annual Cost
Site Visit and Telephone Interview Guide | 240 | 1 | 1.25 | 900 | 300 | $35.31 | $10,593.00
Electronic Data Reporting: Systems Measures | 24 | 2 | 4 | 576 | 192 | $35.31 | $6,779.52
Electronic Data Reporting: Services Measures | 24 | 2 | 8 | 1,152 | 384 | $35.31 | $13,559.04
Estimated Annual Burden Hours | | | | | 876 | | $30,931.56
Burden Remaining from Previously Approved Information Collection
The estimated remaining annual burden for study respondents from the previously approved information collection is identified in Table A.5. The average hourly wage has been updated using the mean wage for Social Scientists and Related Workers, All Other, as reported in the 2011 Bureau of Labor Statistics Occupational Employment Statistics.2
TABLE A.5
ESTIMATED ANNUAL RESPONSE BURDEN AND COST REMAINING FROM PREVIOUSLY APPROVED DATA COLLECTION (COHORTS 2 AND 3)

Instrument | Total Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Annual Burden Hours | Average Hourly Wage | Total Annual Cost
Electronic Data Reporting: Systems Measures | 24 | 1 | 4 | 96 | 96 | $37.82 | $3,630.72
Electronic Data Reporting: Services Measures | 24 | 1 | 8 | 192 | 192 | $37.82 | $7,261.44
Estimated Annual Burden Sub-total | | | | | 288 | | $10,892.16
Newly Requested Information Collections
The requested data collection does not impose a financial burden on respondents, nor will respondents incur any expense other than the time spent providing the requested information.
The estimated annual burden for study respondents (state, tribal, and local Project LAUNCH program staff and stakeholders) is identified in Table A.6. For site visits/telephone interviews and Web-based reporting, these estimates assume 35 Project LAUNCH grantees, with all grantees responding to site visits/telephone interviews and semi-annual Web-based reporting over a three-year period, with Cohort 1 graduating in 2014 and Cohort 2 graduating in 2015. Values for annual burden hours in Table A.6 were calculated as the total burden hours divided by 3 years. Because the number of grantees participating in data collection varies by year, these values represent an average annual burden.
To calculate the total number of respondents in Table A.6, we used the following equation: (35 grantees in 2013 + 29 grantees in 2014 + 17 grantees in 2015) = 81. However, some interviews will not be conducted with all grantees in all years, and therefore the total number of respondents for several respondent categories was calculated as follows:
Chair of the Local Child Wellness Council: These respondents will not be interviewed in the last year of their grant. The total number of respondents is: (29 grantees in 2013 + 17 grantees in 2014 + 11 grantees in 2015) = 57.
Local Stakeholders: These respondents will not be interviewed in the last year of their grant. The total number of respondents is: 3 stakeholders per grantee x (29 grantees in 2013 + 17 grantees in 2014 + 11 grantees in 2015) = 171.
State Child Wellness Coordinator: These respondents will be interviewed in the last year of their grant. However, Cohort 3 grantees and five tribal grantees in Cohort 4 do not have a State Child Wellness Coordinator. Therefore, the total number of respondents in this category is: (24 grantees in 2013 + 18 grantees in 2014 + 6 grantees in 2015) = 48.
Chair of the State Child Wellness Council: These respondents will not be interviewed in the last year of their grant. In addition, Cohort 3 grantees and five tribal grantees in Cohort 4 do not have a State Child Wellness Council. Therefore, the total number of respondents in this category is: (18 grantees in 2013 + 6 grantees in 2014 + 6 grantees in 2015) = 30.
For the Outcomes Data Tables in End of Year Reports and for Web-based reporting, the estimates in Table A.6 were calculated based on an assumption of 81 Project LAUNCH respondents. Web-based reporting occurs twice annually, resulting in 2 responses per respondent for those instruments.
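The respondent and annual-burden figures above follow directly from the grantee counts by year. The short sketch below reproduces the calculations using the numbers stated in this section; it is provided only as a worked illustration of the arithmetic, and the variable names are ours.

```python
# Worked illustration of the respondent and annual-burden arithmetic in Section A12.
grantees_by_year = {2013: 35, 2014: 29, 2015: 17}   # all grantees
not_last_year    = {2013: 29, 2014: 17, 2015: 11}   # excludes grantees in their final grant year

# Total respondents over the three-year clearance period
child_wellness_coordinators = sum(grantees_by_year.values())   # 81
local_council_chairs        = sum(not_last_year.values())      # 57
local_stakeholders          = 3 * sum(not_last_year.values())  # 171 (3 stakeholders per grantee)

# Annual burden hours = (respondents x responses x hours per response) / 3 years
def annual_burden(respondents, responses_per_respondent, hours_per_response, years=3):
    return respondents * responses_per_respondent * hours_per_response / years

print(annual_burden(child_wellness_coordinators, 1, 1.5))   # 40.5, reported as 41 in Table A.6
print(annual_burden(local_stakeholders, 1, 0.75))           # 42.75, reported as 43 in Table A.6
```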
Total Burden Requested Under this Information Collection
TABLE A.6
ESTIMATED ANNUAL RESPONSE BURDEN AND COST FOR NEWLY REQUESTED DATA COLLECTION AND REMAINING PREVIOUSLY APPROVED DATA COLLECTION

Instrument | Total Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Annual Burden Hours | Average Hourly Wage | Total Annual Cost
Child Wellness Coordinator Interview Guide | 81 | 1 | 1.5 | 121.5 | 41 | $37.82 | $1,531.71
Chair of Local Child Wellness Council Interview Guide | 57 | 1 | 1 | 57 | 19 | $37.82 | $718.58
Local Stakeholder Interview Guide | 171 | 1 | 0.75 | 128.25 | 43 | $37.82 | $1,616.80
State Child Wellness Coordinator Interview Guide | 48 | 1 | 1.25 | 60 | 20 | $37.82 | $756.40
Chair of State Child Wellness Council Interview Guide | 30 | 1 | 1.25 | 37.5 | 13 | $37.82 | $472.75
Electronic Data Reporting: Systems Measures | 81 | 2 | 4 | 648 | 216 | $37.82 | $8,169.12
Electronic Data Reporting: Services Measures | 81 | 2 | 8 | 1,296 | 432 | $37.82 | $16,338.24
Outcomes Data Tables in End of Year Reports | 81 | 1 | 8 | 648 | 216 | $37.82 | $8,169.12
Estimated Annual Burden Sub-total | | | | | 1,000 | | $37,772.72
Total Annual Cost
The total annual cost of the newly requested and remaining previously approved information collection is $37,772.72, based on 1,000 annual burden hours at a mean hourly wage of $37.82 (per-instrument costs are calculated from unrounded annual burden hours, so the total differs slightly from 1,000 hours multiplied by $37.82). The mean hourly wage is based on the category Social Scientists and Related Workers, All Other, as reported in the 2011 Bureau of Labor Statistics Occupational Employment Statistics.2 We chose this labor category because most respondents are program directors or staff with a master’s degree or above.
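As a worked check of these figures, the sketch below recomputes the Table A.6 sub-totals from the respondent counts, responses per respondent, burden hours per response, and the $37.82 hourly wage stated above. It is an illustration of the arithmetic only.

```python
# Reproduces the Table A.6 annual burden hours and annual cost sub-totals.
import math

WAGE = 37.82  # mean hourly wage, Social Scientists and Related Workers, All Other (2011 OES)

# (instrument, total respondents, responses per respondent, hours per response)
instruments = [
    ("Child Wellness Coordinator Interview Guide",          81, 1, 1.5),
    ("Chair of Local Child Wellness Council Interview Guide", 57, 1, 1.0),
    ("Local Stakeholder Interview Guide",                   171, 1, 0.75),
    ("State Child Wellness Coordinator Interview Guide",     48, 1, 1.25),
    ("Chair of State Child Wellness Council Interview Guide", 30, 1, 1.25),
    ("Electronic Data Reporting: Systems Measures",          81, 2, 4.0),
    ("Electronic Data Reporting: Services Measures",         81, 2, 8.0),
    ("Outcomes Data Tables in End of Year Reports",          81, 1, 8.0),
]

total_hours = 0
total_cost = 0.0
for name, respondents, responses, hours in instruments:
    annual_hours = respondents * responses * hours / 3    # averaged over the 3-year period
    total_hours += math.floor(annual_hours + 0.5)         # round half up, as reported in Table A.6
    total_cost += round(annual_hours * WAGE, 2)           # cost is computed from unrounded hours

print(total_hours)            # 1000 annual burden hours
print(round(total_cost, 2))   # 37772.72
```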
A13. Cost Burden to Respondents or Record Keepers
There are no additional costs to respondents.
A14. Estimate of Cost to the Federal Government
The total cost for the data collection activities under this current request will be $1,223,589.38 over three years. This amount includes costs for new data collection activities under this request and the remaining costs from previously approved collections still in progress. Annual costs to the federal government will be $407,863.13 for the proposed data collection under this OMB clearance (#0970-0373).
A15. Change in Burden
Increased Annual Burden
The annual time burden increases for two reasons: 1) the addition of a new instrument, the Outcomes Data Tables in End of Year Reports, which accounts for 216 annual burden hours; and 2) a slight increase (from 576 to 648) in the total annual burden hours for the two Electronic Data Reporting instruments, reflecting an increase from 24 to 27 in the annual number of grantees using these instruments.
Decreased Annual Number of Responses
The annual number of responses decreases for two reasons: 1) for telephone and site visit interviews, the number of individuals interviewed at each grantee site has been reduced from 10 to 7; and 2) the number of grantees decreases over the next three years as grants end for some cohorts. Although the number of grantees in the first year for which we seek OMB approval (2013) is higher than the number for which data collection was approved in 2011 (29 vs. 24), the total number of grantees decreases to 17 in 2014 and 11 in 2015. This reduction contributes to the decrease in the annual number of responses. In addition, although a new instrument, the Outcomes Data Tables in End of Year Reports, has been added, its 27 annual responses are not large enough to offset the overall reduction in annual responses described above.
A16. Plan and Time Schedule for Information Collection, Tabulation and Publication
The schedule for fielding, analyzing, and reporting the data findings for the cross-site evaluation of Project LAUNCH is as follows:
Site Visits/Telephone Interviews: Fall 2013, 2014, and 2015
Semi-Annual Web-based Reporting: Spring and Fall, 2010–2016
Outcomes Data Tables in End of Year Reports: Fall 2013, 2014, and 2015
Data Analysis: Fall 2010 – Summer 2016
Data Reports: Annual
To date, a report on the cross-site evaluation design has been published, and two annual reports are currently in review. In these and future annual reports, Abt will prepare tables summarizing key data from the site visits/telephone interviews and the Web portal. This information will be used in short briefs and presentations requested by ACF/SAMHSA; these may include brief reports for the project website, issue briefs, or peer-reviewed journal articles.
A final cross-site evaluation report will be produced at the end of the contract period. The final report will include a description of data collection and analysis methods, and a summary of findings.
A17. Reasons Not to Display OMB Expiration Date
All instruments will display the expiration date for OMB approval.
A18. Exceptions to Certification for Paperwork Reduction Act Submissions
No exceptions are necessary for this information collection.
References
1 Institute of Education Sciences. What Works Clearinghouse. Washington, DC: U.S. Department of Education. Accessed on January 23, 2013: http://ies.ed.gov/ncee/wwc/
2 Bureau of Labor Statistics. Occupational Employment Statistics. Washington, DC: U.S. Department of Labor. Accessed on January 23, 2013: http://www.bls.gov/oes/current/oes193099.htm
1 This concept of strength of evidence underlies the ratings of studies produced by the What Works Clearinghouse (WWC).1 However, the WWC standards for strength of evidence are relevant only for certain designs, specifically random assignment designs, quasi-experimental designs, regression discontinuity designs, and single-subject designs. Nearly all of the outcome studies conducted as part of the local LAUNCH evaluations will employ designs that do not include a comparison group and may or may not include pre- and post-measures of the same respondents.
2 All electronic data reporting is the same as approved in the previous OMB package (OMB Control# 0970-0373).