Evaluation of the Child Welfare Capacity Building Collaborative
OMB Information Collection Request
0970-0576
Supporting Statement
Part A
May 2022
Submitted By:
Children’s Bureau
Administration on Children, Youth and Families
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Executive Summary
Type of Request: This Information Collection Request (ICR) is for a revision to add additional instruments to this information collection. We are requesting an additional three years of approval.
Progress to Date: The current information collection (OMB #0970-0576) continues the work of the Child Welfare Capacity Building Collaborative and builds on three related ICRs: 0970-0484, 0970-0494, and 0970-0501. For information about the relationship of the different requests, see the initial request for approval submitted under this OMB #.
The initial ICR included thirty instruments to support the Cross-Center and Center-specific evaluation activities of the Child Welfare Capacity Building Collaborative. These instruments are currently in use. During the first seven months of data collection for this evaluation, the study team identified additional information that is necessary for a full understanding of the functioning and outcomes of the Child Welfare Capacity Building Collaborative. These additional data collection efforts were conceptualized and planned as a result of what has been learned; they represent a second part of the data collection activities already underway for this study.
Description of Request: Six new instruments are proposed to support the evaluation of the activities of the Children's Bureau Child Welfare Capacity Building Collaborative, which comprises three federally funded Centers (Center for States [CBCS], Center for Tribes [CBCT], and Center for Courts [CBCC]). The Centers deliver national expertise and evidence-informed training and technical assistance (TA) services to state, tribal, and territorial public child welfare agencies and Court Improvement Programs (CIPs). The Centers are being evaluated through a Cross-Center evaluation and three Center-specific evaluations. The evaluations use both surveys and interviews to gather data to understand the services delivered by the Centers, the utilization of services by jurisdictions, the quality of and satisfaction with services, collaboration among the Centers, and service outcomes. The six new instruments include four instruments for the Cross-Center evaluation, one instrument for the CBCT evaluation, and one instrument for the CBCC evaluation. There are no new instruments proposed for the CBCS evaluation in this ICR.
We do not intend for this information to be used as the principal basis for public policy decisions.
A1. Necessity for Collection
The Evaluation of the Child Welfare Capacity Building Collaborative is sponsored by the Children’s Bureau (CB) in the Administration on Children, Youth and Families (ACYF), Administration for Children and Families (ACF), U.S. Department of Health and Human Services (HHS). Evaluation activities have been underway since initially approved in September 2021. Through these initial activities, the study team has identified additional information that is necessary for a full understanding of the functioning and outcomes of the Child Welfare Capacity Building Collaborative. As a result, CB currently seeks approval for six additional data collection efforts, which were conceptualized and planned as a result of what has been learned. They represent a second part of the data collection activities already underway for this study.
These additional information collection activities support the approved data collections under this OMB number and are necessary to reach the overarching goals of the evaluation: to track, monitor, and evaluate the activities of the Collaborative, which includes three federally funded Centers (Center for States [CBCS], Center for Tribes [CBCT], and Center for Courts [CBCC]) that deliver national child welfare expertise and evidence-informed training and technical assistance (TA) services to state, tribal, and territorial public child welfare agencies and Court Improvement Programs (CIPs) [henceforth referred to as jurisdictions]. The collective goal of the Centers is to build the capacities of jurisdictions to successfully undertake the practice, organizational, and systemic reforms necessary to implement federal policies, meet federal standards, and achieve better outcomes for the children, youth, and families they serve.
Legislative Background and Purpose
Agencies that receive formula funding through the Child Abuse Prevention and Treatment Act (CAPTA) and titles IV-B and IV-E of the Social Security Act are eligible for TA from CB to support implementation of these programs, compliance with federal requirements, and improvement of outcomes. This information collection is necessary to perform routine evaluation of quality and effectiveness and to inform future planning and decision making about the provision and improvement of TA services authorized under multiple sections of CAPTA and titles IV-B and IV-E of the Social Security Act. This information collection also complies with the statutory requirement that training projects authorized by Section 5106 of CAPTA be evaluated for their effectiveness. A copy of the relevant sections of CAPTA, as amended by P.L. 115-424 (the Victims of Child Abuse Act Reauthorization Act of 2018) and the CAPTA Reauthorization Act of 2010, can be found in Appendix 1.
A2. Purpose
Purpose and Use
The Centers’ services are organized into three major categories: (1) universal - product development and information dissemination, including the creation and release of website content, publications, and other resources; (2) constituency/targeted services - training and peer networking, including the delivery of online courses, virtual presentations, and facilitated peer discussions; and (3) tailored services - jurisdiction-specific consultation and coaching, including workshops and onsite visits to provide customized support.1 Each service category is designed to achieve specific outcomes that require different levels of engagement and interaction between the Centers and their service recipients.
Data collected through this information collection are being used by the Centers and CB for continuous quality improvement (CQI) to improve the development and delivery of the Centers' services and to assess the impact of services on jurisdictions' ability to achieve their intended outcomes. Data collected through the six proposed additional instruments will also be used for CQI purposes. Specifically, the four additional instruments proposed for the Cross-Center evaluation will be used to better understand and potentially improve the process and effectiveness of tailored services, including the use of TA Liaisons (also known as Child Welfare Specialists) to deliver TA, and how Center TA services address diversity, equity, and inclusion. The additional instrument proposed for the CBCT evaluation will be used to obtain information about outcomes of Center services and service recipient satisfaction with services that is not available from the instruments previously approved under this OMB number. The additional instrument proposed for the CBCC evaluation will be used to assess one specific type of constituency/targeted service, the Center for Courts Academy, for the purposes of program quality improvement; the information collected through this new instrument is not available from the previously approved instruments used for the CBCC evaluation.
Evaluation findings will help to inform future decision making about service delivery and federal resource allocation. Evaluation findings also will be shared with other providers and service recipients to increase knowledge about TA strategies and approaches. Consistent with this approach, CB recently released findings from its 2014–2019 evaluations of the Collaborative to the public on its webpage (see the Cross-Center evaluation final public report, Building Capacity in Child Welfare: Findings From a Five-Year Evaluation of the Capacity Building Collaborative – Report, and the CBCS final evaluation public report, Making a Difference for Public Child Welfare Agencies: Key Findings from the Final Evaluation Report – Years 2015–2019). Moreover, the design for the 2014–2019 Cross-Center evaluation was shared with other federal agencies and departments that fund TA systems.
The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker and is not expected to meet the threshold of influential or highly influential scientific information.
Guiding Research Questions
The Centers’ services are being evaluated through two complementary efforts:
Cross-Center evaluation
Center-specific evaluations (one each for CBCS, CBCT, and CBCC)
The Cross-Center evaluation focuses on assessing the satisfaction with and effectiveness of Centers' tailored services, while the Center-specific evaluations predominantly focus on assessing universal and constituency services, as well as collecting formative data on tailored services for continuous quality improvement. Both the Cross-Center and Center-specific evaluations are designed to respond to a set of evaluation questions posed by CB, and the data collected will address these questions. The research questions that guide the Cross-Center evaluation and Center-specific evaluations are provided in Appendix 2. In general, the evaluations are designed to understand the services delivered by Centers; utilization of services by jurisdictions; quality of and satisfaction with services; collaboration among Centers and with federal staff; and service outcomes, such as completion of implementation milestones, improvements in capacity, and changes in child welfare practice.
Study Designs
The proposed Cross-Center and Center-specific evaluations use mixed-methods, longitudinal approaches to respond to the evaluations' guiding research questions. The approaches combine online and paper-based surveys, interviews, and document review to assess the services delivered by Centers; the use of services by jurisdictions; the quality of and satisfaction with services; collaboration among Centers; and service outcomes. The study designs are appropriate for their intended purposes, as they were developed to answer the descriptive questions by utilizing methods and measures that assess the Centers' core components and outcomes (see B1 in Supporting Statement Part B for additional detail).
The specific instruments used to gather data for the Cross-Center and Center-specific evaluations – including the six newly proposed instruments – are outlined in Table A-1 below (see Instruments 1-36).
The new instruments proposed through this ICR include the following:
Instrument 31: Cross-Center – Tailored Services Focus Group Guide (for states)
Instrument 32: Cross-Center – Tailored Services Focus Group Guide (for CIPs)
Instrument 33: Cross-Center – Liaison/Child Welfare Specialist Interview Protocol
Instrument 34: Cross-Center – Tailored Services Jurisdiction Staff DEI Interview Protocol
Instrument 35: CBCT – Tribal Child Welfare Staff Interview/Focus Group Guide
Instrument 36: CBCC – CIP Capacity Building Services Feedback Survey
More information about all of the instruments and the overall data collection is available in section B4 of Supporting Statement B.
Table A-1. Instrument Description and Administration Details
Instrument |
Respondent, Content, Purpose of Collection |
Mode and Frequency |
Cross-Center Evaluation |
Outcomes of and Satisfaction with Tailored Services Survey – intensive projects |
Respondents: State and tribal child welfare staff and CIP staff receiving intensive services Content: Questions about change management knowledge and skills; changes in capacity; satisfaction with Center services; and (for state respondents) questions about the CBCS practice model Purpose: To measure child welfare staff perceptions of the outcomes of intensive courses of tailored services and their satisfaction with those services Personally Identifying Information (PII): n/a |
Mode: Online via Qualtrics Frequency: Once at the close of each intensive service project |
Outcomes of Tailored Services Survey – brief projects |
Respondents: Tribal child welfare jurisdiction staff and CIP staff receiving brief services Content: Questions about outcomes of brief services Purpose: To assess child welfare staff perceptions of the outcomes of brief courses of tailored services PII: n/a |
Mode: Online via Qualtrics Frequency: Once at the close of each CBCC and CBCT brief service project |
Leadership Interview – states and territories |
Respondents: State child welfare directors Content: Questions about the agency’s experiences with assessment and work planning; working with the CBCS; and services received and progress toward outcomes Purpose: To obtain information from state child welfare directors regarding factors that influence their decisions to engage in services with the CBCS; perceptions of the capacity building services received; and satisfaction with the Center’s services PII: role, years in role, years at organization |
Mode: By phone Frequency: Twice: once each in project years 2 and 4 |
Leadership Interview – CIPs |
Respondents: CIP directors Content: Questions about the CIP’s experiences with assessment and work planning; working with the CBCC; and services received and progress toward outcomes Purpose: To obtain information from CIP directors regarding factors that influence their decisions to engage in services with the CBCC; perceptions of the capacity building services received; and satisfaction with the Center’s services PII: role, years in role, years at organization |
Mode: By phone Frequency: Twice: once each in project years 2 and 4 |
Leadership Interview – tribes |
Respondents: Tribal child welfare agency directors Content: Questions about the tribe’s experiences with assessment and work planning; working with the CBCT; services received and progress toward outcomes; and satisfaction with services Purpose: To obtain information from tribal child welfare directors regarding factors that influence their decisions to engage in services with the CBCT; perceptions of the capacity building services received; and satisfaction with the Center’s services PII: role, years in role, years at organization |
Mode: By phone Frequency: Twice: once each in project years 2 and 4 |
Collaboration and Communication Survey |
Respondents: Center staff and federal partners Content: Questions about collaboration and communication across Centers, and collaboration with federal staff. Purpose: To understand the extent to which factors that support collaboration among Centers and with federal staff exist and whether they improve over time PII: role, years with organization, Center service type, percentage time in role |
Mode: Online via Qualtrics Frequency: Twice: once each in project years 2 and 4 |
Collaborative Project Team Survey |
Respondents: Center and federal staff Content: Questions about collaboration among members of a team Purpose: To understand whether collaborative teams for specific projects and/or communication teams exhibit signs of healthy collaboration PII: role, years with Center, Center service type, percentage time in role, length of time on collaborative team |
Mode: Online via Qualtrics Frequency: Once per selected collaborative team |
Tailored Services Team Focus Group Guide (for states) |
Respondents: Select teams of state child welfare staff who received tailored services from the Center for States Content: Questions about use of the change management approach and the effectiveness of tailored services Purpose: To assess the process and effectiveness of tailored services among those receiving services from the Capacity Building Collaborative PII: n/a |
Mode: facilitated group discussion conducted via videoconference (e.g., Zoom) Frequency: Once at the close of selected tailored service projects |
Tailored Services Team Focus Group Guide (for CIPs) |
Respondents: Teams of CIP staff who received tailored services from the Center for Courts Content: Questions about use of the change management approach and the effectiveness of tailored services Purpose: To assess the process and effectiveness of tailored services among those receiving services from the Capacity Building Collaborative PII: n/a |
Mode: facilitated group discussion conducted via videoconference (e.g., Zoom) Frequency: Once at the close of selected tailored service projects |
Liaison/Child Welfare Specialist Interview Protocol |
Respondents: All Center service providers, known as Liaisons or Child Welfare Specialists Content: Questions about the role and experiences of Center Liaisons/Child Welfare Specialists and the Centers’ approach to providing services to increase child welfare jurisdictions’ and CIPs’ capacity on issues related to diversity, equity, and inclusion Purpose: To obtain information from all Center service providers about how they function to support Center service delivery PII: years in role, years working in child welfare, level of education |
Mode: Via videoconference (e.g., Zoom) Frequency: Once per Liaison/Child Welfare Specialist |
Tailored Services Jurisdiction Staff DEI Interview Protocol |
Respondents: Select teams of state and tribal child welfare staff and CIP staff who received tailored services from the Center for States, Center for Tribes, or Center for Courts Content: Questions about how Center services supported tailored service projects that included a focus on increasing capacity to address DEI; the degree to which the services contributed to increased capacity to address DEI; perceptions of service provider competency in addressing DEI; and suggestions for improving services and support in this area Purpose: To obtain information about how the Child Welfare Capacity Building Collaborative’s technical assistance services address diversity, equity, and inclusion PII: n/a |
Mode: Via videoconference (e.g., Zoom) Frequency: Once per respondent |
Center for States (CBCS) Evaluation |
Event Registration |
Respondents: Child welfare professionals Content: Demographic questions Purpose: To register child welfare professionals for participation in CBCS-hosted events and understand audience reach for CBCS services PII: Name, jurisdiction, organization, email, role, highest education degree, years of experience in child welfare |
Mode: Online via virtual event platform (e.g., Adobe Connect) or survey platform (e.g., Qualtrics) Frequency: Ongoing for each CBCS-hosted event |
Brief Event Survey |
Respondents: Child welfare professionals Content: Questions about satisfaction with and outcomes of CBCS-hosted universal services events and peer events Purpose: To gather feedback that can inform program planning PII: n/a |
Mode: Online via Qualtrics Frequency: Once at the end of each CBCS-hosted universal service or peer event |
Event Follow-up Survey |
Respondents: Child welfare professionals Content: Questions about outcomes of CBCS-hosted universal events and peer events Purpose: To gather feedback that can inform program planning PII: n/a |
Mode: Online via Qualtrics Frequency: Once, three months after the CBCS-hosted universal service or peer event |
Event Poll |
Respondents: Child welfare professionals Content: Questions about satisfaction with and outcomes of CBCS-hosted peer events with less than 100 registrants Purpose: To gather feedback that can inform program planning PII: n/a |
Mode: Online via Adobe Connect or WebEx Frequency: Once at the end of the event |
Peer Learning Group Survey |
Respondents: Child welfare professionals Content: Questions about satisfaction with and outcomes of CBCS-hosted peer learning groups Purpose: To gather feedback that can inform program planning PII: n/a |
Mode: Online via Qualtrics Frequency: Bi-annually |
Learning Experience Satisfaction Survey |
Respondents: Child welfare professionals Content: Questions about satisfaction with and outcomes of CBCS-hosted peer learning groups Purpose: To gather feedback that can inform program planning PII: n/a |
Mode: Online via Qualtrics Frequency: For single events, once at the end of the event. |
Jurisdiction Interview Protocol |
Respondents: Select state child welfare agency staff who serve as intensive service project leads Content: Questions about agency staff experience with receiving intensive services from CBCS Purpose: To gather feedback on working with the CBCS in general and/or on a specific service/set of services PII: n/a |
Mode: Video or conference call via Microsoft Teams Frequency: Once at the end of an intensive project’s annual workplan |
Tailored Services Brief Project Survey |
Respondents: State child welfare agency program staff receiving brief services Content: Questions about outcomes of and satisfaction with brief services Purpose: To assess child welfare staff perceptions of the outcomes of and their satisfaction with brief tailored services PII: n/a |
Mode: Online via Qualtrics Frequency: Once at the close of each brief project’s annual workplan |
Peer-to-Peer Event Survey |
Respondents: Child welfare professionals Content: Questions about satisfaction with and outcomes of peer-to-peer events hosted by CBCS Purpose: To gather feedback that can inform project planning PII: n/a |
Mode: Online via Qualtrics Frequency: Once at end of each peer-to-peer event |
Longitudinal Ethnographic Substudy Jurisdiction Interview |
Respondents: Child welfare agency program staff from tailored services intensive projects selected for the substudy Content: Questions about agency staff experience with receiving intensive services from CBCS Purpose: This interview protocol is the sole data source used for a longitudinal study of several CBCS intensive projects. The purpose of this study is to improve understanding of the entire lifecycle of such projects and of how various factors influence project progress. PII: n/a |
Mode: Video call via Microsoft Teams Frequency: Twice per year |
Center for Tribes (CBCT) Evaluation |
Request for Services Form |
Respondents: Tribal child welfare agency representative Content: Questions about the tribal program requesting services, the purpose of the request, and eligibility for services Purpose: To enable tribal child welfare programs to request services from CBCT PII: Name, jurisdiction, organization, email, phone number, role |
Mode: By phone Frequency: Once per service request |
Inquiry Form |
Respondents: Tribal child welfare agency representative Content: Questions about the tribe’s contact information, purpose of the service request, and eligibility for services Purpose: To collect preliminary information on what services a tribal child welfare program is requesting from CBCT so the request can be passed along for further follow-up by the appropriate Center staff person. PII: Name, jurisdiction, organization, email, phone number, role |
Mode: By phone or email Frequency: Once per service request |
Tribal Demographic Survey |
Respondents: Tribal child welfare agency representative Content: Questions about the tribal child welfare program, including its services, its Title IV-E status, its population and location, staffing, and its data management capacities Purpose: To collect information to better understand the status of a tribal child welfare program PII: Name, jurisdiction, organization, email, phone number, role |
Mode: Verbally or electronically Frequency: Once per service request |
Needs and Fit Exploration Tool Phase 1 |
Respondents: Tribal child welfare agency representative(s) Content: Questions about the tribal child welfare agency’s service request; strengths and challenges; Title IV-B plan; child welfare program services and structure; funding; staffing and workforce, collaboration with courts, community, and the state; data management; and other sources of TA Purpose: To gather additional information to help CBCT decide if the tribal inquiry and request for services fit the criteria for CBCT TA. PII: Name, jurisdiction, organization, email, phone number |
Mode: By phone or videoconference Frequency: Once per service request |
Needs and Fit Exploration Tool Phase 2 (Process Narrative) |
Respondents: Tribal child welfare agency representative(s) Content: A structured interview protocol that facilitates conversation about the tribal child welfare program’s strengths and challenges; Title IV-B plan; program services; funding; and staffing and workforce. The protocol also enables CBCT staff to review case flow and to review the evaluation components of the proposed services with the program’s representatives. Purpose: To facilitate an onsite discussion with a tribal child welfare agency to build relationships, learn more about how their program operates, and assess the program’s needs and capacity PII: Name, jurisdiction, organization, email, phone number |
Mode: In person or virtually Frequency: Once per service request |
Tribal Child Welfare Leadership Academy Pre-Training Self-Assessment |
Respondents: Child welfare professionals Content: Questions about child welfare professionals’ competencies and leadership qualities Purpose: To provide a baseline (i.e., pre-training) measure of competencies that are the focus of the Leadership Academy PII: Name, jurisdiction, email |
Mode: Online via Qualtrics Frequency: Once per Academy participant |
Tribal Child Welfare Leadership Academy Post-Training Self-Assessment |
Respondents: Child welfare professionals Content: Questions about child welfare professionals’ competencies and leadership qualities, and their satisfaction with the training Purpose: To provide a post-training measure of competencies that were the focus of the Leadership Academy, and information about attendees’ satisfaction with the training PII: Name, jurisdiction, email |
Mode: Online via Qualtrics Frequency: Once per Academy participant |
Universal Services Webinar Feedback Survey |
Respondents: Child welfare professionals and stakeholders Content: Questions about satisfaction with the webinar Purpose: To measure participant satisfaction with the content and flow of CBCT-sponsored universal services webinars PII: Name, jurisdiction, email |
Mode: Online via Qualtrics Frequency: Once following each universal services webinar |
Tribal Child Welfare Jurisdiction Staff Interview/Focus Group Guide |
Respondents: Tribal child welfare program staff who received tailored services from the Center for Tribes Content: Questions about program staff’s experience receiving tailored services from the Center for Tribes; suggestions for improvement of Center services; outcomes of the services provided; challenges to project completion that the Center could provide future support for; suggestions for other services the Center could provide Purpose: To obtain information about outcomes of Center services and service recipient satisfaction with Center services for the purposes of service improvement PII: n/a |
Mode: In person or via videoconference Frequency: Twice per respondent when data are collected via interview, three times per respondent when data are collected via focus group |
Center for Courts (CBCC) Evaluation |
CQI Workshop Feedback Survey |
Respondents: Child welfare and court professionals Content: Questions about satisfaction with the workshop and understanding of the topics covered Purpose: To assess the usefulness of the workshop, participant satisfaction, and perceived knowledge gain, to help the Center make adjustments to improve future workshops PII: n/a |
Mode: Paper survey or online survey (for a Virtual Academy) Frequency: Once at the end of a CQI workshop |
Academy Feedback Survey |
Respondents: Child welfare and court professionals Content: Questions about participant satisfaction with the Academy and whether participants experienced gains in knowledge Purpose: To assess participant satisfaction with and perceived knowledge gain from the CBCC Judicial and Attorney Academies, to inform improvement of future Academies PII: n/a |
Mode: Paper survey or online survey (for a Virtual Academy) Frequency: Once at the end of the Academy training |
Pre/Post Academy Learning Assessment |
Respondents: Child welfare and court professionals Content: Questions that assess knowledge of legal and judicial issues Purpose: To gauge Academy participants’ knowledge and then provide exposure to material tailored to that knowledge PII: Name, jurisdiction, email |
Mode: Online module Frequency: Twice: once before and once at the end of the Academy training |
Court Improvement Program Capacity Building Feedback Survey |
Respondents: Court Improvement Program directors/coordinators Content: Questions about experiences with and satisfaction with the capacity building services delivered by the Center for Courts, and questions about the perceived impact in CIP capacity Purpose: To assess the usefulness of Center for Courts’ Academies and participant satisfaction with the Academies for the purposes of program improvement PII: n/a |
Mode: Online via SurveyMonkey Frequency: Twice per respondent |
Other Data Sources and Uses of Information
This ICR builds on two prior Cross-Center requests that were part of the 2014–2019 evaluations of the Collaborative, OMB Number 0970-0484 (exp. 11/30/2022) and OMB Number 0970-0494 (exp. 2/28/2023), and one prior Center-specific request for CBCS, OMB Number 0970-0501 (exp. 9/30/2023). The content of most instruments contained in these earlier ICRs was revised for this ICR to reflect what was learned in the 2014–2019 evaluations and to address CB's current evaluation objectives. When applicable, data obtained from the earlier collections will be used to assess longitudinal changes related to the delivery of services, quality and satisfaction, and outcomes.
The evaluations also rely on data from the Centers' online data system, CapTRACK, for recording service delivery and outcomes, and limited information on jurisdiction needs. Centers capture in CapTRACK information about the products, events, and learning experiences they develop. Centers also record information on the tailored services they provide, including the service strategies, frequency, modality, topic, and hours of service, and the expected and actualized outcomes of services. CapTRACK contains limited information on the needs of jurisdictions, which is obtained from assessments that were approved through Center-specific OMB requests (OMB Number 0970-0501 for the CBCS Evaluation Ancillary Data Collection, and OMB Number 0970-0307 for the CBCC Self-Assessment for the CIPs).
This ICR is intended to meet the needs of the Cross-Center evaluation and three Center-specific evaluations, all related to assessing CB's Capacity Building Collaborative.
A3. Use of Information Technology to Reduce Burden
Wherever possible and appropriate, information technology is being used to capture information and reduce burden relative to alternative methods of data collection. Most evaluation surveys are administered online, using email notifications and Internet-based survey technologies that create efficiencies for survey administrators, allow flexibility and convenience for recipients, and ideally result in a user-friendly experience for respondents. This is the case for the one newly proposed survey in this request, the Court Improvement Program Capacity Building Feedback Survey. Based on the services provided, survey respondents receive an email notification inviting them to complete the appropriate survey instrument by accessing a web link to an online survey. Respondents may be invited to participate in a survey immediately following the conclusion of a service (e.g., webinar, online training, peer event) via live polling through integrated technology platforms (e.g., Adobe Connect, WebEx) or via embedded survey links. Nearly all targeted respondents are expected to be able to access the web links or online surveys.2 Most survey questions include closed-ended response items that can be completed quickly (within 10–15 minutes), allowing descriptive and comparative analyses.
Data collection also includes interviews and focus groups conducted via telephone or in person. With the permission of respondents, telephone interviews are audio recorded and transcribed to maximize detailed and accurate notes and to minimize the need to return to informants to clarify what was said. This will be the approach taken for the interviews and focus groups conducted using the newly proposed data collection protocols: the Tailored Services Team Focus Group Guides (versions for use with states and CIPs), the Liaison/Child Welfare Specialist Interview Protocol, the Tailored Services Jurisdiction Staff DEI Interview Protocol, and the Tribal Child Welfare Jurisdiction Staff Interview/Focus Group Guide.
A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency
As with the currently approved instruments, the additional instruments proposed for this information collection are intended to uniformly collect data that will allow for the evaluation of Center-specific processes and outcomes and to answer a set of cross-cutting evaluation questions posed by CB. CB has required the Cross-Center and Center-specific evaluators to ensure data collection is necessary and complementary. The existing information collection and Center-specific evaluation activities, as well as the development of the six additional instruments, have been coordinated to avoid potential duplication and to reduce burden on respondents. Each of the three Centers has met with the Cross-Center evaluation team and reviewed the Cross-Center data collection instruments, including the four newly proposed instruments. The instruments have been revised to address potential overlap, and the timing of data collection activities will be closely coordinated to minimize burden. When applicable, the Cross-Center and Center-specific evaluators will share data as established by written data sharing agreements (see Supporting Statement B, section B7). While Center-specific data will yield important and relevant information, it will not be sufficient to meet the Cross-Center purposes for the proposed information collection.
A5. Impact on Small Businesses
No small businesses will be involved with this information collection.
A6. Consequences of Less Frequent Collection
To improve the Centers' services and collaborate effectively to provide coordinated support to state, tribal, and territorial public child welfare agencies and CIPs, CB and its providers need timely data on the provision of services delivered by the Centers, the accessibility of services, the perceived effect and quality of the services received, and the interactions of service providers with one another. Less frequent data collection would inhibit the timely use of the information by CB and providers to improve service coordination and service quality and to potentially make decisions about service delivery.
A7. Now subsumed under 2(b) above and 10 (below)
A8. Consultation
Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and OMB regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published notices in the Federal Register announcing the agency’s intention to request an OMB review of these information collection activities. The first notice, for instruments 1-30, was published on March 19, 2021 (86 FR 14930) and provided a sixty-day period for public comment. A subsequent notice for the initial request was published on May 26, 2021 (86 FR 28360) and provided a thirty-day period for public comment. No comments were received during either period. A notice for the second set of instruments (numbers 31-37) was published on February 4, 2022 (87 FR 6566) and provided a sixty-day period for public comment. No comments were received. A subsequent notice will be published at the time of submission to OMB, providing a thirty-day period for public comment on this request to add instruments.
Numerous opportunities were provided for direct stakeholders to review the proposed instruments and to contribute to their development throughout the evaluation design phase. The Cross-Center and Center evaluation teams were responsive to stakeholders’ comments whenever possible and used their feedback in revising the data collection instruments. In preparing the OMB clearance package, instruments were pilot tested with fewer than 10 individuals who were knowledgeable about the topics addressed and who had served in positions similar to those of the potential respondents (i.e., state/tribal Child Welfare Directors; CIP Directors; current and former Center staff members, consultants, and liaisons). Following stakeholder review and pilot testing, revisions were made to instruments based on comments to improve clarity of instructions and items and, in some cases, to shorten instruments.
No external experts outside of the study were consulted.
A9. Tokens of Appreciation
No tokens of appreciation are proposed for this information collection request.
A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing
Personally Identifiable Information
As previously described, the Cross-Center and Center-specific evaluations collect PII on instruments, as identified in table A-1. PII data elements include name, jurisdiction, organization, email, phone number, position/role, highest educational degree, length of time in role, length of time with organization, years of experience in child welfare, and percentage of time in role. Telephone interviews will be audio recorded with respondent consent and transcribed to ensure accuracy. All PII obtained during the interview will be removed from the transcripts and the audio recordings will be deleted after transcription (see Supporting Statement B, section B4). In general, PII is collected to support survey administration and interviews, and to describe respondent characteristics. Some PII data fields, such as type of jurisdiction and role, will be used in analyses to explore variations in findings.
All PII collected by the three Centers and the Cross-Center evaluation team will be kept private and secure. Only select data, such as jurisdiction and professional role, will be shared with other Center or Cross-Center evaluators. Only the evaluation teams will have access to identifiers such as contact name and email address, for purposes of data collection. Cross-Center and Center-specific evaluators will store all PII contact data in separate files on their respective servers (or SharePoint sites) in password-protected, secure data systems to ensure privacy. Data collected will be coded using identification numbers, and links between identification numbers and names will be stored in password-protected, encrypted files. Identifiers will not be used in any evaluation reporting.
Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.
Assurances of Privacy
Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract, the Contractor will comply with all Federal and Departmental regulations for private information.
Data Security and Monitoring
Each Contractor has developed a Data Safety and Monitoring Plan that addresses all protections of respondents’ PII. The Contractors ensure that all of their employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract/subcontract are trained on data privacy issues and comply with the above requirements.
As specified in the evaluators' contracts, the Contractors shall use Federal Information Processing Standard (FIPS) compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. Contractors shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the FIPS. The Contractors shall ensure that this standard is incorporated into their property management/control systems and shall establish procedures to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the Contractors must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for protecting any paper records, field notes, or other documents that contain sensitive information or PII, ensuring secure storage and limits on access.
A11. Sensitive Information 3
No questions of a sensitive nature are included in these evaluations.
A12. Burden
Explanation of Burden Estimates
Table A-2 includes the estimates of response burden by instrument for the Cross-Center and Center evaluations. The total annual response burden for all instruments is estimated to be 120 hours for the Cross-Center evaluation, 419 hours for the CBCS evaluation, 379 hours for the CBCT evaluation, and 117 hours for the CBCC evaluation, for a total of 1,035 hours across all evaluations. For instruments similar to those administered as part of the 2014-2019 Evaluation of the Collaborative (OMB Numbers 0970-0484 and 0970-0494), estimates of the total number of respondents were based on historical data. Estimates of the average burden per response were based on the prior completion times of similar instruments and/or pilot tests conducted with fewer than 10 respondents as part of instrument development and refinement.
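As an illustration of how the burden estimates are derived (using the Leadership Interview – States and Territories row of table A-2 as an example):

43 respondents × 2 responses × 0.66 hours per response ≈ 57 total burden hours
57 total burden hours ÷ 3 years of approval = 19 annual burden hours
19 annual burden hours × $93.20 per hour = $1,770.80 annual respondent cost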
Estimated Annualized Cost to Respondents
After applying hourly wage estimates to burden hours in each respondent category, the current annual cost to the respondents is as follows: (1) $7,931.20 for the Cross-Center evaluation; (2) $17,732.08 for the CBCS evaluation; (3) $16,039.28 for the CBCT evaluation; and (4) $8,054.28 for the CBCC evaluation. The total annual cost to the respondents if all data collection instruments were employed in the same given year4 is $49,756.84. This cost information is based on the most current data available, from May 2019. For labor categories, the mean hourly wage for “Social Scientists and Related Workers” ($42.32) was used for respondents completing the agency surveys; “Lawyers and Judicial Law Clerks” ($68.84) was used for respondents completing the evaluation surveys for CBCC; “Management Positions, Chief Executives” ($93.20) was used for those participating in the Leadership Interviews; and “Operations Specialties Managers” ($64.69) was used for respondents completing the Collaboration Surveys. Labor categories and wage information were obtained from the following website: https://www.bls.gov/oes/current/oes_nat.htm
Table A-2. Estimates of Annualized Burden Hours and Costs
Instrument | No. of Respondents (total over request period) | No. of Responses per Respondent (total over request period) | Avg. Burden per Response (in hours) | Total Burden (in hours) | Annual Burden (in hours) | Average Hourly Wage Rate | Total Annual Respondent Cost
Cross-Center Evaluation
Outcomes of and Satisfaction with Tailored Services Survey (Intensive projects) - team lead's completion | 120 | 1 | 0.25 | 30 | 10 | $42.32 | $423.20
Outcomes of Tailored Services Survey (Brief projects) | 150 | 1 | 0.05 | 8 | 3 | $42.32 | $126.96
Leadership Interview – States and Territories | 43 | 2 | 0.66 | 57 | 19 | $93.20 | $1,770.80
Leadership Interview – CIPs | 37 | 2 | 0.66 | 49 | 16 | $93.20 | $1,491.20
Leadership Interview – Tribes | 14 | 2 | 0.75 | 21 | 7 | $93.20 | $652.40
Collaboration and Communication Survey – Center staff | 100 | 2 | 0.22 | 44 | 15 | $64.69 | $970.35
Collaboration Project Team Survey | 120 | 1 | 0.23 | 28 | 9 | $64.69 | $582.21
Tailored Services Team Focus Group Guide (for states) | 50 | 1 | 1.00 | 50 | 17 | $42.32 | $719.44
Tailored Services Team Focus Group Guide (for CIPs) | 25 | 1 | 1.00 | 25 | 8 | $42.32 | $338.56
Liaison/Child Welfare Specialist Interview Protocol | 23 | 1 | 1.00 | 23 | 8 | $64.69 | $517.52
Tailored Services Jurisdiction Staff DEI Interview Protocol | 30 | 1 | 0.75 | 23 | 8 | $42.32 | $338.56
Total – Cross-Center Evaluation | | | | 358 | 120 | | $7,931.20
Center for States (CBCS) Evaluation
Event Registration | 13,500 | 1 | 0.03 | 405 | 135 | $42.32 | $5,713.20
Brief Event Survey | 1,500 | 1 | 0.10 | 150 | 50 | $42.32 | $2,116.00
Event Follow-up Survey | 1,500 | 1 | 0.08 | 120 | 40 | $42.32 | $1,692.80
Event Poll | 300 | 1 | 0.03 | 9 | 3 | $42.32 | $126.96
Peer Learning Group Survey | 300 | 1 | 0.33 | 99 | 33 | $42.32 | $1,396.56
Learning Experience Satisfaction Survey | 975 | 1 | 0.33 | 322 | 107 | $42.32 | $4,528.24
Jurisdiction Interview Protocol | 90 | 1 | 1.00 | 90 | 30 | $42.32 | $1,269.60
Tailored Services Brief Project Survey | 150 | 1 | 0.13 | 20 | 7 | $42.32 | $296.24
Peer to Peer Event Survey | 60 | 1 | 0.08 | 5 | 2 | $42.32 | $84.64
Longitudinal Ethnographic Substudy Jurisdiction Interview | 18 | 2 | 1.00 | 36 | 12 | $42.32 | $507.84
Total – CBCS Evaluation | | | | 1,256 | 419 | | $17,732.08
Center for Tribes (CBCT) Evaluation
Request for Services Form | 100 | 1 | 1.00 | 100 | 33 | $42.32 | $1,396.56
Inquiry Form | 200 | 1 | 0.08 | 16 | 5 | $42.32 | $211.60
Tribal Demographic Survey | 60 | 1 | 0.75 | 45 | 15 | $42.32 | $634.80
Needs and Fit Exploration Tool Phase 1 | 150 | 1 | 2.00 | 300 | 100 | $42.32 | $4,232.00
Needs and Fit Exploration Tool Phase 2 (Process Narrative) | 80 | 1 | 3.00 | 240 | 80 | $42.32 | $3,385.60
Tribal Child Welfare Leadership Academy Pre-Training Self-Assessment | 240 | 1 | 0.50 | 120 | 40 | $42.32 | $1,692.80
Tribal Child Welfare Leadership Academy Post-Training Self-Assessment | 240 | 1 | 0.50 | 120 | 40 | $42.32 | $1,692.80
Universal Services Webinar Feedback Survey | 400 | 1 | 0.08 | 32 | 11 | $42.32 | $465.52
Tribal Child Welfare Jurisdiction Staff Interviews [5] | 25 | 2 | 1.00 | 50 | 17 | $42.32 | $719.44
Tribal Child Welfare Jurisdiction Staff Focus Groups | 25 | 3 | 1.50 | 113 | 38 | $42.32 | $1,608.16
Total – CBCT Evaluation | | | | 1,136 | 379 | | $16,039.28
Center for Courts (CBCC) Evaluation
CQI Workshop Feedback Survey | 240 | 1 | 0.07 | 17 | 6 | $68.84 | $413.04
Academy Feedback Survey | 600 | 1 | 0.07 | 42 | 14 | $68.84 | $963.76
Pre/Post Academy Assessment | 600 | 2 | 0.22 | 264 | 88 | $68.84 | $6,057.92
Court Improvement Program Capacity Building Feedback Survey | 53 | 2 | 0.25 | 27 | 9 | $68.84 | $619.56
Total – CBCC Evaluation | | | | 350 | 117 | | $8,054.28
Total – All Evaluations | | | | 3,100 | 1,035 | |
A13. Costs
There are no additional costs to respondents.
A14. Estimated Annualized Costs to the Federal Government
The estimated costs for the data collection for the Center-specific and Cross-Center evaluations are noted in table A-3. The estimates include the loaded costs and fees of study team staff time on instrument development, piloting, and OMB clearance; data collection; analysis; and report writing and dissemination. As applicable, the estimates also include other direct costs associated with these activities, such as costs for survey administration software (e.g., Qualtrics), conference calls, recording and transcription services, qualitative and quantitative software packages (e.g., SPSS/SAS, Dedoose/Atlas.TI/NVivo), 508 compliance, conference registration, and travel. Although the project spans 5 years6, the request is for three years of approval. If needed, a request for an extension will be submitted to complete data collection.
The annual cost to the federal government for this collection is (1) $340,563 for the Cross-Center evaluation; (2) $373,422 for the CBCS evaluation; (3) $104,863 for the CBCT evaluation; and (4) $27,356 for the CBCC evaluation.7 The total annual cost to the federal government for all activities associated with this collection is $846,204.
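Consistent with the figures in table A-3, the annual costs correspond to each evaluation's total estimated costs divided by the number of project years. For example, for the Cross-Center evaluation:

$629,726 (field work) + $643,852 (analysis) + $429,235 (publications/dissemination) = $1,702,813 in total costs
$1,702,813 ÷ 5 project years ≈ $340,563 per year

For the CBCC evaluation, which spans 4 years (see footnote 6), the total of $109,424 is divided by 4, yielding $27,356 per year.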
Table A-3. Estimated Costs to the Federal Government
Cost Category | Estimated Costs: Cross-Center | Estimated Costs: CBCS | Estimated Costs: CBCT | Estimated Costs: CBCC | Total
Field Work (Administration) | $629,726 | $936,965 | $145,643 | $15,619 | $1,727,953
Analysis | $643,852 | $768,228 | $203,901 | $62,477 | $1,678,458
Publications/Dissemination | $429,235 | $161,916 | $174,770 | $31,328 | $797,249
Annual costs | $340,563 | $373,422 | $104,863 | $27,356 [8] | $846,204
A15. Reasons for changes in burden
This is a request to add new instruments and therefore the total burden estimate shown in table A-2 has increased.
A16. Timeline
The Cross-Center and Center-specific evaluations will be implemented over 5 years. There are three primary phases of the evaluations: Phase 1: Evaluation Planning and Approval (years 1-2), Phase 2: Data Collection (years 2-5), and Phase 3: Analysis and Reporting (years 3-5). Phase 1 is primarily focused on drafting and finalizing the evaluation plan, developing and revising instruments, and obtaining OMB approval. Phase 2 (data collection) will begin immediately after OMB approval and continue throughout the 3-year OMB approval window. Specific instruments will be administered at various times during the evaluation, the frequency and timing of which are noted in table A-1. If data collection should extend beyond the 3-year OMB approval period, an extension will be sought.
Phase 3 – analysis and reporting – will occur periodically throughout the period, with CB and Centers using evaluation findings for continuous quality improvement. Findings from information collections will be summarized and tabulated in a series of briefings and reports beginning as soon as 6 months after data collection begins.
There is currently no plan to make the data collected available on the agency’s website or data.gov or in a restricted-access environment as the data are being used to guide internal decision making and are not anticipated to be of benefit to the public.
A17. Exceptions
No exceptions are necessary for this information collection.
Attachments
Appendices
Appendix 1: Legislation
Appendix 2: Evaluation Questions
Appendix 3: Cross-Center Recruitment and Reminder Language
Appendix 4: CBCS Recruitment and Reminder Language
Appendix 5: CBCT Recruitment and Reminder Language
Appendix 6: CBCC Recruitment and Reminder Language
Appendix 7: Cross-Center Recruitment Language
Appendix 8: CBCT Recruitment Language – Tribal Staff Interview-Focus Group
Appendix 9: CBCC Recruitment Language – CIP Survey
Instruments
Instrument 1: Cross-Center – Outcomes of and Satisfaction with Tailored Services
Instrument 2: Cross-Center – Brief Tailored Services Survey
Instrument 3: Cross-Center – Leadership Interview for States and Territories
Instrument 4: Cross-Center – Leadership Interview for CIPs
Instrument 5: Cross-Center – Leadership Interview for Tribes
Instrument 6: Cross-Center – Collaboration and Communication Survey
Instrument 7: Cross-Center – Collaborative Project Team Survey
Instrument 8: Cross-Center – removed from ICR
Instrument 9: CBCS – Event Registration
Instrument 10: CBCS – Brief Event Survey
Instrument 11: CBCS – Event Follow Up Survey
Instrument 12: CBCS – Event Poll
Instrument 13: CBCS – Peer Learning Group Survey
Instrument 14: CBCS – Learning Experience Satisfaction Survey
Instrument 15: CBCS – Jurisdiction Interview Protocol
Instrument 16: CBCS – removed from ICR
Instrument 17: CBCS – Tailored Services Brief Project Survey
Instrument 18: CBCS – Peer to Peer Event Survey
Instrument 19: CBCS – Longitudinal Ethnographic Substudy Jurisdiction Interview
Instrument 20: CBCT – Tribal Request for Services Form
Instrument 21: CBCT – Inquiry Form
Instrument 22: CBCT – Tribal Demographic Survey
Instrument 23: CBCT – Needs and Fit Expl Tool – Phase 1
Instrument 24: CBCT – Needs and Fit Expl Tool – Phase 2
Instrument 25: CBCT – TCWLA Pre-Training Self-Assessment
Instrument 26: CBCT – TCWLA Post-Training Self-Assessment
Instrument 27: CBCT – Universal Services Webinar Feedback Survey
Instrument 28: CBCC – CQI Workshop Feedback Survey
Instrument 29: CBCC – Academy Feedback Survey
Instrument 30: CBCC – Academy PrePost Assessment
Instrument 31: Cross-Center – Tailored Services Focus Group Guide (for states)
Instrument 32: Cross-Center – Tailored Services Focus Group Guide (for CIPs)
Instrument 33: Cross-Center – Liaison/Child Welfare Specialist Interview Protocol
Instrument 34: Cross-Center – Tailored Services Jurisdiction Staff DEI Interview Protocol
Instrument 35: CBCT – Tribal Child Welfare Staff Interview/Focus Group Guide
Instrument 36: CBCC – CIP Capacity Building Services Feedback Survey
1 Centers sometimes use a hybrid approach. For example, the CBCC offers “CQI Workshops” to bring CIP staff together in joint sessions for shared learning and peer connections (an approach that represents constituency/targeted services) and to provide individualized capacity building sessions to tailor the information and help CIPs implement a specific project (which represents a tailored service approach).
2 A hard copy of the surveys will be provided to those who cannot access the surveys online. See Supporting Statement B section B4 for further discussion.
3 Examples of sensitive topics include (but are not limited to): social security number; sex behavior and attitudes; illegal, anti-social, self-incriminating, and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships (e.g., family, pupil-teacher, employee-supervisor); mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians, and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment, WIC, or SNAP); and immigration/citizenship status.
4 The annual respondent burden and annualized cost vary by year and depend upon the data collection strategies employed.
5 The “Tribal Child Welfare Jurisdiction Staff Interviews” and the “Tribal Child Welfare Jurisdiction Staff Focus Groups” would be conducted using the same data collection instrument. The burden estimate for data collection differs depending on whether the data are collected via interview with a single person or via a focus group (with multiple informants meeting together at the same time). To account for the different burden estimates, this single instrument is presented in two separate lines in this table.
6 The exception to this is the CBCC Center-specific evaluation, which will span 4 years. In Table A-3, the CBCC annual cost = total cost/4.
7 The annual respondent burden and annualized cost vary by year and depend upon the data collection strategies employed.
8 The annual cost to the federal government for the CBCC Center-specific evaluation is lower than for the other two centers in part because CBCC collects data with only 4 relatively brief instruments.
File Type | application/vnd.openxmlformats-officedocument.wordprocessingml.document |
Author | Heidi Melz |
File Modified | 0000-00-00 |
File Created | 2022-05-20 |