Home Visiting Career Trajectories
OMB Information Collection Request
New Collection
Supporting Statement
Part A
January 2018 (Revised June 2018)
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
330 C Street, S.W.
Washington, D.C. 20201
Project Officer:
Tia Zeno
Contents
Part A. Justification
A.1 Necessity for the Data Collection
A.2 Purpose of Survey and Data Collection Procedures
A.3 Improved Information Technology to Reduce Burden
A.4 Efforts to Identify Duplication
A.5 Involvement of Small Organizations
A.6 Consequences of Less Frequent Data Collection
A.7 Special Circumstances
A.8 Federal Register Notice and Consultation
A.8.1 Federal Register Notice
A.8.2 Consultation with Experts Outside the Study
A.9 Incentives for Respondents
A.10 Privacy of Respondents
A.11 Sensitive Questions
A.12 Estimation of Information Collection Burden
A.13 Cost Burden to Respondents or Record Keepers
A.14 Estimate of Cost to the Federal Government
A.15 Change in Burden
A.16 Plan and Time Schedule for Information Collection, Tabulation, and Publication
A.17 Reasons Not to Display OMB Expiration Date
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions
References
The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS), in collaboration with the Health Resources and Services Administration (HRSA), seeks approval to collect information from staff in home visiting programs receiving funding through the federal Maternal, Infant, and Early Childhood Home Visiting (MIECHV) program. ACF is interested in collecting information about the state of the home visiting workforce, career trajectories of home visitors, and strategies for building a pipeline of highly-qualified home visitors and supervisors. A highly-qualified home visiting workforce is a critical component of ensuring that home visiting programs achieve desired outcomes, but little is known about the current state of the workforce—why they enter home visiting and why they leave—and the type of professional development that supports them. The proposed study will provide important information on these topics and inform the systems that are required for cultivating the most effective workforce.
Specifically, through the proposed information collection, the researchers will obtain quantitative and qualitative information about the characteristics, qualifications, and career trajectories of home visiting staff, as well as the state of the professional development system that supports home visitors. The study will include a national survey of the MIECHV workforce; case studies of eight distinct sites that vary in geography, population demographics, labor markets, and home visiting program offerings; and interviews with home visiting training and technical assistance experts. The survey will collect data from program managers, supervisors, and home visitors; the case studies will include in-depth interviews with home visiting program managers and supervisors, and focus groups with home visitors.
Please see a memorandum providing additional justification for the scope and for the practical utility of this proposed study in response to OMB/OIRA comments (SUPPLEMENTAL ATTACHMENT).
Home visiting programs are an important policy lever for improving child development, reducing child abuse and neglect, reducing maternal depression, and reducing maternal and child mortality (Ammerman et al. 2013; Avellar and Supplee 2013; Olds et al. 2014; Peacock et al. 2013). Originally developed both domestically and internationally in the 19th century as a way to educate new mothers in poor areas, early childhood home visiting programs expanded dramatically in the last few decades as a way to improve overall family wellbeing and serve the hardest-to-reach families. Following intensive evaluation of multiple program models, we now have reliable evidence that these programs provide important benefits to mothers, children, and more recently, fathers (Sama-Miller et al. 2016; Sandstrom et al. 2015). However, the evidence of effectiveness is mixed across program models and populations served (Avellar et al. 2016).
The MIECHV program, authorized by the Social Security Act, Title V, Section 511 (42 U.S.C. 711), is a federal initiative that provides grants to all 50 states, the District of Columbia, U.S. territories, and tribal entities to provide evidence-based home visiting services to support at-risk families. States, territories, and tribal grantees decide which evidence-based models to implement, and local implementing agencies (LIAs) administer services to individual families.
A major challenge in the field is the wide range of target outcomes. Programs attempt to reduce child maltreatment and improve maternal and child health, child development, parenting practices, and family economic self-sufficiency. The breadth of competencies required of the home visiting workforce to reach those outcomes is extensive. Moreover, home visitors often work within the context of homes with disproportionately high levels of mental health issues (Ammerman et al., 2010), substance use (Dauber et al., 2017), and domestic violence (Davis, James, & Stewart, 2010; Eckenrode et al. 2000)—outcomes targeted in the MIECHV program. Because of the intensity of job demands, quality training and support for home visitors and supervisors are critical.
Additionally, the stress of serving a high-needs population, coupled with low pay and limited benefits, may lead to staff turnover among home visitors. For example, the 2015 program information report (PIR) for Head Start grantees found that nearly one in five Early Head Start home visitors left the program during that year. High turnover can be costly to home visiting agencies that need to spend resources finding and training new staff. Turnover can also cause discontinuity for clients. The issue of turnover can be even more pronounced in rural communities where home visitor positions may be harder to fill.
Across home visiting programs nationally, the field knows very little about the home visiting workforce and their career trajectories: why they enter the home visiting field, why they stay in or leave the field, their background and qualifications, job satisfaction and career goals, and work environment.
In 2016, ACF in collaboration with HRSA awarded the Urban Institute a contract to conduct a study that addresses some of these questions about the home visiting workforce.
Products resulting from this research will describe the MIECHV-supported home visiting workforce, including information about the size and demographics of the workforce, qualifications and backgrounds of home visitors, their roles and responsibilities as home visitors, factors associated with recruitment and retention of home visiting staff, details about professional development opportunities available to and desired by home visitors, and information about potential career trajectories. These findings will be beneficial to home visiting program managers and local, state, and federal government agencies seeking to strengthen the home visiting workforce.
There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection at the discretion of the agency.
The purpose of the proposed research is to provide information on the state of the MIECHV-supported home visiting workforce and career trajectories of home visitors, as well as to provide recommendations on strategies to build a pipeline of highly-qualified home visitors and supervisors.
The study approach will include a national survey of the MIECHV-supported workforce as well as site visits to home visiting programs in eight distinct sites. LIAs with MIECHV funding may also have other home visitors whose positions are funded by another source who would also be eligible to participate, so the final survey participants will likely include individuals whose positions are not funded by a MIECHV grant. Each site visit will include in-depth interviews with key informants (home visiting program managers and supervisors) and focus groups with home visiting staff. Additional in-depth interviews will be conducted with home visiting training and technical assistance experts either in-person or by phone.
Upon OMB clearance of the proposed information collection, data collection will take place over several months. Both the survey and the site visits are expected to be implemented during the same time period. Data collection will be staggered to the extent possible, starting with the survey, to avoid priming survey responses based on focus group and interview discussions. Following data collection, the research team will conduct data analysis and synthesis, and will release a final report and four research briefs describing findings.
Exhibit 1 lists the research questions that this study will address and indicates the data sources for each question.
Exhibit 1. Proposed Research Questions and Data Sources to Address Them
| Research Questions | Survey | Focus Groups | Key Informant Interviews[1] |
| 1. What are the characteristics of home visitors and their supervisors, including their demographics, qualifications, and employment history? | X | | |
| 2. What are the characteristics of home visiting jobs? What schedules do staff work? What is the quality of home visiting jobs in terms of job flexibility, control, and predictability of schedules? What percent of their time do home visitors spend with families and completing other tasks? How much do staff earn? How do job earnings vary by degree and position? What employee benefits do home visiting programs offer their staff? How do employee compensation and benefits compare to other fields? | X | X | |
| 3. What are the career pathways of home visitors and supervisors? Why do home visitors enter this field? What are home visitors’ career goals and perceptions of advancement opportunities? What is the level of worker job satisfaction? What factors contribute to the recruitment and retention of home visitors? What factors are associated with turnover? | X | X | X |
| 4. What strategies do programs use to recruit and retain staff? What are program managers’ experiences recruiting qualified job candidates? What competencies are they looking for? What positions are challenging to fill and why? | X | | X |
| 5. What opportunities and challenges exist for professional development and training? What are the skills and knowledge of the workforce? What opportunities exist for professional development, training, and technical assistance? What training needs does the workforce perceive? Where are perceived gaps in training and supports? | X | X | X |
1. The proposed study includes two separate sets of key informant interviews. One set of key informant interviews will be conducted (primarily in person) with home visiting program managers as part of the case study component of the study. The second set of interviews will be conducted (primarily via phone) with professional development experts.
The study design has three major components: a web-based survey, case studies involving key informant interviews and focus groups, and a separate set of key informant interviews with professional development experts.
Survey
There will be two survey instruments: one version for program managers or a designee who is responsible for managing program operations and overseeing staff hiring and training (INSTRUMENT 1) and another version for home visitors and supervisors (INSTRUMENT 2), which can be linked for staff in the same program. The surveys cover multiple domains. The program manager survey asks about staffing, funding sources, staff recruitment and retention, training and program management, and characteristics of families served. The staff survey for home visitors and supervisors asks about career trajectories, education and training, work schedules, compensation and benefits, job quality, work environment and supervision, interactions with families, and demographic characteristics. The surveys will take an estimated 20-23 minutes each to complete (longer, on average, for experienced home visitors than for program managers and supervisors).
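As a purely illustrative note on the linking described above, the following minimal sketch joins staff survey records (INSTRUMENT 2) to the corresponding program manager record (INSTRUMENT 1) on a shared program identifier. The column names (e.g., program_id) are hypothetical placeholders, not items from the instruments, and the actual linkage approach would be determined by the survey data collection team.

```python
import pandas as pd

# Hypothetical extracts from the two instruments; "program_id" stands in for
# whatever common identifier links staff responses to their program's record.
manager_survey = pd.DataFrame({
    "program_id": ["P01", "P02"],
    "num_home_visitors": [6, 11],          # example program-level item (INSTRUMENT 1)
})
staff_survey = pd.DataFrame({
    "program_id": ["P01", "P01", "P02"],
    "role": ["home visitor", "supervisor", "home visitor"],
    "years_in_field": [2, 8, 5],           # example staff-level item (INSTRUMENT 2)
})

# Attach program-level context to each staff record for linked analysis.
linked = staff_survey.merge(manager_survey, on="program_id", how="left")
print(linked)
```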
The research team drew on existing workforce surveys and home visiting research surveys to the extent possible to allow for benchmarking of survey results. The team also developed new items on many novel topics not well covered in other surveys. The survey instrument was pre-tested with home visiting program staff (fewer than 10 respondents) and reviewed by an external Technical Work Group and federal staff (details about the pretesting process are included in Supporting Statement B). The research team will program the final survey instrument in Qualtrics web-based software.
The team will monitor nonresponse, send email reminders, and make targeted recruitment efforts to boost participation and lower nonresponse bias at both the initial stage of the program manager survey and then the second stage of the staff survey. These efforts will help ensure the final sample is representative of the distribution of staff across geography (states, territories, and tribal grantees) and across the 11 evidence-based home visiting models implemented with MIECHV funding. As needed, data will be weighted to adjust for nonresponse.
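To illustrate the kind of adjustment referenced above, the sketch below shows a simple cell-based nonresponse weighting step. It is a minimal example only, not the study’s weighting plan: the weighting cells (grantee type by home visiting model), column names, and values are hypothetical and would be defined during analysis.

```python
import pandas as pd

# Hypothetical sample frame with a 0/1 response indicator; grantee types and
# model abbreviations are illustrative placeholders.
frame = pd.DataFrame({
    "grantee_type": ["state", "state", "state", "tribal", "tribal", "territory"],
    "model":        ["NFP",   "NFP",   "HFA",   "HFA",    "HFA",    "PAT"],
    "responded":    [1,       0,       1,       1,        0,        1],
})

# Response rate within each weighting cell (here, grantee type by model).
cell_response_rate = frame.groupby(["grantee_type", "model"])["responded"].transform("mean")

# Respondents in under-responding cells are weighted up by the inverse of their
# cell's response rate; nonrespondents receive a weight of zero.
frame["nonresponse_weight"] = frame["responded"] / cell_response_rate

print(frame)
```

In practice, the cells would typically align with the dimensions monitored for representativeness (geography and evidence-based model), and extreme weights would generally be reviewed or trimmed.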
Case Studies
Case studies of eight selected sites across the US will provide in-depth information on the career pathways of home visitors and supervisors, and the factors that support or challenge staff recruitment and retention. Each of the eight selected sites will include up to five LIAs, for a potential total of 40 LIAs in the case study sample.
Once states and a preliminary set of LIAs have been identified (sampling methods are summarized in Supporting Statement Part B), the team will begin reaching out to gauge programs’ willingness to participate in key informant interviews and to help recruit their staff for focus group participation. The team will inform MIECHV state administrators in selected sites about the study and rely on the assistance of local stakeholders and federal partners for recruitment guidance. The team will strive to limit burden on programs that are often asked to participate in research projects and will be sensitive to considerations about how to contact staff and how to pursue the most appropriate channels (e.g., first reaching out to state grantee leads). Upon OMB approval, the research team will reach out to LIAs to formally invite them to participate and to schedule site visits. The team will conduct up to ten key informant interviews (INSTRUMENT 3) per site visit (a maximum of two per LIA, typically staff in the positions of program director, manager, or supervisor), each lasting about 90 minutes. Additionally, the team will conduct up to five focus groups (INSTRUMENT 4) per site visit (one per LIA), with a maximum of 12 participants in each group, each lasting two hours. Focus group participants will be asked to complete a short, anonymous questionnaire (INSTRUMENT 5).
The LIAs selected for the case studies will also be invited to participate in the survey, to ensure that all MIECHV-supported programs have an opportunity to participate in the study. Recruitment efforts will be coordinated so LIAs are contacted about the case studies in advance of survey invitations being sent. Recruitment phone calls for the case studies will notify program managers about the upcoming opportunity to participate in the survey and encourage survey completion before the site visit date. For the approximately 40 LIAs in the case study sample, survey completion will be closely monitored to stagger survey and case study data collection. Home visitors who agree to participate in focus groups will be asked to complete the survey before attending the focus group. Any LIA that declines case study participation can still participate in the survey and will be encouraged to do so.
Key Informant Interviews on Professional Development
In addition to the case studies described above, in-depth interviews with key informants will provide information on the professional development system for home visitors (INSTRUMENT 6). Approximately thirty (30) interviews will be conducted with experts on the topic of training and technical assistance for home visitors, each lasting about 90 minutes. Experts may be employed at universities, policy research institutes, non-profit organizations, or home visiting agencies. Most interviews will be conducted by phone. Some interviews may be conducted in-person in the event that the case study team can arrange to meet with experts while on a site visit in their local area.
The survey will be sent to respondents via email, and respondents will complete the survey online, which presents a significantly lower burden than a paper version of the same survey.
For the case study component, the data collection team will travel to participants’ office locations to reduce the travel burden on participants. All interviews and focus groups will be audio recorded (with participant consent, see ATTACHMENT A AND ATTACHMENT B), which will allow the notetaker to easily fill in information they may have missed while taking notes. Participants can speak freely and not worry about speaking too fast or repeating themselves. Additionally, the notetaker will take notes using a laptop, allowing the notetaker to quickly and accurately capture conversation in both interviews and focus groups.
The data requirements for this study have been carefully reviewed to determine whether the needed information is already available. Efforts to identify duplication included a systematic literature review and discussions with knowledgeable experts. This extensive background research concluded that no existing data source can provide the data needed to answer the study’s research questions. There is very limited information available about the state of the home visiting workforce, especially related to workforce characteristics, factors associated with turnover, and career trajectories. This study will provide an in-depth investigation into these issues and others related to the home visiting workforce in MIECHV-supported programs.
Some organizations involved in this research are small not-for-profit organizations. The research team will minimize burden for individuals within these organizations by asking only about information that is directly related to the study’s aims and reducing the time needed for participation wherever possible. The survey is web-based and participants will be able to complete it at a time that is convenient to them. In-depth interviews and focus groups will be held either on-site (at the organization being studied) or in close geographic proximity. Interviews will be scheduled at a time and date that is convenient for the interviewee, within a window of times available to the research team, and focus groups will be held during both morning and afternoon time periods to facilitate scheduling for participants.
Data collection will occur only once for each of the study components. Reducing or eliminating any data collection component would reduce the researchers’ ability to collect information about home visitors, to answer the proposed research questions, to achieve the government’s goals for this project, and to disseminate findings more widely.
There are no special circumstances for the proposed data collection.
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on September 27, 2017 (Document Number 2017-20676; Page 45029) and provided a 60-day period for public comment. No comments were received. A copy of this notice is included in ATTACHMENT C.
Outside experts were consulted as part of the study design process. In December 2016, the research team held a webinar for evidence-based home visiting model developers to inform them about the proposed study and to solicit feedback on the proposed study design. Representatives from six model developers participated in the discussion.
In February 2017, the research team held an in-person meeting with its Technical Work Group (TWG). The TWG is made up of seven home visiting experts with a range of experiences including workforce research, home visiting program implementation, and case study and survey methodological expertise. The TWG members include:
Claire Dunham, Senior Vice President of Programs & Training, Ounce of Prevention Fund
Jon Korfmacher, Associate Professor, Erikson Institute
Lili McGuinness, Program Director, Welcome Baby
Jordon Peugh, Vice President, Health Policy & Public Opinion Research, SSRS
Jessica E. Sowa, Associate Professor, School of Public and International Affairs, University of Baltimore
Jodi Whiteman, Director of Professional Development, ZERO TO THREE
Paula Zeanah, Director of Research for the Picard Center and the Lafayette General Medical Center, University of Louisiana at Lafayette
The research team gathered extensive feedback from these consultations, and the recommendations from experts helped shape the research design, especially with regard to the survey and case study instruments.
No monetary incentive will be offered to survey, interview, or focus group participants.
Survey: The survey itself does not collect any personally identifiable information (PII), but some information could be considered sensitive, such as reports of job satisfaction, ratings of supervisors, and intentions to quit. In order to survey home visiting staff, Urban Institute researchers will have access to PII—specifically, work email addresses—that alone or combined with publicly available data could be used to identify individual persons. The PII will not be copied or disclosed to anyone outside the survey data collection team. Respondents will be informed that their contact information will not be shared with anyone outside the survey data collection team. Respondent email addresses will not be linked to individual survey responses.
Case Study: Prior to the start of each interview and focus group, the researchers will assure the respondents that the information provided will be kept private to the extent permitted by law. Specifically, none of the information obtained during the course of the study will be disclosed in such a way that individuals can be identified by anyone outside the research team, and the respondents will not be quoted by name in dissemination activities, such as the final research report, research briefs, federal briefings, and conference presentations.
All interview and focus group respondents will be given and asked to sign informed consent forms before the start of the interview or focus group.
The organizations participating in the study will not be identified by name in any reports or other dissemination activities, and descriptive information that would allow an organization to be identified will be limited. For example, some home visiting models are implemented in only a few locations, so disclosure of the mix of models interviewed in a given site might disclose the location. Other information will not be shared with anyone other than the research staff assigned to the study, all of whom will be required to sign the Urban Institute’s Staff Confidentiality Pledge. See ATTACHMENT D.
This study is also under the purview of the Urban Institute’s Institutional Review Board (IRB), which is registered under Federalwide Assurance number 00000189, indicating it adheres to the requirements in the HHS Protection of Human Subjects regulations at 45 CFR Part 46. All data collection and security procedures described in this package received IRB approval effective April 17, 2017. See ATTACHMENT E for a copy of the IRB Notice of Approval. To receive IRB approval for this study, the data collection effort must adhere to the following principles:
Subjects are informed of the nature of the research and how it will be used, and their consent either obtained or explicitly waived, where risks to them are determined to be minimal.
Adequate provision is made to protect the privacy of subjects and to maintain privacy of data, where promised and as appropriate.
Risks to subjects are minimized to the extent possible within research designs.
Risks to subjects (from the research) are reasonable in relation to anticipated benefits (of the research).
The selection of subjects is as equitable as possible (the burdens and benefits of the research are fairly distributed) and particular attention is paid to research involving vulnerable populations and protected health information.
Information will not be maintained in a paper or electronic system from which it is actually or directly retrieved by an individual’s personal identifier.
The interview and focus group protocols do not cover sensitive topics; however, some individuals may be reluctant to share certain personal details in a public setting such as a focus group. For example, in focus group discussions, the researcher will ask about elements of job satisfaction and job stressors, which some individuals might find sensitive to discuss with colleagues in the group. All participants will be informed that they can choose not to answer any question and can stop the interview or leave the focus group at any time. Respondents will also be reminded that their responses will be kept private, to encourage candid responses.
Similarly, the survey asks home visitors and supervisors about job satisfaction, future career plans including possible intent to leave, and working environment, which might be sensitive to some. The program manager survey does not cover any potentially sensitive topic areas. Although survey respondents are encouraged to answer every question, they are informed that they can choose to refuse a question or stop participation at any time.
All respondents are informed that their participation is voluntary.
Exhibit A-1 shows estimated burden of the information collection, which will take place within a one-year period.
Program Manager Survey (INSTRUMENT 1): Survey with up to 700 respondents at an average length of 20 minutes. There are 705 LIAs and 19 tribal grantees funded by MIECHV who will be invited to participate.
Home Visitor and Supervisor Survey (INSTRUMENT 2): Survey with up to 3,000 respondents at an average length of 23 minutes. HRSA estimates 3,074 home visitor FTEs and 711 supervisor FTEs are funded by MIECHV in 2017.
Key Informant Interview Guide for Management and Supervisory Staff (INSTRUMENT 3): Interviews with 80 home visiting program managers and supervisors (up to 10 staff at each of 8 sites), at an average length of 90 minutes.
Focus Group Moderator’s Guide (INSTRUMENT 4): Focus groups with up to 12 home visitors per group, with up to five focus groups at each of 8 sites (N=480), at an average length of 120 minutes.
Self-Administered Questionnaire for Focus Group Participants (INSTRUMENT 5): Self-administered questionnaire for each focus group participant (N=480), at an average of two minutes per response.
Key Informant Interview Guide for Training and Technical Assistance Experts (INSTRUMENT 6): Interviews with 30 key informants, including training and technical assistance experts, at an average length of 90 minutes.
Exhibit A-1: Estimated Burden in Annualized Hours and Costs
| Instrument | Total/Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Annual Burden Hours (rounded) | Average Hourly Wage ($) | Total Annual Cost ($) |
| Program manager survey | 700 | 1 | 0.33 | 231 | $33.38 | $7,710.78 |
| Home visitor and supervisor survey | 3,000 | 1 | 0.38 | 1,140 | $26.34 | $30,027.60 |
| Key informant interview guide – management and supervisory staff | 80 | 1 | 1.5 | 120 | $33.38 | $4,005.60 |
| Focus group moderator’s guide | 480 | 1 | 2 | 960 | $19.30 | $18,528.00 |
| Self-administered questionnaire for focus group participants | 480 | 1 | 0.03 | 14 | $19.30 | $270.20 |
| Key informant interview guide – training and technical assistance experts | 30 | 1 | 1.5 | 45 | $34.14 | $1,536.30 |
| Estimated Annual Burden Sub-total | | | | 2,510 | | $62,078.48 |
Total Annual Cost:
The estimated total annualized cost burden to respondents is based on the maximum expected burden hours for each instrument and an estimated hourly wage rate for each data collection instrument, as shown in the two right-most columns of Exhibit A-1. The rate applied to the home visitor and supervisor survey ($26.34) is the average of the mean hourly wages for “Social and Community Service Managers” and “Community Health Workers”; the wage assumptions are as follows (a worked example of the burden arithmetic appears after this list):
an assumed hourly wage of $33.38 for program managers, based on the mean hourly wage for “Social and Community Service Managers”, as reported in the May 2015 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates, http://www.bls.gov/oes/current/oes119151.htm
an assumed hourly wage of $19.30 for home visitors, based on a mean hourly wage for “Community Health Workers,” as reported in the May 2015 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates, http://www.bls.gov/oes/current/oes211094.htm.
For training experts, an average hourly salary of approximately $34.14 is assumed based on the Bureau of Labor Statistics (BLS) estimates for median hourly wages for management, professional and related workers in the nonprofit industry, as reported by U.S. Department of Labor, Bureau of Labor Statistics “Nonprofit pay and benefits: estimates from the National Compensation Survey.” 2016. https://www.bls.gov/opub/mlr/2016/article/nonprofit-pay-and-benefits.htm
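As an illustration only, the short sketch below reproduces the burden arithmetic in Exhibit A-1 under the assumptions above: rounded annual burden hours equal respondents times responses per respondent times average burden hours per response, and annual cost equals the rounded burden hours times the assumed hourly wage (with the home visitor and supervisor survey valued at $26.34, the average of the program manager and home visitor rates).

```python
# Reproduces the Exhibit A-1 burden arithmetic; instrument names are shortened
# and all figures are taken directly from the exhibit.
instruments = [
    # (instrument, respondents, responses per respondent, hours per response, hourly wage)
    ("Program manager survey",                           700, 1, 0.33, 33.38),
    ("Home visitor and supervisor survey",              3000, 1, 0.38, 26.34),
    ("Key informant interviews (managers/supervisors)",   80, 1, 1.50, 33.38),
    ("Focus group moderator's guide",                     480, 1, 2.00, 19.30),
    ("Self-administered focus group questionnaire",       480, 1, 0.03, 19.30),
    ("Key informant interviews (T/TA experts)",            30, 1, 1.50, 34.14),
]

total_hours = 0
total_cost = 0.0
for name, respondents, responses, hours_per_response, wage in instruments:
    annual_hours = round(respondents * responses * hours_per_response)  # rounded, as in the exhibit
    annual_cost = annual_hours * wage
    total_hours += annual_hours
    total_cost += annual_cost
    print(f"{name}: {annual_hours:,} hours, ${annual_cost:,.2f}")

print(f"Estimated annual burden subtotal: {total_hours:,} hours, ${total_cost:,.2f}")
# Subtotal matches Exhibit A-1: 2,510 hours and $62,078.48.
```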
There are no additional costs to respondents or record keepers.
The total cost to the federal government of implementing the proposed data collection activity and data analysis is $616,336 and is outlined in the proposed budget developed for this study. This amount includes all costs related to study design, instrument development, information collection, and analyses of data. This is a one-year information collection request.
This is a new data collection.
Data collection will begin following OMB approval. Findings from analysis of the collected information will be presented by the research contractor in a final research report, expected in late 2018. This report will be publicly disseminated through OPRE and the Urban Institute, and analyses will likely be submitted for publication in peer-reviewed professional journals. Findings will also be presented at research and practitioner conferences.
| Expected Time Period | Activity |
| During OMB review period | Preparation for fieldwork |
| 0–3 months following OMB approval | Survey recruitment and data collection; key informant interviews with training and technical assistance experts |
| 1–3 months following OMB approval | Data collection for case studies; additional key informant interviews with training and technical assistance experts |
| 4–6 months following OMB approval | Data analysis |
| 7–8 months following OMB approval | Write final report and research briefs |
All instruments will display the expiration date for OMB approval.
No exceptions are necessary for this information collection.
Ammerman, R. T., Putnam, F.W., Bosse, N. R., Teeters, A. R., & Van Ginkel, J. B. (2010). Maternal depression in home visitation: A systematic review. Aggression and Violent Behavior, 15(3), 191-200.
Ammerman, R. T., Putnam, F. W., Altaye, M., Teeters, A. R., Stevens, J., & Van Ginkel, J. B. (2013). Treatment of depressed mothers in home visiting: Impact on psychological distress and social functioning. Child Abuse & Neglect, 37(8), 544-554.
Avellar, S. A., & Supplee, L. H. (2013). Effectiveness of home visiting in improving child health and reducing child maltreatment. Pediatrics, 132(Supplement 2), S90-S99.
Avellar, S., Paulsell, D., Sama-Miller, E., Del Grosso, P., Akers, L., & Kleinman, R. (2016). Home visiting evidence of effectiveness review: Executive summary. Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. Washington, DC.
Dauber, S., Feryayorni, F., Henderson, C., Hogue, A., Nugent, J., & Alcantara, J. (2017). Substance use and depression in home visiting clients: Home visitor perspectives on addressing clients’ needs. Journal of Community Psychology, 45(3), 396-412.
Davis, L., James, L., & Stewart, K. (2010). Realizing the promise of home visitation: Addressing domestic violence and child maltreatment. Family Violence Prevention Fund.
Eckenrode, J., Ganzel, B., Henderson, C. R., Smith, E., Olds, D. L., Powers, J., Cole, R., Kitzman, H., & Sidora, K. (2000). Preventing child abuse and neglect with a program of nurse home visitation: The limiting effects of domestic violence. Journal of the American Medical Association, 284(11).
Olds, D. L., Kitzman, H., Knudtson, M. D., Anson, E., Smith, J. A., & Cole, R. (2014). Effect of home visiting by nurses on maternal and child mortality: Results of a 2-decade follow-up of a randomized clinical trial. JAMA Pediatrics, 168(9), 800-806.
Peacock, S., Konrad, S., Watson, E., Nickel, D., & Muhajarine, N. (2013). Effectiveness of home visiting programs on child outcomes: A systematic review. BMC Public Health, 13(1), 1.
Sama-Miller, E., Akers, L., Mraz-Esposito, A., Avellar, S., Paulsell, D., & Del Grosso, P. (2016). Home visiting evidence of effectiveness review: Executive summary. Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. Washington, DC.
Sandstrom, H., Gearing, M., Peters, H. E., Heller, C., Healy, O., & Pratt, E. (2015). Approaches to Father Engagement and Fathers’ Experiences in Home Visiting Programs.