Virtual Human Service Delivery under COVID-19: Scan of Implementation and Lessons Learned
ASPE Generic Information Collection Request
OMB No. 0990-0421
Supporting Statement – Section A
Submitted: June 1, 2020
Program Official/Project Officer
Pamela Winston
Social Science Analyst
U.S. Department of Health and Human Services
Office of the Assistant Secretary for Planning and Evaluation
200 Independence Avenue, SW, Washington, D.C. 20201
202-774-4952
Section A – Justification
Circumstances Making the Collection of Information Necessary
As the COVID-19 pandemic unfolded, government and public health officials asked the public to engage in physical distancing to limit viral transmission. Many human services agencies shifted from in-person services to providing services primarily by videoconference, over the telephone, or otherwise online. This has created challenges for a field where services such as child protection, home visiting, and case management are often delivered face to face. Many human services organizations quickly tried to establish or scale up virtual service delivery systems. At the same time, agency staff indicate that because of the economic challenges accompanying the pandemic, they often saw a sharp increase in service demand.1
Many human services agencies operated without much information to guide them. While substantial research exists on telehealth service delivery,2 very little research has been conducted to date on the virtual delivery of human services. A number of rapid-response efforts have been undertaken in the past few months to develop and distribute guidance quickly to human services agencies shifting from an in-person to a virtual environment, often drawing on telehealth research.3 But operating critical human services delivery systems virtually and effectively, both now and into the future, is an important and complex undertaking that will require sustained attention and ongoing innovation.
We expect this study to offer a more systematic exploration of virtual human services delivery than is currently available elsewhere. Working with a contractor, Mathematica, to assist with data collection, ASPE will conduct a scan of a purposively selected sample of human services programs to help HHS identify preliminary lessons about delivering human services virtually (primarily via videoconferencing, telephone, text, email, instant messaging, the web, etc.). ASPE expects to identify and leverage preliminary lessons from the extensive transition to virtual services in Spring 2020 due to the COVID-19 pandemic and related state and local public health orders. Our goal is to explore a range of programs and services where human services are being offered largely virtually for the first time in response to the pandemic.
We aim to understand how the services are provided and to identify successes, challenges, and preliminary lessons learned from this shift in service approach. We are particularly interested in perceptions of effectiveness: whether stakeholders understand services to be delivered less well than, as well as, or better than in person, and the reasons for this perception. We expect the scan to focus on a range of human services (e.g., child welfare, early learning and development, Temporary Assistance for Needy Families, home visiting, human trafficking, intimate partner violence, family strengthening, and fatherhood programs).
We anticipate focusing on programs provided by about 18 community-level grantees or agencies in about six states, serving a range of populations. Programs will vary in how much they have relied on in-person versus virtual service delivery in the past (e.g., child welfare services have tended to emphasize face-to-face contact, while some TANF programs have included virtual elements). We seek perspectives from key informants including administrators and managers at the state, local, Tribal, and community levels; frontline staff; and clients/families. The main data collection methods are expected to be semi-structured key informant conversations and focus groups.
We understand that the data we collect from the purposive sample of programs will be qualitative and not generalizable across HHS or other federal programs or across locations, but can provide illustrative information for others considering ways to improve their delivery of human services by virtual methods. This project will help to identify early lessons about virtual human services delivery across a range of programs, and is likely to be of interest to policymakers, program administrators and managers, and frontline service providers. It will also help ASPE develop future research priorities.
Purpose and Use of the Information Collection
Recent ASPE-sponsored explorations of virtual methods in human services identified several important research gaps, including documentation of current program approaches, successes, and challenges; identification of program components that can best be delivered virtually (or, conversely, that cannot be delivered effectively); and identification of potential measures/indicators of service effectiveness.
This study will address some of these and other important gaps, gathering stakeholder perceptions to better inform our understanding of these issues. Our key research questions for the study program sites include:
What specific human services are delivered virtually, and how (service component, mode, etc.)?
What have been the major successes to date in delivering human services virtually?
What have been the major challenges with delivering human services virtually?
What is the evidence of effectiveness for virtual human services compared with in-person delivery?
What lessons do stakeholders identify about delivering human services virtually?
To what extent do stakeholders view virtual human services delivery as part of a long-term trend versus a short-term response to the pandemic?
What priorities for future research around virtual human services delivery do stakeholders identify?
Use of Improved Information Technology and Burden Reduction
Qualitative semi-structured discussions will be conducted by videoconference and/or phone. This should minimize the burden on individual respondents and allow us to schedule discussions at the convenience of respondents.
Efforts to Identify Duplication and Use of Similar Information
In preparation for writing a recent (March 2020) ASPE brief on virtual case management considerations and resources for human services programs, we conducted a literature scan to identify previous research on this issue. We located no comprehensive studies of the use of virtual methods in human services, and only a limited number of studies addressing specific program areas, such as home visiting.4 Experts consulted for a recent (April 28, 2020) ASPE-sponsored panel, Virtual Learning Exchange: Considerations for Successful Virtual Case Management in Human Services, confirmed the dearth of research in this area, as have others with whom we have consulted within the Administration for Children and Families (ACF) at HHS, the American Public Human Services Association, and other national organizations.
Impact on Small Businesses or Other Small Entities
No small businesses will be involved in this data collection.
Consequences of Collecting the Information Less Frequently
This request is for a one-time data collection. Collecting the information less frequently would mean collecting no data at all.
Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
There are no special circumstances associated with this information collection. This request fully complies with 5 CFR 1320.5, and participation will be voluntary.
Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
This data collection is being conducted using the Generic Information Collection mechanism through ASPE (OMB No. 0990-0421); therefore, no Federal Register notice is required.
Explanation of Any Payment or Gift to Respondents
Respondents to this data collection fall into two categories: professionals (program administrators/managers and frontline staff, such as caseworkers or home visitors) who will be responding in their professional capacities (about 72), and clients, including adults and young adults participating in human services programs such as Head Start (as parents) or TANF (a total of about 42). The professional respondents will not be compensated. For the nonprofessional respondents, we plan to provide a $25 Visa gift card to thank them for their participation and recognize the value of their lived experience. These respondents are voluntarily sharing their important perspectives with us, and the gift cards communicate that we value their time. While professional respondents will be talking with us as part of their professional roles, the nonprofessional respondents are not otherwise compensated for their time in providing their perspectives.
Evidence shows that remuneration bolsters recruitment and attendance at focus groups. Many of these respondents are parents who may be working or otherwise busy, and most are likely to be low income and thus face additional barriers to participating in a focus group. To ensure that the incentive is not coercive, consent scripts indicate, and interviewers will be trained to make very clear, that participants who choose to leave the group early or who prefer not to respond to certain questions will still receive the $25 gift card. Mathematica staff will track the gift cards by completing a log when they send each card to a participant.
Assurance of Confidentiality Provided to Respondents
Data collected in this study will be kept private to the extent allowed by law. The Privacy Act does not apply to this data collection. State and local program administrators/managers and practitioners who answer questions will be answering in their official roles and will not be asked about, nor will they provide, sensitive individually identifiable information.
With respect to nonprofessional respondents, our contractor will collect only the identifying information needed to schedule interviews and to send gift cards in appreciation of participation. The names of study participants will be kept private to the extent allowed by law and will not be used in the analysis or in writing up the study findings, regardless of whether the names were mentioned in an interview and therefore became part of the transcripts. All focus group participants will be asked to choose a pseudonym for purposes of the group discussion and will be referred to by that name, or by generic labels identifying only their respondent type (e.g., Head Start parent or former foster youth). They will be asked to complete a short demographic form at the beginning of the focus group (Attachment C, described further in Supporting Statement B), using their pseudonym when completing that form. In addition, no actual names will be used when documenting the interview notes for analysis.
This data collection is being reviewed by our contractor’s external IRB.
Justification for Sensitive Questions
No information of a personal or sensitive nature will be collected from professional respondents.
For the nonprofessional respondents, participation in human services can in some cases, by its nature, be personal and emotional. However, the perspectives of clients are particularly valuable because they are the most directly affected by changes in the policies and practices affecting human services delivery. We will recruit nonprofessional respondents by publicizing the focus groups with the local agencies and programs being studied, through strategies such as email, digital posters or flyers, and word of mouth, making clear that participation or nonparticipation has no bearing on the services they receive. Potential focus group participants will be reminded multiple times during the recruitment process, and again prior to the discussions themselves, that participation is entirely voluntary. They will also be reminded that during the group discussions they can decline to respond to any question and may leave the group at any time. We believe this will minimize the intrusiveness experienced by these participants.
We do not intend to ask about the specifics of participants' personal situations; instead, we will focus on how they experienced the systems' delivery of services using virtual methods and how this compares with similar program services they received in person before the programs adopted virtual human service delivery (pre- and post-COVID).
Estimates of Annualized Burden Hours and Costs
The estimate for burden hours is 60 minutes per response from administrators and frontline staff, and 90 minutes per response from program participants. For the scan, we plan to interview approximately 72 professional respondents across 18 programs in 6 states. For the nonprofessional respondents (program participants), we plan to conduct about six focus groups, likely one for a single program in each of the six states, for a total of approximately 42 nonprofessional respondents (an estimated 7 in each group).
Table A-1 shows estimated burden and cost information.
Table A-1. Estimated annualized burden hours and costs to respondents
Type of respondent | Number of respondents | Number of responses per respondent | Average burden per response (in minutes) | Total burden hours | Hourly Wage Rate5 | Total Respondent Costs
State, local, tribal, or community human services program administrators or managers | 36 | 1 | 60 | 36 | $35.05 | $1,261.80
State, local, tribal, or community human services program front line staff | 36 | 1 | 60 | 36 | $24.27 | $873.72
Adult or young adult program clients | 42 | 1 | 90 | 63 | $15.03 | $946.89
Total | 114 | 1 | - | 135 | - | $3,082.41
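Burden and cost figures in Table A-1 follow directly from the estimates above: for each respondent type, the number of respondents is multiplied by the number of responses and the average burden per response (converted to hours) to yield total burden hours, which are then multiplied by the hourly wage rate. For example, for program administrators or managers, 36 respondents × 1 response × 60 minutes = 36 hours, and 36 hours × $35.05 = $1,261.80. Summing across respondent types yields 135 total burden hours and $3,082.41 in total respondent costs.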
Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
There will be no direct costs to the respondents other than their time to participate in each data collection.
Annualized Cost to the Government
The cost of the government task order attributable to this work is $2,996.40.
Table A-2: Estimated Annualized Cost to the Federal Government
Staff (FTE) | Average Hours per State | Average Hourly Rate | Average Cost per State
Social Science Analyst, GS 14 | 10 | $49.94 | $499.40
Estimated Total Cost of Information Collection (x6 states) | | | $2,996.40
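The total in Table A-2 reflects an estimated 10 hours of GS-14 Social Science Analyst time per state at $49.94 per hour, or $499.40 per state; across the six states, this yields $499.40 × 6 = $2,996.40.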
Explanation for Program Changes or Adjustments
This is a new data collection.
Plans for Tabulation and Publication and Project Time Schedule
After the data collection, Mathematica will provide ASPE with notes from each individual interview, as well as a transcript and summary memo for each focus group, using standardized templates to document key findings across participants. They will also provide a synthesis memo of the qualitative findings.
ASPE will synthesize the qualitative findings and present the results in one or more research briefs summarizing the study findings. ASPE plans to publicly disseminate the overarching findings from this study. The dissemination plan has not yet been determined but could include presentations, webinars, briefs, or other written products.
Project Time Schedule
June 2020: ASPE and Mathematica seek OMB and IRB approval
June-August 2020: Mathematica conducts outreach, schedules and holds discussions with respondents, and conducts focus groups with participants
June/July/August 2020: Mathematica analyzes discussion and focus group findings and submits summary memos to ASPE
September-November 2020: ASPE staff develops a brief, memos, and other products as appropriate
Reason(s) Display of OMB Expiration Date is Inappropriate
We are requesting no exemption.
Exceptions to Certification for Paperwork Reduction Act Submissions
There are no exceptions to the certification. These activities comply with the requirements in 5 CFR 1320.9.
LIST OF ATTACHMENTS – Section A
Note: Attachments are included as separate files as instructed.
Attachment A: Recruitment emails
Attachment B: Semi-structured discussion guides, including consent scripts
Attachment C: Demographic questionnaire
1 State TANF leaders presenting on “TANF Responses to COVID-19 – Voices from the Field,” an April 29, 2020, webinar via Zoom sponsored by the National Association of Welfare Research and Statistics.
2 See, for example: “The National Consortium of Telehealth Resource Centers.” Tools and Resources. (2020) Health Resources and Services Administration (HRSA)/HHS. https://www.telehealthresourcecenter.org/
3 These include: Annette Waters, Pamela Winston, and Robin Ghertner (2020), “Virtual Case Management Considerations and Resources for Human Services Programs;” Lauren Supplee and Sarah Shea Crowne (2020), “During the COVID-19 Pandemic, Telehealth Can Help Connect Home Visiting Services to Families;” and an internal ASPE-Institute for Research on Poverty/University of Wisconsin “learning exchange” for federal human services staff on March 28, 2020, entitled “Considerations for Successful Virtual Case Management in Human Service Delivery.”
4 See Supplee and Crowne (2020).
5 Estimates for the average hourly wage for respondents are based on the Department of Labor (DOL) 2019 National Occupational Employment and Wage Estimates (https://www.bls.gov/oes/current/oes_nat.htm). To estimate, we used average wages for: social and community service managers, community and social service occupations, and personal care and service occupations.