OPRE Study: Human Services Programs in Rural Contexts Study [Descriptive Study]

OMB: 0970-0570


Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes





Human Services Programs in Rural Contexts Study



OMB Information Collection Request

New Collection





Supporting Statement

Part B



February 2021









Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, DC 20201


Project Officers:

Aleta Meyer, PhD

Lisa Zingman, MSPH


Part B


B1. Objectives

Study Objectives

This information collection is intended to capture data on the challenges and unique opportunities of administering human services programs in rural communities. The goals of this study are as follows:

  1. Provide a rich description of human services programs in rural contexts

  2. Determine the unmet need for human services in rural communities

  3. Identify opportunities for strengthening the capacity of human services programs to promote the economic and social wellbeing of individuals, families, and communities in rural contexts


To meet these goals, OPRE and its contractor (the study team) have conducted an extensive literature review, will compile and analyze existing administrative data from various programs, and will conduct site visits in 12 rural communities (the subject of this ICR) with human services program leadership and staff, and with staff from nonprofit and partner organizations that support individuals who utilize human services.



Generalizability of Results

This study is intended to present internally valid descriptions of human services programs in rural areas, not to promote statistical generalization to other sites or service populations.


Appropriateness of Study Design and Methods for Planned Uses

Our study design is appropriate for the proposed purposes and planned uses because it draws on a literature review and existing administrative and survey data to reduce the data collection burden on respondents. Our qualitative data collection will obtain data from a cross-section of human service providers and partners in each community to generate insights from program administrators and staff (see Supporting Statement A2, “Study Design,” for additional information).


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.




B2. Methods and Design

Target Population

For each of the 12 selected case study sites, the study team will collect information from at least four human services programs (e.g., HMRF, TANF, HPOG, and MIECHV; early childhood development programs; family development programs; employment programs; and higher education and technical training programs). Respondents will include program directors, leadership, and staff, as well as representatives of nonprofit or partner organizations that support individuals who utilize human services. The sampling frame for each site will consist of leadership and staff who support the human services programs, as well as representatives from area nonprofits or partner organizations that provide support to individuals who utilize human services. The study team will use non-probability, purposive sampling to identify potential respondents who can provide information on the study’s key constructs. Because participants will be purposively selected, they will not be representative of the population of HMRF, TANF, HPOG, MIECHV, or other human services program leadership or staff, or of the area nonprofits or partner organizations that provide support to individuals who utilize human services.



Sampling and Site Selection

The study team will select sites using purposive and convenience sampling of suggestions made by ACF staff and the technical working group engaged in this project. Examples of proposed sites include: Tribal communities in Alaska; Appalachia; the Mississippi Delta area; Native American lands; Nacogdoches, Texas; Oklahoma; and South Carolina. Following the identification of potentially eligible sites, the final list of sites will be determined by investigating the following criteria for each site: level of unmet need; community characteristics (e.g., race and ethnicity demographics, tribal populations, public transportation options); rurality; geography; administrative structure; proportion of participation in human services programs; and utilization of innovative service provision in the community.


The study team will recruit respondents in the 12 selected case study sites by asking the human services program director with whom we have made contact to provide the names and contact information of individuals involved in program leadership and activity execution, as well as the contact information for local nonprofit or partner organizations that provide support to individuals who utilize human services. Because this is a convenience sample, we cannot assess its representativeness.



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

After the study team finalized the research questions, we determined what data collection efforts were needed to answer each question. We developed initial interview items to capture data about each element of the relevant research questions. We then constructed interview protocols (Instruments 2, 3, and 4) that contained relevant questions for each type of respondent. Finally, we cross-walked the protocols with our research questions to ensure that any items that could be answered by documents, or that were not directly related to a research question, were removed.


We may need to conduct site visits virtually due to the ongoing COVID-19 pandemic. The instruments will be the same for in-person and virtual site visits, though the directions may vary. Only the planning documents will differ depending on the type of site visit, and we have provided two versions to account for an in-person (Instrument 1a) or virtual (Instrument 1b) visit.



B4. Collection of Data and Quality Control

The study team (staffed by 2M Research and the Urban Institute) will collect all interview data. The study team will identify interviewees with the assistance of each program director (using the Site Visit Planning Template, Instrument 1a or 1b). As documented in Supporting Statement A2, we expect each program director, or their designee, to spend 2 hours setting up the interview schedule for the site visit (1 visit per site). The study team will cross-check the list of proposed interviewees against lists of known program-involved staff. Program directors will also be asked to provide the names and contact information of staff at nonprofit or partner organizations that provide support to individuals who utilize human services. The study team will provide the program director with an email template that introduces the study team, to send to all potential respondents (Appendices E and F, Email from Program Directors to Respondents – In-Person or Virtual). The study team will use the contact information provided by the program director to email respondents and schedule interviews. All interviewers will be trained before they travel to a site. The study team will confirm interview days and times with each respondent by email 3 days before the interview. Interview scripts will be tailored to each site based on the interviewee’s role, the local context, and data saturation, and in relation to quantitative and other qualitative data as they become available. Finally, with the respondent’s permission, the study team will audio record interviews.


Study leadership will review completed study notes for thoroughness, accuracy, and clarity. The study team will share unclear elements with the interviewee for clarification. If the current COVID-19 pandemic makes it too difficult to travel safely, the study team will conduct these interviews virtually using virtual meeting platforms (if respondents have internet access) or by telephone (if they do not). The study team will conduct virtual interviews using WebEx conference lines, which have audio recording capability.


Table 1. Number and Type of Respondents by Instrument, Per Site

Instrument | Type of respondent | Number of respondents per site visit per data collection | Number of data collections per 2-year OMB clearance
Site Visit Planning Template (Instrument 1a or 1b) | Program director or designee | 1 | 1
Program Directors and Leaders Site Visit Discussion Guide (Instrument 2) | A sample of program directors and leadership from human services programs | 5 | 1
Staff Site Visit Discussion Guide (Instrument 3) | A sample of program staff from human services programs | 9 | 1
Nonprofit or Partner Organizations Site Visit Discussion Guide (Instrument 4) | A sample of staff from nonprofit and partner organizations | 6 | 1




B5. Response Rates and Potential Nonresponse Bias

Response Rates

The study team expects a very high response rate for individuals scheduled for site visit interviews (Instruments 1–4). As described in Supporting Statement A12, interview subjects will be a convenience sample, and this data collection will not yield a representative sample. For each site visit (1 per site at 12 sites), we expect to interview 5 program directors or leaders, 9 program staff members, and 6 staff from nonprofit or partner organizations, for an expected total of 240 interviews across the 12 sites.


Non-Response

As participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection. As participants are not randomly selected, and the study’s findings will not be generalized, the study team will employ a complete case analysis. We will not institute any strategies to impute any missing data.



B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination.



B7. Data Handling and Analysis

Data Handling

For Instruments 1a and 1b, the study team will obtain a list of human services program directors (e.g., HMRF, TANF, HPOG, and MIECHV, as well as early childhood development programs, family development programs, employment programs, and higher education and technical training programs funded in the community) from the federal program offices in each of our selected sites.



Qualitative data collected through Instruments 2, 3, and 4 consist of interviewees’ perceptions of program activities in the rural context. As such, these data are subjective. The study team will take notes during the interviews, as well as make digital audio recordings, to ensure the responses are accurately captured. The study team will transcribe the digital audio recordings and review the transcripts, correcting any misspellings or other errors that would impede coding of the data. The data will be stored on secured servers and will be accessed only by authorized members of the study team.


Data Analysis

This study will be conducted using three data sources: literature reviews, existing administrative and survey data, and case studies. To limit burden on respondents, this study will maximize the use of existing data and limit primary data collection to 12 rural sites. To answer the study’s overarching research questions and related sub-questions, the team will conduct qualitative thematic analysis and qualitative comparative analysis (QCA) using interview transcripts in conjunction with hot spot analysis of existing administrative data. These data will help the study team provide a rich description of human services programs in rural contexts and help identify opportunities for strengthening programs’ capacity to promote economic and social wellbeing.



Qualitative Thematic Analysis: In qualitative research, data analysis is a complex and ongoing process often initiated during the actual data collection, as analysts start identifying potential themes based on the coding of the collected data. To support the analysis, we will draw on a grounded theory approach to become immersed in the data from each rural case study site; this approach will encourage openness to emergent themes (Glaser, 2005) and generation of additional theories about program implementation based on interpretive procedures (Haig, 1996).


All members of the site-visit teams (each consisting of two researchers with experience collecting and analyzing qualitative data) will participate in coding the interview transcripts. A senior analyst, supervised by the case study task lead, will develop a codebook for use in NVivo, based on the Evaluation Planning Matrix (EPM). Coders will be trained on how to use the codebook to code the interview and observation notes in NVivo. The study team will use the NVivo software package to code and analyze all qualitative data.


The study team will code each interview document. To ensure that codes are applied consistently throughout the coding process, the study team will double-code 15 percent of the interview data and calculate an interrater reliability (IRR) statistic. If the IRR is less than 80 percent, we will use a negotiated agreement approach to resolve discrepancies until reaching a minimum of 80 percent IRR. The study team will conduct periodic check-ins throughout the coding and analysis process to ensure that all data are coded consistently and that any questions or concerns of the coding team are addressed.
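
To make the double-coding check concrete, the following is a minimal Python sketch of how percent agreement and a chance-corrected statistic (Cohen’s kappa) could be computed for two coders. The segment IDs, code labels, and helper functions are illustrative assumptions, not part of the study’s actual NVivo workflow.

    from collections import Counter

    def percent_agreement(coder_a, coder_b):
        # Share of double-coded segments where both coders applied the same code.
        shared = set(coder_a) & set(coder_b)
        return sum(coder_a[s] == coder_b[s] for s in shared) / len(shared)

    def cohen_kappa(coder_a, coder_b):
        # Cohen's kappa: observed agreement corrected for chance agreement.
        shared = set(coder_a) & set(coder_b)
        n = len(shared)
        p_obs = sum(coder_a[s] == coder_b[s] for s in shared) / n
        # Chance agreement from each coder's marginal code frequencies.
        freq_a = Counter(coder_a[s] for s in shared)
        freq_b = Counter(coder_b[s] for s in shared)
        p_chance = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
        return (p_obs - p_chance) / (1 - p_chance)

    # Hypothetical double-coded segments: segment ID -> code applied.
    coder_a = {1: "barriers", 2: "transport", 3: "funding", 4: "barriers", 5: "staffing"}
    coder_b = {1: "barriers", 2: "transport", 3: "staffing", 4: "barriers", 5: "staffing"}

    print(f"Percent agreement: {percent_agreement(coder_a, coder_b):.0%}")  # 80%
    print(f"Cohen's kappa: {cohen_kappa(coder_a, coder_b):.2f}")            # 0.72

In this toy example the percent agreement is exactly 80 percent; a lower value would trigger the negotiated agreement process described above.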


Once all data are coded, the study team will conduct thematic analysis to identify themes within each construct or code. The study team will analyze the coded interviews to develop a set of themes represented in the data across the key research questions in the EPM. The team will engage in an interactive process of qualitative data review, in which we simultaneously code, display, and reduce data; reorder and reflect on the data; draw conclusions; and verify assertions (Miles & Huberman, 1994).


The study team will analyze each site separately to create a holistic and thorough understanding of the nuances of each program and rural site. Use of NVivo will also allow the team to identify emerging high-level themes. Once the thematic analysis of the data in NVivo is complete, the study team will begin writing up the themes identified as they relate to the research questions and any emergent topics, where relevant. The study team will also explore the use of data visualization tools to convey the most commonly used words/phrases related to implementation processes and activities employed by each program’s collaborative.
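
As a simple illustration of the word-frequency tabulation that would underlie such a visualization, the following Python sketch counts the most common terms in a hypothetical transcript excerpt; the text, stopword list, and output format are illustrative assumptions, not the study’s actual tooling.

    import re
    from collections import Counter

    # Hypothetical transcript excerpt; real input would be the coded
    # interview transcripts exported from NVivo.
    transcript = (
        "Transportation is the biggest barrier. Without transportation, "
        "families cannot reach services, and staff spend hours driving."
    )

    STOPWORDS = {"is", "the", "and", "without", "cannot"}
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)

    # The top terms would feed a word cloud or bar chart.
    print(counts.most_common(5))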


Qualitative Comparative Analysis: Research involving small numbers of cases (typically fewer than 30) often cannot draw on regression or other probabilistic methods to add robustness to the qualitative data. QCA (Ragin, 2000) can address this challenge. Developed explicitly for small-N research, QCA bridges qualitative and quantitative analyses by combining the in-depth knowledge gained from case studies with principles derived from Boolean algebra. The QCA method examines set-theoretic relationships (e.g., if X, then Y) and assesses how combinations of conditions come together to produce particular outcomes. In particular, this method considers both the necessary and the sufficient conditions that lead to the occurrence of an outcome of interest. Examination of necessary and sufficient conditions is especially useful for human services research because the resulting findings are generally actionable and relevant to policy decisions. The study team will develop site-level indicators of key elements we believe contribute to human services delivery in rural contexts. As the study progresses, the team will assess whether QCA is appropriate to include as part of the cross-site analyses, so we can analyze data from diverse programs in dissimilar settings.
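
As a simplified illustration of the set-theoretic logic described above, the following Python sketch computes crisp-set consistency of sufficiency for combinations of hypothetical binary site-level conditions. The condition names, outcome, and data are invented for illustration and do not reflect the study’s actual indicators or QCA model.

    from itertools import combinations

    # Hypothetical crisp-set data: one row per site, with binary (0/1)
    # conditions and a binary outcome. Names and values are invented.
    sites = [
        {"transport": 1, "colocated": 1, "broadband": 0, "outcome": 1},
        {"transport": 1, "colocated": 0, "broadband": 1, "outcome": 1},
        {"transport": 0, "colocated": 1, "broadband": 1, "outcome": 0},
        {"transport": 1, "colocated": 1, "broadband": 1, "outcome": 1},
        {"transport": 0, "colocated": 0, "broadband": 0, "outcome": 0},
    ]
    conditions = ["transport", "colocated", "broadband"]

    def sufficiency(combo, cases):
        # Consistency of sufficiency: among cases where every condition in
        # `combo` is present, the share that also exhibit the outcome.
        members = [c for c in cases if all(c[k] == 1 for k in combo)]
        if not members:
            return None  # combination never observed (a "logical remainder")
        return sum(c["outcome"] for c in members) / len(members)

    # Examine every combination of conditions, as QCA's truth-table step does.
    for r in range(1, len(conditions) + 1):
        for combo in combinations(conditions, r):
            score = sufficiency(combo, sites)
            if score is not None:
                print(f"{' AND '.join(combo):<35} consistency = {score:.2f}")

A consistency of 1.0 for a combination (here, every case with transport present shows the outcome) would flag it as a candidate sufficient condition for closer qualitative examination.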


Hot Spot Analysis: The study will analyze relevant secondary data to examine the distribution of human services funds and client populations within rural counties and to determine the level of unmet need for human services programs across rural counties in the United States. A subsequent geographic information system (GIS) analysis will be used to generate maps depicting the distribution of human services funds and the level of unmet need for human services in rural counties. GIS maps are a useful way to show spatial distributions, but they do not indicate whether the differences shown on the map are statistically different from one another or simply due to random chance. The study team will use hot spot analysis to test whether the estimate of unmet need in a county is statistically higher or lower than the estimates of unmet need in neighboring counties. If a county’s estimate is statistically higher than its neighbors’, it is referred to as a “hot spot.” If a county’s estimate is statistically lower than its neighbors’, it is referred to as a “cold spot.” The study team can report the results of the hot spot analysis using a GIS map that identifies the hot spot and cold spot counties.
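
The hot spot test described above follows the logic of a local spatial statistic such as the Getis-Ord Gi*. The following Python sketch applies a simplified version of that statistic to hypothetical county-level unmet-need estimates with a binary neighbor list; the data, adjacency structure, and 1.96 cutoff (roughly a 5 percent significance level) are illustrative assumptions, not the study’s actual data or GIS software.

    import math

    # Hypothetical unmet-need estimates per county and a binary adjacency
    # list; a real analysis would derive both from administrative data and
    # GIS county boundaries.
    need = {"A": 0.42, "B": 0.39, "C": 0.45, "D": 0.12, "E": 0.10, "F": 0.15}
    neighbors = {
        "A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"],
        "D": ["C", "E", "F"], "E": ["D", "F"], "F": ["D", "E"],
    }

    def gi_star(county):
        # Simplified Getis-Ord Gi*: z-score of the local (county plus
        # neighbors) sum against its expectation under spatial randomness.
        values = list(need.values())
        n = len(values)
        mean = sum(values) / n
        sd = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
        local = [need[county]] + [need[j] for j in neighbors[county]]
        w = len(local)  # county plus its neighbors, each weighted 1
        expected = mean * w
        se = sd * math.sqrt(w * (n - w) / (n - 1))
        return (sum(local) - expected) / se

    for county in need:
        z = gi_star(county)
        label = "hot spot" if z > 1.96 else "cold spot" if z < -1.96 else "-"
        print(f"County {county}: z = {z:+.2f}  {label}")

In this toy data, counties A and B emerge as hot spots and E as a cold spot, mirroring how the study team would flag counties whose unmet need is statistically higher or lower than their neighbors’.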


This study is registered at the Center for Open Science, https://osf.io/6nyhc.



Data Use

The study data may be archived for restricted data use. In the event that it is archived, the study team will develop a public-facing codebook, including variable names, and describe the major analysis codes used.



B8. Contact Person(s)

Name | Organization | Role on Contract | Phone/Email
Aleta Meyer | OPRE, ACF, HHS | Co-Contracting Officer’s Representative | 202-401-5262; aleta.meyer@acf.hhs.gov
Lisa Zingman | OPRE, ACF, HHS | Co-Contracting Officer’s Representative | 202-260-0323; Lisa.Zingman@acf.hhs.gov
Aira Jae Etheridge | MCHB, HRSA, HHS | HRSA Lead | 404-562-4102; AEtheridge@hrsa.gov
Dallas Elgin | 2M Research | Project Director; Analysis, Synthesis, and Interpretation Task Lead | 703-214-1004; delgin@2mresearch.com
Laurie Hinnant | 2M Research | Case Study Task Lead | 470-548-7050; lhinnant@2mresearch.com
James Murdoch | 2M Research | Administrative and Secondary Data Task Lead | 817-856-0869; cmurdoch@2mresearch.com
Heather Hahn | Urban Institute | Co-Principal Investigator | HHahn@urban.org
Corianne Scally | Urban Institute | Co-Principal Investigator | 202-261-5653; CScally@urban.org




Attachments



Instrument 1a: In Person Site Visit Planning Template

Instrument 1b: Virtual Site Visit Planning Template

Instrument 2: Program Directors and Leaders Site Visit Discussion Guide

Instrument 3: Staff Site Visit Discussion Guide

Instrument 4: Nonprofit or Partner Organizations Site Visit Discussion Guide

Appendix A: Responses to Federal Register Notice Comments

Appendix B: Institutional Review Board Approval

Appendix C: Initial Email to Program Directors – In Person

Appendix D: Initial Email to Program Directors – Virtual

Appendix E: Email from Program Directors to Respondents – In Person

Appendix F: Email from Program Directors to Respondents – Virtual



References

Glaser, B. G. (2005). The grounded theory perspective III: Theoretical coding. Mill Valley, CA: Sociology Press.

Haig, B. D. (1996). Grounded theory as scientific method. Philosophy of education 1995 yearbook. Champaign, IL: Philosophy of Education Society.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage Publications.

Ragin, C. C. (2000). Fuzzy-set social science. Chicago, IL: The University of Chicago Press.




