
Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes




Procedural Justice-Informed Alternatives to Contempt Demonstration (PJAC)



OMB Information Collection Request

0970-0505





Supporting Statement

Part B

February 2020


Submitted By:

Office of Child Support Enforcement

Administration for Children and Families

U.S. Department of Health and Human Services


5th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer:

Sharon Henderson



Part B


B1. Objectives

Study Objectives

The purpose of the Procedural Justice-Informed Alternatives to Contempt (PJAC) demonstration project is to assess the feasibility and efficacy of incorporating principles of procedural justice into child support services as a cost-effective alternative to civil contempt proceedings. The PJAC demonstration aims to document and evaluate the effectiveness of the approaches taken by six child support agencies that were issued grants to provide procedural justice-informed services to noncustodial parents who have been determined able to pay their child support but are far enough behind in payments that they are facing contempt proceedings.


Procedural justice is sometimes referred to as procedural fairness. Very simply, it is “the idea that how individuals regard the justice system is tied more to the perceived fairness of the process and how they were treated rather than to the perceived fairness of the outcome.”1 Research has shown that trust and confidence in legal authorities increase when people experience procedural justice during a process, even if they do not perceive the outcome of the process as favorable to them.


The PJAC study seeks to (1) describe and assess PJAC services implementation; (2) estimate PJAC’s impacts on measures in several domains, including child support enforcement, the contempt process, and reliability of child support payments; and (3) measure the cost effectiveness of the PJAC intervention. There are three study components: (1) an implementation study, (2) a random assignment impact study, and (3) a benefit-cost study. The impact study will test the efficacy of incorporating procedural justice principles into child support practices as a cost-effective alternative to sending noncustodial parents through the contempt process.


Generalizability of Results

This randomized controlled trial evaluation is intended to produce internally valid estimates of the intervention’s causal impact on noncustodial parents who have met their state’s criteria for referral to civil contempt proceedings for failure to meet their child support obligations. Alongside these impact findings, the evaluation will also produce implementation findings and benefit-cost findings. The evaluation is not intended to support statistical generalization to other sites or service populations.



Appropriateness of Study Design and Methods for Planned Uses


The study was designed to provide detailed evidence regarding how the overall intervention was implemented in various contexts, whether the package of services is an effective alternative to contempt, whether it is a cost-effective use of government funds, and, if so, how it can be replicated. A randomized controlled trial design provides the strongest possible evidence regarding the efficacy of an intervention. This randomized study is intended to produce internally valid estimates of the intervention’s causal impact, not to support statistical generalization to other sites or service populations. The design allows only for an assessment of the full PJAC model and will not provide information about the effectiveness of individual model components; this more detailed understanding would have to be obtained via further research. An implementation study that examines the activities, experiences, and perceptions of participants, child support staff, and stakeholders is vital for documenting how the intervention functioned and for contextualizing findings from the impact analysis. A benefit-cost study compares the potential benefits of an intervention with its costs to assess whether the intervention is a good investment of taxpayer dollars. The results of the evaluation aim to provide program administrators and policymakers with the evidence needed to decide whether and how to implement procedural justice-informed programming in the future.


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.



B2. Methods and Design

Target Population

The study targets noncustodial parents who are far enough behind in their child support payments that they are facing civil contempt proceedings. Each state has different criteria for referring noncustodial parents to civil contempt. Based on the first year of study enrollment, the annual size of the target population across the six PJAC sites is approximately 5,000. On average, each noncustodial parent is associated with 1.5 custodial parents with whom they share children and a child support case. Because these custodial parents may also be involved in PJAC services, they are included in data collection efforts using Instrument 7: Custodial Parent Interview Protocol. Additionally, child support project directors, case managers, and community partner staff are the subjects of some data collection (Instruments 3, 5, and 6) for the implementation and benefit-cost studies.


Sampling and Site Selection

Site selection. In September 2016, the Office of Child Support Enforcement (OCSE) within the Administration for Children and Families (ACF) awarded grants to six child support agencies to provide procedural justice-informed services to noncustodial parents who are far enough behind in their child support payments that they are facing contempt proceedings.2 Grantees and their proposed study sites were selected through a competitive grant-making process that considered the quality of the grantees’ proposed approaches to implementing procedural justice practices and modifying existing processes to accommodate the intervention and the study, among other criteria related to organizational capacity.3


Sampling. Each grantee is expected to randomly assign 2,300 noncustodial parents who are eligible for PJAC over a three-year period, yielding a total of 13,800 sample members across all six sites. Sixty-five percent of these noncustodial parents will be randomly assigned to the treatment group and will be offered PJAC treatment services; the remaining 35 percent will be randomly assigned to the control group and will proceed through their state’s business-as-usual process for filing civil contempt charges.
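To illustrate the assignment mechanics, the sketch below shows a simple 65/35 random assignment in Python. It is a minimal illustration only: the function name, identifiers, and seed are hypothetical, and actual assignment is carried out through the study MIS at enrollment (which may use batching or blocking not shown here).

import random

def assign_group(ncp_ids, treatment_share=0.65, seed=2016):
    """Randomly assign each eligible noncustodial parent to the
    treatment or control group. Illustrative sketch only."""
    rng = random.Random(seed)  # fixed seed makes the sketch reproducible
    assignments = {}
    for ncp_id in ncp_ids:
        group = "treatment" if rng.random() < treatment_share else "control"
        assignments[ncp_id] = group
    return assignments

# Example: a hypothetical site enrolling 2,300 sample members.
groups = assign_group(range(1, 2301))
print(sum(1 for g in groups.values() if g == "treatment"))  # roughly 65% of 2,300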


These sample size targets reflect a reduction from the original goal of 18,000 (3,000 per site) reported in this project’s initial OMB submission. The lowered sample size target of 13,800 provides sufficient statistical power and will lessen enrollment burden on sites. More specifically, minimum detectable effect calculations indicated that the reduction increases the minimum detectable effect by just 0.28 percentage points for a pooled analysis (including all six demonstration sites) and 0.68 percentage points for an individual site analysis, both modeled on a binary outcome of having paid any child support in the year following study enrollment. The estimated minimum detectable effects on this modeled outcome under the reduced sample size goals are 2.22 percentage points for a pooled analysis and 5.44 percentage points for each individual site analysis, respectively. OCSE’s position is that these minimum detectable effects are sufficiently low to detect policy-relevant impacts. Moreover, the lowered sample size goals produce a more manageable caseload that better fits sites’ capacity to deliver strong services, ultimately improving dosage and strengthening the evaluation’s test of PJAC services.
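For reference, the sketch below shows a standard minimum detectable effect approximation for a binary outcome under a 65/35 allocation (two-tailed test, 5 percent significance, 80 percent power). It is illustrative only and omits the covariate adjustment used in the evaluation team’s calculations, which is why it returns values somewhat above the 2.22 and 5.44 percentage point figures reported above.

from math import sqrt
from statistics import NormalDist

def mde(n, treat_share=0.65, outcome_rate=0.5, alpha=0.05, power=0.80):
    """Minimum detectable effect, in percentage points, for a binary
    outcome. Standard unadjusted two-tailed approximation; an
    illustrative sketch, not the evaluation team's exact computation."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    variance = outcome_rate * (1 - outcome_rate)  # largest at a 50% outcome rate
    se = sqrt(variance / (n * treat_share * (1 - treat_share)))
    return 100 * z * se

print(f"Pooled (N=13,800): {mde(13_800):.2f} pp")   # about 2.5 pp unadjusted
print(f"Per site (N=2,300): {mde(2_300):.2f} pp")   # about 6.1 pp unadjusted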


While some information collection efforts will include the full study sample, others will subsample using a random number generator in order to provide a representative sample, and still others will use a convenience sample when random sampling is not practicable. The sampling approach for each data collection is summarized in Table B.1. Across all information collections that will not include the full universe of sample members, subsamples were carefully considered to ensure that the purpose of each collection is fulfilled while minimizing costs and burden on child support staff and study participants. For Instruments 5 and 6, randomly sampling from among non-PJAC staff will provide representative information without requiring the added time and resources necessary to collect information from all non-PJAC staff at each child support agency. Instruments 3 (for non-PJAC staff), 4, and 7 are qualitative collections that are not intended to be representative; non-random sampling methods therefore still allow the goals of these information collections to be met. Instruments 1 and 2 will include the full universe of sample members, as will Instruments 3, 5, and 6 for PJAC staff only.


Table B.1. Sampling approach, by information collection instrument

Previously Approved Request (OMB No. 0970-0505, Approval Date 1/23/2018)

Instrument 1: Staff data entry on participant baseline information

  Estimated number of persons in universe: 13,800 study sample members (noncustodial parents)

  Estimated number of persons in sample: 13,800 study sample members (noncustodial parents)

  Sampling method: NA

Instrument 2: Study MIS to track receipt of services

  Estimated number of persons in universe: 8,970 treatment group members

  Estimated number of persons in sample: 8,970 treatment group members

  Sampling method: NA

Instrument 3: Staff and community partner interview topic guide

  Estimated number of persons in universe: 210 child support staff members (PJAC and non-PJAC project directors and case managers) and community partners (35 per site):

    • 90 PJAC staff/community partners

    • 120 non-PJAC staff

  Estimated number of persons in sample: 150 child support staff members and community partners (25 per site):

    • 90 PJAC staff/community partners

    • 60 non-PJAC staff

  Sampling method: PJAC case managers/community partners: NA; non-PJAC staff: convenience

Current Request

Instrument 4: Noncustodial parent participant interview protocol

  Estimated number of persons in universe: 7,520 sample members of the planned 13,800 total study sample (5,830 treatment group members and 1,690 control group members; about 1,253 per site). Per the evaluation’s random assignment ratio, 65 percent of the 13,800 noncustodial parents (8,970) are assigned to the treatment group and 35 percent (4,830) to the control group. Based on currently available data, an estimated 65 percent of treatment group members (5,830) and 35 percent of control group members (1,690) will have contact with the child support agency following study enrollment. Instrument 4 is designed only for study sample members who have had such contact, as noted in B4 below.

  Estimated number of persons in sample: 180 interview respondents (30 per site):

    • 120 PJAC treatment group

    • 60 control group

  Sampling method: random number generator

Instrument 5: Staff survey

  Estimated number of persons in universe: 150 case managers (25 per site):

    • 30 PJAC case managers

    • 120 non-PJAC case managers

  Estimated number of persons in sample: 60 case managers (10 per site):

    • 30 PJAC case managers

    • 30 non-PJAC case managers

  Sampling method: PJAC case managers: NA; non-PJAC case managers: random number generator

Instrument 6: Staff time study

  Estimated number of persons in universe: 180 child support staff (30 per site):

    • 60 PJAC project directors and case managers

    • 120 non-PJAC project directors and case managers

  Estimated number of persons in sample: 90 child support staff (15 per site):

    • 60 PJAC project directors and case managers

    • 30 non-PJAC project directors and case managers

  Sampling method: PJAC project directors and case managers: NA; non-PJAC project directors and case managers: random number generator

Instrument 7: Custodial parent interview protocol

  Estimated number of persons in universe: 11,280 custodial parents (1,880 per site; noncustodial parents average about 1.5 cases each)

  Estimated number of persons in sample: 180 custodial parents (30 per site):

    • 120 PJAC treatment group

    • 60 control group

  Sampling method: convenience


Instrument 3 requires a convenience sample among non-PJAC child support staff members because there are more non-PJAC staff at each child support agency than could be interviewed during in-person visits. Additionally, the qualitative information collected is not meant to be representative, so a convenience sample of staff available on the days of scheduled visits reduces the burden placed on this population.


Custodial parent interview respondents (Instrument 7) will be recruited through a convenience sample because custodial parents are not members of the PJAC study sample; only noncustodial parents were formally enrolled into the study and are covered by the MDRC IRB’s waiver of informed consent. Because we do not have access to custodial parent contact information, custodial parents associated with both the PJAC treatment and control groups will be recruited based on child support caseworkers’ identification of those who have been involved in the corresponding noncustodial parents’ cases. Though the views of this convenience sample will not be representative of the larger universe of custodial parents tied to noncustodial parents in the study (they will likely be much more actively involved in their child support cases), caseworkers can provide appropriate context regarding the role played by less active custodial parents to contextualize our understanding of the custodial parent role.


For Instruments 4 and 7, we anticipate having limited contact information for many participants and encountering some disinterest in completing the interview, particularly among control group members. Noncustodial parent interview respondents (Instrument 4) will be recruited based on the results of a random number generator, and interviews will be scheduled and completed until the target number of interviews with treatment and control group noncustodial parents within each site is achieved; the study team will then cease interviewing.
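As a sketch of how a random number generator can order outreach until the interview targets are met, consider the following; the function name and identifiers are hypothetical, and the team’s actual procedure may differ.

import random

def outreach_order(candidate_ids, seed=7):
    """Return eligible noncustodial parents in a random order.
    Interviewers work down this list, scheduling and completing
    interviews until the site's target number is achieved.
    (Illustrative only; the seed and IDs are placeholders.)"""
    rng = random.Random(seed)
    ordered = list(candidate_ids)
    rng.shuffle(ordered)
    return ordered

# Example: randomize outreach among 200 eligible treatment group members
# at one site; interviewing stops once 20 interviews are completed.
print(outreach_order(range(1, 201))[:5])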



B3. Design of Data Collection Instruments

Development of Data Collection Instrument(s)

Each of the seven PJAC data collections is intended to help the evaluation team meet one of the three broad objectives for the study, as described in Table B.2.


Table B.2. Study objective met, by information collection instrument

Previously Approved Request (OMB No. 0970-0505, Approval Date 1/23/2018)

Instrument 1: Staff data entry on participant baseline information

  Study objectives met: Implementation; Impact

Instrument 2: Study MIS to track receipt of services

  Study objectives met: Implementation

Instrument 3: Staff and community partner interview topic guide

  Study objectives met: Implementation

Current Request

Instrument 4: Noncustodial parent participant interview protocol

  Study objectives met: Implementation

Instrument 5: Staff survey

  Study objectives met: Implementation

Instrument 6: Staff time study

  Study objectives met: Implementation; Benefit-cost

Instrument 7: Custodial parent interview protocol

  Study objectives met: Implementation


When developing the information collection instruments, the evaluation team focused on reducing redundancy and ensuring that each data collection instrument only includes items that will allow the evaluation team to meet the study objectives. The evaluation team considered the PJAC theory of change, evaluation design plan, and availability of information in administrative data sources. Unless information was needed from multiple perspectives (e.g., perceptions of the program from the noncustodial parent, custodial parent, and staff vantage points) or crucial for the impact analysis and not reliably collected from administrative data sources (e.g., baseline data elements collected in Instrument 1), items are not duplicated across instruments or administrative data sources.


Instrument 5 is the only instrument that includes items drawn from other sources; specifically, it includes validated scales from the Texas Christian University Survey of Organizational Functioning. It also includes questions adapted from surveys used in the Child Support Noncustodial Parent Employment Demonstration (OMB No. 0970-0439), Building Bridges and Bonds Evaluation (OMB No. 0970-0356), as well as the Center for Court Innovation Procedural Justice Training Survey. (Instrument 5 is annotated to describe which questions originate with other sources.) All other instruments and all other Instrument 5 questions were designed specifically to meet the PJAC study objectives.


The evaluation team plans to pretest Instruments 5 and 6 prior to OMB approval with nine or fewer individuals, then adjust the instrument design accordingly in order to minimize instrument- and mode-based measurement error. If the pretesting results warrant changes to the instruments, the changes will be submitted to OMB as a nonsubstantive change request, as appropriate. The evaluation team pretested a participant survey that was included in the original information collection request (approved January 2018) but later dropped from the evaluation plan; the results of that pretest informed the design of Instruments 4 and 7. In addition to pretesting, the evaluation team will take the following steps to reduce measurement error:

  • properly train all staff who enter data or administer interviews, to reduce interviewer measurement error;

  • provide assurances of information privacy and offer electronic, self-administered instruments where possible, to reduce social desirability bias, one source of respondent error; and

  • encourage ongoing rather than delayed bulk data entry and limit interview and survey items to recent and/or easily memorable events, to reduce recall bias, another source of respondent error.



B4. Collection of Data and Quality Control

For the data entry instruments (Instruments 1 and 2), child support agency staff enter the data into the study MIS. Staff using the MIS received extensive training on navigating and entering data into the system. The evaluation team monitors data entry on a regular and ongoing basis, and the OCSE program management team reviews data reports containing summary statistics (no personally identifiable information is included) to identify anomalies; both teams then provide feedback on apparent issues on a monthly and as-needed basis.

For all interview-based instruments (Instruments 3, 4, and 7), the evaluation team will collect the data via individual phone or in-person interviews. The evaluation team will participate in interviewer training to ensure quality and consistency. For Instrument 4, participants will be randomly selected from among those who have engaged with the child support agency in some way, either through participation in PJAC services among program group members or responsiveness to contempt proceedings among control group members. For Instrument 7, a convenience sample of custodial parents will be drawn based on child support case managers’ identification of those who have been involved in noncustodial parents’ cases. (The sampling methods are described in Section B2.) With the custodial parent’s permission, case managers will share their contact information with the research team. The research team will then reach out to the custodial parent to ask whether they would like to participate in the interview. If the custodial parent responds in the affirmative, the research team will schedule an in-person interview or conduct the interview over the phone, completing a full informed consent process before interviewing begins.

For the staff survey and time study (Instruments 5 and 6), the research team will administer the instruments electronically: the staff survey to all PJAC case managers and a randomly selected group of non-PJAC case managers, and the time study to all PJAC project directors and case managers and a randomly selected group of their non-PJAC counterparts. Because PJAC staff are very few in number, we believe it is necessary to administer these instruments to all PJAC staff in order to produce reliable measures, maintain participant privacy, and reduce noise in descriptive measures. To maximize data reliability for Instrument 5, the evaluation team will incorporate validation checks into the programmed instrument and pull interim data files to conduct extensive quality control checks, allowing the team to identify and correct any issues early on and throughout the fielding window. For Instrument 6, the evaluation team will similarly incorporate validation checks into the pre-structured Excel spreadsheet and thoroughly examine completed instruments for logical inconsistencies and outliers, following up with respondents as necessary to confirm accurate data entry.



B5. Response Rates and Potential Nonresponse Bias

Response Rates

The instruments in this submission for which response rate calculations are relevant are the staff survey (Instrument 5) and the staff time study (Instrument 6). Instruments 1, 2, and 3 are applicable to the universe of respondents. Instruments 4 and 7 (noncustodial and custodial parent interviews) are not designed to produce statistically generalizable findings and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported for these instruments.


Instruments 5 and 6: The unit response rate will be calculated as the number of completed instruments divided by the number of fielded instruments, converted to a percentage. The item nonresponse rate will be calculated as the number of unanswered instances of a question divided by the number of times that question was asked, converted to a percentage.
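A minimal sketch of these two calculations, with illustrative counts only:

def unit_response_rate(completed, fielded):
    # Completed instruments as a percentage of fielded instruments.
    return 100 * completed / fielded

def item_nonresponse_rate(unanswered, asked):
    # Unanswered instances of a question as a percentage of times asked.
    return 100 * unanswered / asked

# e.g., 52 of 60 fielded staff surveys completed -> 86.7 percent
print(round(unit_response_rate(52, 60), 1))
# e.g., a question skipped twice across 52 administrations -> 3.8 percent
print(round(item_nonresponse_rate(2, 52), 1))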


For both instruments, we anticipate response rates of 85 percent or above, based on similar staff data collection efforts completed previously as part of the Child Support Noncustodial Parent Employment Demonstration (response rates of 87 percent and 84 percent for the first and second staff surveys, respectively; OMB No. 0970-0439) and the Enhanced Transitional Jobs Demonstration (the four child support sites had an 87 percent response rate; OMB No. 1205-0485). Given the purposes of these two instruments (for the staff survey, to obtain representative, descriptive quantitative information regarding program implementation; for the staff time study, to obtain representative averages of time spent on various tasks for cost estimate purposes), we believe the anticipated response rate of 85 percent is adequate. To maximize response rates, the evaluation team will create electronic instruments and send multiple reminders.


Instruments 4 and 7: Interview data are not intended to be representative in a statistical sense, in that they will not be used to make statements about the prevalence of experiences in the PJAC population. However, it is important to secure participants with a range of background characteristics in order to capture a variety of possible experiences with PJAC services. The research team will use all available contact information to reach out to parents and will be proactive in following up and making multiple contact attempts. The team will also be flexible, making calls and conducting interviews at times and places most convenient for potential interviewees.


Nonresponse

We do not anticipate nonresponse bias or item nonresponse issues for the staff survey or staff time study (Instruments 5 and 6), but we will still examine nonresponse bias. If survey response rates among case managers serving treatment group members are substantially higher than among those serving control group members, this may have implications for the representativeness of our survey sample, and we may interpret results with more caution. We will not, however, impute or weight data.


To contextualize the interpretation of the qualitative interviews (Instruments 4 and 7), we will examine patterns in refusal using the information available to us at respondent recruitment. This will include whether case managers serve treatment or control group members and, in the case of noncustodial parents, whether interviewees are treatment or control group members.



B6. Production of Estimates and Projections

Instrument 1 collects baseline data that will be used as covariates in the impact model, but no other instruments will collect data that will be used in inferential statistical tests. The quantitative data collected in Instruments 1, 2, 5, and 6 will be used to produce descriptive statistics to describe and assess program implementation and costs. These descriptive statistics will be produced for external release. The data will not be used to generate population estimates, either for internal use or dissemination. Instead, data will be used to generate estimates for the six participating demonstration sites only.

The research team will use ordinary least squares (OLS) to estimate regression models of the following form, with outcomes measured using administrative data:


Y_i = α + βP_i + δX_i + ε_i

where

α is the intercept;

Y_i is the outcome measure for sample member i;

P_i equals 1 for treatment group members and 0 for control group members;

X_i is a set of background characteristics for sample member i; and

ε_i is a random error term for sample member i.


The above model presents the impact estimation method for an individual site. For the pooled analysis, site-level fixed effects will also be included as covariates to account for differences between sites in their program implementation and local contextual characteristics. The coefficient β is interpreted as the impact of the PJAC project on the outcome. The vector of regression coefficients, δ, reflects the influence of background characteristics.
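As an illustration of this estimation approach, the sketch below fits the site-level and pooled models with the statsmodels formula API. The data file and variable names (any_payment, treatment, site, and the baseline covariates) are hypothetical placeholders, not the study’s actual analysis file.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per sample member, with a binary
# outcome, a treatment indicator, baseline covariates, and a site code.
df = pd.read_csv("pjac_analysis_file.csv")

# Individual-site model: Y_i = alpha + beta*P_i + delta*X_i + e_i
site1 = smf.ols("any_payment ~ treatment + age + prior_arrears",
                data=df[df["site"] == 1]).fit()

# Pooled model: add site fixed effects (C(site)) to account for
# differences in program implementation and local context.
pooled = smf.ols("any_payment ~ treatment + age + prior_arrears + C(site)",
                 data=df).fit()

# beta-hat: the estimated impact of PJAC on the outcome.
print(pooled.params["treatment"])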


In accordance with the ACF Evaluation Policy’s principle of Transparency, OCSE will pre-register the study with Open Science Framework, and a full analysis plan for the impact and benefit-cost studies will be publicly posted on this site prior to the start of data analysis.


B7. Data Handling and Analysis

Data Handling

Throughout the data collection process, the data team will perform extensive data quality checks to identify duplicate records, missing data, coding errors, out-of-range or unexpected values, and internal inconsistencies prior to analyzing quantitative data from Instruments 1, 2, 5, and 6. The team will then (1) work with data providers to address data issues when appropriate and necessary, (2) make decisions about how to systematically handle any remaining irregularities in the data, and (3) create data files that include analysis measures that incorporate any corrections. The baseline data collected in Instrument 1 will be used to create covariates for the impact analysis.
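A minimal sketch of such data quality checks using pandas appears below; the file name, column names, and validation bounds are all hypothetical.

import pandas as pd

# Hypothetical extract of Instrument 1 baseline data.
df = pd.read_csv("instrument1_baseline.csv",
                 parse_dates=["enrollment_date", "case_open_date"])

# Duplicate records on the study identifier.
dupes = df[df.duplicated(subset="sample_id", keep=False)]

# Missing data rates, by item.
missing_by_item = df.isna().mean().sort_values(ascending=False)

# Out-of-range or unexpected values (bounds are illustrative).
bad_age = df[(df["age"] < 16) | (df["age"] > 90)]

# Internal inconsistency: enrollment should not precede case opening.
inconsistent = df[df["enrollment_date"] < df["case_open_date"]]

for label, flagged in [("duplicate IDs", dupes),
                       ("age out of range", bad_age),
                       ("date inconsistencies", inconsistent)]:
    print(f"{label}: {len(flagged)} records flagged for review")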


Regarding qualitative interview data from Instruments 3, 4, and 7, the evaluation team will take detailed notes during each interview; these notes will be written up shortly after each site visit or interview, with gaps filled in using interview recordings. A small team will code the data using a codebook and will meet regularly to discuss the process; inter-rater reliability checks and reviews of code application will ensure consistent coding among team members.
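One common inter-rater reliability statistic for this kind of check is Cohen’s kappa, sketched below. The document does not specify which statistic the team will use, and the example codes are hypothetical.

from collections import Counter

def cohens_kappa(codes_a, codes_b):
    # Agreement between two coders on the same excerpts, corrected for
    # the agreement expected by chance.
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two coders applying procedural justice codes to four excerpts.
print(cohens_kappa(["voice", "respect", "voice", "neutrality"],
                   ["voice", "respect", "helpfulness", "neutrality"]))  # ~0.67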


Data Analysis

After the quality checks and data processing described above, the data team will produce descriptive statistics summarizing the quantitative data collected in Instruments 1, 2, 5, and 6. Aside from the baseline data collected in Instrument 1, no other instruments will collect data that will be used in inferential statistical tests.


After coding the data as described above, the evaluation team will categorize similar codes into themes, which will indicate the main qualitative findings.


The information collected in these instruments will be used as follows:

  • The covariates created based on Instrument 1 will be used in a linear regression model that estimates PJAC’s impacts on outcomes based on administrative data sources.

  • The descriptive statistics produced using data collected in Instruments 1, 2, 5, and 6 will be used in conjunction with the results of the qualitative data analysis (using Instruments 3, 4, and 7) to describe and assess program implementation.

  • The information collected in Instrument 6 will also be used to estimate the program’s costs.


Data Use

Each public report that uses data from these information collections will include information about how the data were collected, limitations, and guidance on how to interpret the findings. Currently, there are no plans to archive the data collected in this study.



B8. Contact Person(s)

The following individuals can answer questions about the statistical aspects of the research or are leading the data collection, processing, and analysis efforts for the evaluation.


Dr. Cynthia Miller

PJAC Evaluation Impact Adviser and Senior Fellow

MDRC

Cynthia.miller@mdrc.org


Cindy Redcross

PJAC Evaluation Project Director and Deputy Director, Youth Development, Criminal Justice, and Employment Policy Area

MDRC

Cindy.redcross@mdrc.org


Melanie Skemer

PJAC Evaluation Project Manager/Impact Lead and Research Associate

MDRC

Melanie.skemer@mdrc.org


Louisa Treskon

PJAC Evaluation Implementation Lead and Research Associate

MDRC

Louisa.treskon@mdrc.org


Mary Farrell

PJAC Evaluation Benefit-Cost Lead and Executive Vice President

MEF Associates

Mary.farrell@mefassociates.org


Instruments

Instrument 1: Staff data entry on participant baseline information

Instrument 2: Study MIS to track receipt of services

Instrument 3: Staff and community partner interview topic guide

Instrument 4: Noncustodial Parent Participant Interview Protocol

Instrument 5: Staff Survey

Instrument 6: Staff Time Study

Instrument 7: Custodial Parent Interview Protocol

Attachments

  1. PJAC 60 Day Federal Register Notice

  2. MDRC IRB Approval Letter

  3. Parent Interview Announcement Letter

  4. Parent Interview Telephone Script

  5. Parent Interview Thank You Letter

  6. Staff Survey Reminder

  7. Staff Time Study Reminder

1Bradley, E. G. (2013, September). The Case for Procedural Justice: Fairness as a Crime Prevention Tool. Retrieved from Community Policing Dispatch: http://cops.usdoj.gov/html/dispatch/09-2013/fairness_as_a_crime_prevention_tool.asp

The five key elements of procedural justice as applied to the child support context are:

  • Voice and Participation – the parents’ perception that they have had the opportunity to tell their side of the story and that the decision-maker has taken the story into account in making the decision;

  • Neutrality of the Process – the parents’ perception that the decision-making process is unbiased and trustworthy;

  • Respect – the parents’ perception that the child support program treats them with dignity;

  • Understanding – the parents’ perception that they understand the process and how decisions are made;

  • Helpfulness – the parents’ perception that the child support program is interested in their personal situation to the extent the law allows.


2The sites are 1) Maricopa County, Arizona; 2) Riverside and San Bernardino Counties, California; 3) Muskegon County, Michigan; 4) Franklin County, Ohio; 5) Stark County, Ohio; and 6) Newport News and Richmond Districts, Virginia.

3Funding Opportunity Announcement HHS-2016-ACF-OCSE-FD-1171. https://ami.grantsolutions.gov/files/HHS-2016-ACF-OCSE-FD-1171_0.pdf


