WIOA Common Performance Reporting
30-Day FRN Public Comment and Agency Response
ICR REFERENCE # 201604-1205-008
OMB Control No. 1205-0526
Expiration Date: 08/31/2019
Departments of Labor and Education
Information Collection Request
Workforce Innovation and Opportunity Act (WIOA)
Common Performance Reporting
Summary of 30-day Federal Register Notice (FRN) Comments and Responses
Comments and Responses
General Comments:
After reviewing all of the documentation published to date regarding the WIOA common performance reporting, a few commenters stated that it is unclear who will be responsible for generating the State level reports. Under the Workforce Investment Act (WIA) process, States uploaded data to the EDVRS website which then generated the reports. The commenters questioned whether that process would continue or whether States would be directly responsible for generating State level reports on their own. The commenters indicated that additional time would be needed prior to full compliance with the reporting requirements if States will be responsible for generating the reports.
One commenter stated that the NPRM is not clear as to whether the Departments expect that Program Performance Reports are to be submitted as a summary analysis of the dataset on participants generated by each program, or simply tabulation of individuals served in different categories.
At least one commenter asked when the initial data collection format (e.g., spreadsheet, comma delimited text file, or software API) will be released for public review and comment. Another commenter expressed general concern about the burden imposed by the reporting requirements and indicated that the additional data collection and verification will create a significant increase in administrative time spent by front line staff and inhibit the counselor/customer relationship. The commenter stated that this will reduce the time available to provide direct services to program participants. The commenter asserted that there is no practical utility to the collection of data more frequently than on an annual basis. For WIOA core programs to accurately match participant level data, the commenter said it would be necessary to have a common data element, such as a unique identifier. The commenter asserted that the quality of much of the data collected is dependent upon the participant’s self-attestation.
Agencies’ Response, June 2016: The actual method of data submission will be detailed in future guidance. For DOL-administered programs, the file submission process will be similar to what it was under WIA, with an electronic system to accept individual files and aggregate reports. The Adult Education and Family Literacy Act (AEFLA) program administered by ED’s Office of Career, Technical, and Adult Education (OCTAE) will continue to collect information through its electronic data submission process. The VR program, administered by the Rehabilitation Services Administration (RSA), will utilize a secure file transfer process. The process is outlined in the VR program specific ICR.
The data collection instrument – the Participant Individual Record Layout (PIRL) – will be collected in a comma delimited format for the DOL-administered programs. The Vocational Rehabilitation (VR) program data collection instrument – the RSA-911 – will also be collected in a comma delimited format. As in the past, DOL and ED’s Rehabilitation Services Administration (ED/RSA) will each collect an individual record file on each participant for the titles I and III and the VR programs, just as they did prior to the enactment of WIOA. States are required to submit the PIRLs to DOL and ED/RSA quarterly, with a unique identifier that allows DOL and ED/RSA to match repeat or duplicate customers within each program without having to use Social Security Numbers (SSNs). States will continue to submit aggregate level data to ED/OCTAE for the AEFLA program, just as they did prior to the enactment of WIOA.
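The mechanics described above can be sketched briefly. The example below is a hypothetical illustration only: the field names and layout are assumptions for demonstration, not the official PIRL data element numbers. It shows how a comma-delimited individual record file with a unique identifier allows repeat or duplicate customers to be matched within a program without using SSNs.

```python
import csv
import io

# Hypothetical comma-delimited individual record file; the actual PIRL
# defines its own data element numbers and ordering.
sample = """unique_id,program,exit_quarter
A001,Title I,2016Q3
A002,Title I,2016Q3
A001,Title I,2016Q4
"""

def count_unique_participants(delimited_text):
    """Count distinct participants by the unique identifier column,
    so a repeat customer appearing in two quarterly submissions is
    matched without reference to an SSN."""
    reader = csv.DictReader(io.StringIO(delimited_text))
    return len({row["unique_id"] for row in reader})

print(count_unique_participants(sample))  # prints 2: A001 appears twice but is one participant
```

Here participant A001 appears in two quarterly records but is counted once, which is the matching behavior the unique identifier is intended to support.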
In the development of the data elements, the Departments considered the statutory requirements in the Rehabilitation Act of 1973, as amended, as well as those set forth in section 116 of WIOA for the common performance accountability system. Most of the data elements contained in the RSA-911 are explicitly required, whereas others are developed as necessary components of the calculation of performance indicators and/or report items. The data elements contained in the PIRL are required by WIOA. The VR program operates and is funded on a Federal fiscal year (October 1 through September 30) basis pursuant to sections 110 and 111 of the Rehabilitation Act, whereas the other five core programs of the workforce development system operate and are funded on a Program Year basis (July 1 through June 30). Because the WIOA program year and the Federal fiscal year are offset by one quarter (July 1 through September 30), RSA needs quarterly data submissions so that it can draw comparisons between both program and fiscal years. If the RSA-911 was collected less frequently than quarterly, it would be incompatible with the performance reporting template required under title I of WIOA jointly developed by the Departments and would necessitate that VR agencies track and report the same data using two different reporting calendars. In addition, several of the performance indicators – particularly the three employment-related performance indicators – require data that are based on specific quarters after a participant exits a core program. Therefore, by RSA revising the RSA-911 report to a quarterly reporting system, the burden imposed on States is minimized significantly.
The Department of Labor has determined that collecting data via quarterly reports has several benefits, including:
Access to more timely data to respond to data requests;
Access to more timely data for providing technical assistance, implementing corrective action plans, etc.;
Earlier identification of problems (with data collection or the programs themselves);
More data points for more precise evaluations;
More data points to improve the quality of the statistical model; and
Easier identification of seasonal trends in the data (participation, performance, or otherwise).
Under WIOA, programs will be able to report participants who are also receiving services from other core programs. However, there will not be a “universal” unique identifier across all core programs for each participant. The participant will be asked to identify what other WIOA programs currently provide them services. The costs included in the Supporting Statement are averages and, therefore, may be more or less than an individual agency’s actual cost.
No change has been made in response to these comments.
Transition to New Reporting Requirements – Burden:
A number of commenters asserted that start-up costs for the joint performance accountability system have been underestimated. Two commenters believed the costs have been underestimated for three reasons: (1) the specification does not match the report sample; (2) the specification has many cells with incomplete or missing data; and (3) the specification, when compared to the ETA 9169 WIOA Statewide and Local Performance Report, does not have the same calculations for the same reportable items. Another commenter asserted that the increased cost of $544,668 does not include any expenses associated with staff training to collect the new and modified data elements, nor does it include the staff time needed to collect new and modified data elements and review existing data on all current, open VR cases.
Another commenter expressed concerns about the time estimates listed in tables 5 and 6. Specifically, the commenter indicated that the time estimates in table 5 for participants to provide the information and the time estimates in table 6 for staff to collect that information are both underestimated by half.
A few commenters mentioned that there are two reporting systems that overlap and run simultaneously, namely the reporting requirements for both WIA and WIOA. The overlap of these reporting requirements, according to these commenters, requires additional maintenance and resources that are not accounted for in the burden estimates.
One commenter specifically asked for clarification on which reports will replace the ETA 9085.
Another commenter expressed concern about the implementation of the reporting requirements, especially for the VR program, since the reporting requirements are significantly different for that program than reporting requirements in the past. The commenter said the VR agencies need time to implement changes in data collection methodologies and systems. A different commenter expressed serious concerns about the requirement to report VR data quarterly, saying such reporting adds no value for the proper performance and functions of the VR agencies.
Finally, a commenter pointed out that narrative reporting requires local-level staff to pull and analyze data and manually enter the required report content as text. Consequently, the commenter asserted, it is not possible to aggregate a narrative report and generate Pay-for-Performance (P4P) numbers.
Agencies’ Response, June 2016: It is important to remember that the total burden is split across several ICRs. For example, there is associated burden included in this Joint Performance ICR for the joint elements under Wagner-Peyser, as well as additional burden accounted for under the DOL-only ICR. The current Wagner-Peyser ICR has also recently been renewed, and the WIA ICR 30-day public comment period closed on June 30, 2016, continuing the requirements for those reporting packages as well.
The burden associated with reporting under WIA requirements after July 1, 2016, involves keeping a data file for all individuals who exited on or before June 30, 2016 and matching those individuals to wage records to produce a “closeout” WIA report in late 2017. There is no expectation of dual reporting of participants under WIA and WIOA. The Departments will address some of the other comments under other more specific related topical headings.
The revised Supporting Statement includes both staff training costs and the staff time needed to collect new and modified data elements for existing participants.
The Departments will provide joint guidance regarding the timing for the submission of reports, as well as guidance on certain new data elements (e.g., post-exit employment) in the near future. ED will provide guidance on certain new reporting requirements relevant to the VR program (e.g., retroactive reporting) in the final RSA-911 documentation.
The response above regarding quarterly reporting requirements and the statutory provisions for collection of data elements is also helpful in responding to these comments. As indicated in the Joint Performance ICR responses to comments, the Departments will also be issuing additional guidance related to the use of UI wage data records.
Regarding the Pay-for-Performance report, there is no requirement that each local area produce its own report. Rather, the State would provide one cumulative report for all Pay-for-Performance contracts within the State. If a State is using one statewide case management system, as ETA recommends, this aggregation can occur at the State level without adding burden to the local areas.
Definitions:
A few commenters sought clarification with respect to the definitions of “participant,” “reportable individual,” and “exit.” Specifically, one of these commenters wanted confirmation that: (1) an individual who receives staff-assisted supportive services, after being determined eligible for a program, is a “participant” even if he or she has not received other staff-assisted services; (2) receipt of continued staff-assisted supportive services (e.g., career services or training services), upon the completion of other staff-assisted services, by a “participant” means the participant has not yet met the definition of “exit;” and (3) a “reportable individual” who meets the definition of “participant” can still be considered a “participant” even if he or she refuses to provide all of the data elements requested on the PIRL of the Joint Performance ICR, such as those data elements related to “barriers to employment.”
Another of these commenters requested clarification on: (1) how “reportable individuals” are to be reported on the Joint Performance ICR PIRL because the “reportable individual” column is blank; (2) which data elements on the PIRL must be reported in order for an individual, particularly one who accesses services via technology, to count as a “participant,” rather than a “reportable individual;” and (3) what constitutes significant “staff-assisted” services and when that level of service is achieved.
In order to ensure clear and accurate counts, as well as consistency of the data among all programs, one commenter recommended uniform definitions, including a definition of “staff-assisted services.” This commenter also suggested that all participants receiving services from more than one program funded under titles I and III should be co-enrolled. Another commenter requested confirmation that the Departments intend for States and local areas NOT to provide staff-assisted career and training services to individuals who decline to provide “participant-required information,” specifically answers to questions regarding any barriers to employment they may have, especially in those instances where the PIRL does not allow for a “Participant did not self-identify” answer. If this is not the Departments’ intent, the commenter requested that the Departments clearly explain their intent.
One commenter strongly opposed the Departments’ decision to allow duplicate counts of participants who exit a program more than once in a particular program year. This commenter was especially concerned about the impact this decision would have on the AEFLA program, which has not historically permitted duplicate counts of its participants. Another commenter sought clarification on how the calculation would be done if a participant exits a program twice in a program year and attains a credential prior to the second exit. The commenter believes that the participant would be counted twice in the credential attainment performance standard and only once in the numerator for the credential attained.
Agencies’ Response, June 2016: The Joint Final Rule and Final Joint Performance ICR define “participant,” in short, as an individual who has satisfied all programmatic requirements for the receipt of services, such as eligibility determination, and who receives services other than “self-services” and “information-only services.” Both the Joint Final Rule and the Final Joint Performance ICR focus on the demarcation provided by “self-services and information-only services” to distinguish between a “reportable individual” and a “participant,” rather than on the nature of “staff-assisted services” as was done in the Notice of Proposed Rulemaking (NPRM) and proposed Joint Performance ICR. Neither the Joint Final Rule nor Final Joint Performance ICR defines “participant” in terms of the receipt of staff-assisted services. With respect to the core programs authorized under titles I, II and III, a “participant” “exits” a program, in short, when 90 days of no services have lapsed and there are no other services planned; a “participant” “exits” the Vocational Rehabilitation (VR) program when the individual’s VR record of service is closed in accordance with VR program requirements. If a participant continues to receive staff-assisted career services or training services, the participant would not satisfy the definition of “exit” for any of the core programs because the participant continues to receive services other than self-services or information-only services. As such, the individual would continue to be an active “participant.” While the “barriers to employment” elements are required to be captured by WIOA, the Departments recognize that some of these PIRL characteristics may not be able to be captured in all cases, hence the ability to leave data elements blank in some cases. 
Refusing to provide characteristic information will not preclude the participant from receiving WIOA services in any core program because program eligibility is not dependent on the provision of such information.
The Departments have identified the data elements required for reportable individuals in the program-specific ICRs by placing an additional column in the PIRL to identify which data elements must be reported for “reportable individuals.” These data elements apply to titles I, III and IV. The AEFLA program may expand its title II-specific ICR to collect data on reportable individuals in the future. The Joint Final Rule clarifies that a determination of whether an individual is a “participant” is based on the services received, not the information the individual provides. The Departments will provide more guidance on when an individual receives services that would constitute services other than self-services and information-only services, thereby satisfying the definition of “participant.”
The decision to utilize duplicated counts in AEFLA was made to support common definitions across all core programs. The title II-specific ICR is reformatted to allow more than one program entry and exit in a program year by a participant (each of these periods from entry to exit is referred to as a “period of participation”) to be recorded in a manner that is as undisruptive as possible to the AEFLA program. Core indicators will be calculated based on each period of participation; for exit-based indicators, a participant with multiple exit dates within a program year will be included once for each period of participation. A participant who exits and reenters in the same program year will likewise be counted multiple times in the measurable skill gains indicator.
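The arithmetic behind period-of-participation counting can be made concrete. The sketch below is an illustrative assumption, not an official calculation specification: a participant who exits twice in a program year, attaining a credential before the second exit, contributes two periods to the denominator of an exit-based indicator but only one to the numerator.

```python
# Hypothetical record of periods of participation for one participant.
# Field layout is assumed for illustration; it is not the PIRL layout.
periods = [
    # (participant_id, exit_date, attained_credential)
    ("P1", "2016-09-30", False),  # first exit: no credential yet
    ("P1", "2017-03-31", True),   # reentered, exited again: credential attained
]

denominator = len(periods)  # each period of participation counts once
numerator = sum(1 for _, _, attained in periods if attained)
rate = numerator / denominator
print(f"{numerator}/{denominator} = {rate:.0%}")  # 1/2 = 50%
```

This reproduces the scenario the commenter raised: the participant is counted twice in the indicator's denominator and once in its numerator.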
No change has been made in response to these comments.
Common Exit:
A few commenters requested clarification on “common exit.” Of these, one commenter expressed strong support for the common exit strategy and encouraged the Departments to allow States the flexibility to implement such a strategy for all core programs, not just the DOL-administered core programs authorized under titles I and III of WIOA. The commenter asserted that allowing such strategy for all core programs would ensure data comparability and strengthen the vision of WIOA for shared accountability and alignment of services and data. This commenter further expressed strong support for the Departments to consider permitting States to implement a process for integrated “periods of participation” (i.e., when a participant enters and exits a program more than once in a program year) with common exit, thereby ensuring shared accountability across all services and reducing duplication of data.
Another commenter requested that the Departments issue more guidance and technical assistance in order to explain how “exit” should be used for those programs operating under a common exit policy versus those programs not included under a common exit policy. The commenter suggested that the Departments should consider eliminating the exit box (quarter) and go to an all-outcome-related performance model that records all outcomes across all organizational agencies.
Agencies’ Response, June 2016: The Departments agree with the commenter that WIOA envisions shared accountability and alignment of services and data. To that end, and to ensure comparability of data, all programs reporting under section 116, namely the six core programs plus other DOL-administered programs authorized under WIOA, will be using common definitions of “participant” and “exit.” For purposes of the Final Joint Performance ICR, the Departments have clarified that a participant who enters and exits a core program more than once in the same program year will be counted, for performance calculation purposes on exit-based indicators, each time the participant exits. We further clarified that the period of time for which the participant receives services prior to each exit is commonly known in some programs as a “period of participation.”
In the Joint Final Rule and the Final Joint Performance ICR, States will be permitted to implement common exit policies for all DOL-administered programs, both the core and required one-stop partner programs authorized under WIOA. If a State chooses to implement a common exit policy for some or all of the DOL-administered programs, the policy must require that the participant is reported as exiting, for performance accountability purposes, only once the individual has exited all programs from which he or she was co-enrolled. The Departments are allowing States the flexibility to implement common exit policies for DOL-administered programs because some States had implemented common exit for these programs under the WIA, and the Departments wanted to encourage continuation of such practice. As the Departments noted in the comment and analysis summary for the Joint Performance ICR 30-day Notice published on April 26, 2016, to require, or even permit, such integration between the DOL-administered and ED-administered core programs is not practical because States typically have not developed data management systems that integrate DOL and ED programs. Although we understand the commenter’s desire to integrate all core programs through a common exit strategy, we also are mindful of the many, equally strong, recommendations received in response to the NPRM and the proposed Joint Performance ICR to not require or permit common exits for all core programs. The Departments also are mindful that a common exit strategy would be an entirely new and untested concept for the AEFLA and VR programs. Therefore, although common exits will not be permitted for the AEFLA and VR programs, ED intends to study the feasibility of a common exit strategy for these programs. Furthermore, the Departments intend to issue additional guidance regarding co-enrollment and common exits. 
In response to these comments, the Departments have revised the Joint Performance ICR to permit States to include other required one-stop partners that are administered by DOL to be included in a State’s common exit policy if a State chooses to implement such a policy.
Co-Enrollment – Unique Identifier:
One commenter sought clarification regarding co-enrollment of participants with multiple core programs and the programs’ use of a unique identifier for each of those participants. Page 62 of Appendix A makes clear that co-enrollment data will be aggregated at the State level for the AEFLA program, but will be aggregated at the Federal level for the titles I and III and VR programs. The commenter specifically sought guidance on how this aggregation of data would be possible if all the programs do not use the same unique identifier for each participant who is co-enrolled in multiple programs.
Agencies’ Response, June 2016: While not required by statute or regulations, States may use the same unique identifier across all core programs, so long as all Federal and State laws and regulations governing the confidentiality of personally-identifiable information (PII) are satisfied. The Departments will address other methods for determining and tracking participants who are co-enrolled in future joint guidance. No change has been made in response to this comment.
Assurance of Confidentiality:
One commenter pointed out a potential error in the Supporting Statement to the Joint Performance ICR in which the Departments state that titles I-III will not collect SSNs. The commenter believes this statement is inaccurate because there would be no way to tie a participant of more than one core program together for reporting purposes if SSNs are not collected. The commenter requested clarification.
Agencies’ Response, June 2016: States will be expected to collect SSNs in order to conduct wage record matching. However, there is no expectation for States to submit those SSNs to DOL. OCTAE, which administers the AEFLA program, does not collect, and States do not submit, individual records for AEFLA participants. For purposes of the AEFLA program, States maintain individual records that contain PII. In accordance with section 501(b)(2) of WIOA, RSA has determined that the collection of SSNs is necessary to the proper administration and program management of the VR program, authorized by title I of the Rehabilitation Act of 1973 (Rehabilitation Act), as amended by title IV of WIOA. Section 131(a)(B)(ii) of the Rehabilitation Act requires the Secretary of Education to enter into a memorandum of understanding for the purposes of exchanging data of mutual importance with the Social Security Administration. No change has been made in response to this comment.
Exclusions from Performance Accountability – Participants without Valid SSNs:
The Departments received several comments regarding whether certain populations or data elements would be “excluded” from the calculations for purposes of the performance accountability system. In particular, these commenters expressed concerns related to individuals who do not provide valid Social Security Numbers (SSNs), regardless of the reason. One of these commenters strongly recommended that the lack of a valid SSN should constitute an “other reason for exit” in the PIRL and that these particular participants should be excluded from performance outcome calculations. In the alternative, this commenter said that the Departments should allow States to use a system similar to that used by the AEFLA program. For purposes of that program, States use the National Reporting System for Adult Education (NRS), which takes the performance rate for those participants who are found to have achieved an outcome (e.g., employed in the second quarter after exit) and use it to project the number of participants achieving the outcome out of the universe of participants who are not available for wage record matching because of a lack of a valid SSN. This commenter explained that the net effect of either approach – exclusion from the calculations or the NRS approach – would be the same, specifically participant records without valid SSNs would not: (1) be treated as failures; (2) negatively impact reportable performance; and (3) put States and local Boards at risk of sanction or other corrective action.
Multiple commenters expressed concern regarding the increased burden on the States’ staff and resources to conduct the follow-up activities. One of these commenters stated that if the Departments do not adopt one of the alternatives described above, States will be forced to make a decision between spending money on service delivery and risk sanction and potential loss of funding or spend less on service delivery and more on administration doing follow-up activities. One commenter encouraged the Departments, regardless of their decision on the above-described alternatives, to provide a means to track and report that information in the PIRL in order to understand the scope and potential impact of the issue.
Another commenter agreed that participants with pseudo SSNs should be included for purposes of performance accountability calculations for the title I core programs, but should not be included in the calculations for the title III program. The commenter noted that Wagner-Peyser is not a case managed program and that, therefore, such a requirement would put a large burden on the program to successfully contact participants and gather supplemental data. Such burden could negatively impact the program’s overall performance and pose a significant cost burden.
One commenter encouraged the Departments to allow programs that could demonstrate due diligence at getting SSNs, but were unable to get the information, to exclude these participants from performance calculations. In so doing, the programs would not be penalized for not getting the information necessary to match wage records. Finally, one commenter noted that code 98 (not a valid SSN) was removed from the list of “other reasons for exit” (code 923).
Agencies’ Response, June 2016: Because the number of participants with missing or non-valid SSNs could be significant for some programs, completely excluding these participants from program accountability measures could result in a significant gap in performance data and, therefore, such policy would not be aligned with the requirements of WIOA to obtain outcomes on all participants. For this reason, the Joint Final Rule and the final Joint Performance ICR do not permit participants without a valid SSN to be excluded from the performance accountability calculations, but do permit States to rely on supplemental information to satisfy the performance accountability requirements when a valid SSN is not known. It is important to note that some States have developed highly reliable algorithms to perform wage data matches using a combination of other personally-identifiable information (PII). Such data matching algorithms eliminate the need for SSNs and reduce the burden associated with more labor-intensive follow-up methods.
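To illustrate the idea of matching wage records on a combination of other PII when no valid SSN is available, the sketch below builds a simple normalized match key from name and date of birth. This is a minimal illustration of the concept only; a production State matching algorithm would be far more robust (fuzzy name comparison, multiple match passes, tie-breaking), and the function shown is hypothetical.

```python
import unicodedata

def match_key(first, last, dob):
    """Build a normalized match key from combined PII (name + date of
    birth) so the same person can be matched across records without an
    SSN. Normalization strips accents, case, and surrounding whitespace."""
    def norm(s):
        # Decompose accented characters, drop the combining marks,
        # then lowercase and trim.
        return (unicodedata.normalize("NFKD", s)
                .encode("ascii", "ignore").decode()
                .strip().lower())
    return (norm(first), norm(last), dob)

# Two records for the same person, differing only in case and accents:
a = match_key("José", "Smith", "1990-05-01")
b = match_key("jose", "SMITH ", "1990-05-01")
print(a == b)  # True: both normalize to the same key
```

The design point is that deterministic keys over several PII fields can substitute for an SSN in a data match, which is the approach the response attributes to some States.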
The performance calculation formulas used by the AEFLA program’s NRS were developed to yield valid and reliable data collected through sampling. Under WIA, States that conducted surveys to collect outcome data on large numbers of participants without a valid SSN had the option to use sampling procedures instead of surveying the entire universe of participants. For NRS purposes, participants without valid SSNs were never excluded from the denominator for any indicator or from sampling procedures. States were required to follow up on all participants, including those without a valid SSN. To allow for the possibility that some States might collect data through a combination of universe and representative cohorts for the same measure, the NRS applied the statistical formula to all follow-up permutations, including scenarios that relied on an exclusive universe data match. This approach worked for the NRS under WIA, because it collects aggregate State data. However, it would not be feasible for Federal data systems that collect individual records. To align with performance reporting procedures of other WIOA core programs, the NRS will not allow sampling and thus will not apply the performance calculation methods used for sampling under WIA.
The Departments recognize the concern that some States may incur additional burden to collect supplemental wage data due to missing SSNs. We have accounted for that data collection burden in the regulatory impact analysis (RIA) for the Joint Final Rule. As stated above, we remain convinced, despite the potential burden, such methods are necessary to ensure an accurate reporting of programmatic activities for those programs that might have higher rates of individuals who refuse to provide SSNs.
The Departments understand the commenter’s concern regarding the increased burden of reporting on individuals without a valid SSN, including the burden such a requirement imposes on the Wagner-Peyser Act Employment Service program. However, section 116 of WIOA requires that all program participants be tracked for purposes of the performance accountability system and, aside from the exceptions specified in section 116, there is no statutory basis to exclude participants from particular programs, as the commenter recommends.
The Departments disagree with comments indicating that a State should not be held accountable for its participants if supplemental data collection methods do not produce desired information. Such a policy could create disincentives to implement effective supplemental data collection procedures.
No change has been made in response to these comments.
Exclusions from Performance Accountability – Participants who are Criminal Offenders:
One commenter requested clarification regarding the addition of “criminal offender” to the list of “other reasons for exit.” Specifically, the commenter indicated that “criminal offender” is a characteristic rather than a reason for exclusion. The commenter asked whether the “criminal offender” data element is limited to the AEFLA program.
Agencies’ Response, June 2016: The Departments provide detailed guidance on this issue in the Joint Final Rule. The Departments have added such exclusions pursuant to comments we received in response to the Joint NPRM regarding those who are “not in the labor force.” The Departments, in general, disagreed with commenters who requested broad exclusions for purposes of the primary performance indicators for individuals who are not actively looking for work. However, the Departments recognize and acknowledge in both the final regulatory text and preamble for § 677.155 in the Joint Final Rule that there are limited circumstances under which certain participants, such as those who are incarcerated and receiving services under section 225 of WIOA, should not be included in the performance calculations for certain indicators. The Departments have decided to exclude such individuals receiving services under section 225 of WIOA because they do not have the opportunity to obtain employment or participate in education or training programs in the same manner as other participants who are in the general population. The Departments included the exclusion for section 225 participants from all calculations for performance accountability purposes except for calculations specific to the measurable skill gains indicator in § 677.155(a)(2) in the Joint Final Rule. No further change has been made in response to this comment.
Performance Periods for Reporting Purposes:
A few commenters requested clarification regarding which quarters were to be used for reporting periods of participation for purposes of the performance accountability system. These commenters, in particular, questioned the specific quarters that were to be used for the annual report. The commenters suggested there may be an error in the Joint Performance ICR with respect to a lag time in the quarters for some of the performance indicators. Another of these commenters expressed concern that the additional two-quarter lag time could affect the relevance of the data. Still another of these commenters asserted that the reporting period for the employer effectiveness measure does not reach back far enough to accurately capture the employee retention rate.
One commenter also sought clarification regarding the performance periods that should be reported for purposes of the “effectiveness of serving employers” performance indicator. Specifically, this commenter raised two concerns with Appendix D of the Joint Performance ICR:
Employee Retention focuses on remaining employed in the 2nd and 4th quarters post-exit. While the commenter disagreed with this being an exit-based measure, the commenter stated that it needs to follow the performance period for the Employed Q4 Post-Exit measure, not the Employed Q2 Post-Exit measure.
It is not clear why Employer Penetration and Repeat Employer Customer, which are “service-based” measures, much like “Participants Served”, need to be reported on a delayed cycle like Employee Retention.
Another commenter identified an apparent error for a performance period date for the Effectiveness in Serving Employers for PY 2018.
One commenter also expressed concerns about the performance periods to be used for reporting performance for the measurable skill gains indicator. Specifically, the commenter stated that a significant majority of individuals captured under the measurable skill gains indicator will be AEFLA participants, with the majority of those demonstrating gains through the “pre/post-testing” method. Therefore, the commenter suggested that the Departments should develop a lag that would work reasonably well for pre/post-testing. The commenter proposed setting the denominator one quarter prior to the “Participant Served” performance period, with the numerator based on any gain achieved prior to the end of the report quarter. The commenter also expressed concern about the impact that enrolling participants late in a program year could have on performance levels and suggested revisions to Appendix D to address these concerns.
Agencies’ Response, June 2016: Some of the lag times shown in Appendix D were inadvertently pushed back an additional, unnecessary quarter. In response to the comments and further review, the Departments revised Appendix D so that the necessary performance indicator data are obtained as quickly as possible. Please see Appendix D for the corrected time cohorts for each indicator.
To align all three sub-measures of this indicator, States should submit data for all parts of the indicator based on the same time period for each.
As stated in the 60-day Notice’s Summary of Comments and Responses, section 116(b)(2)(A)(i)(V) of WIOA makes clear that the measurable skill gains indicator is not limited to those participants who have exited a program. Of all of the primary indicators described at section 116(b)(2)(A)(i) of WIOA, the measurable skill gains indicator in section 116(b)(2)(A)(i)(V) is the only one that refers to participants achieving an indicator “during a program year.” Therefore, participants are included in this performance indicator at the point at which they become a participant.
No changes have been made in response to these comments.
Submission of Performance Reports:
A few commenters expressed concern that States and local areas will not be able to submit the first quarterly report for PY 2016 in November 2016 as expected. These commenters indicated that the delay in the publication of the Joint Final Rule and the final Joint Performance ICRs has delayed States and local areas in revising their data systems to accommodate all of the requirements. Therefore, at least one of these commenters strongly recommended that States be permitted to submit only an annual report for PY 2016 and begin submitting quarterly reports for PY 2017. Another commenter expressed concern that data could be missing from the early reports if additional time is not given to the States for submitting the reports. One commenter questioned the need to negotiate levels of performance for purposes of reporting outcome data for PY 2016 and requested guidance on this issue.
Agencies’ Response, June 2016: The Departments are cognizant of the work required by States to adhere to the WIOA performance requirements and acknowledge that time is needed for the implementation of those data requirements. However, WIOA requires that reporting begin with PY 2016 (which begins July 1, 2016). It is expected that States will be able to meet the performance requirements at differing points throughout the program year. The Departments will work with States on an individual basis to provide technical assistance with the goal of producing PIRL-driven data files as quickly as possible. No changes have been made in response to these comments.
Negotiations of Levels of Performance for PY 2016:
At least two commenters questioned whether States were required to negotiate levels of performance with the Departments for purposes of reporting outcome data for PY 2016 and requested guidance on this issue. One of these commenters asked whether States would be given leniency in meeting negotiated levels of performance in PY 2016 and PY 2017 since performance data in those years will not include all four quarters of data. Another commenter questioned whether there would be negotiations in PY 2017.
Finally, a different commenter requested information about the protocol if the State can demonstrate that the Federal statistical adjustment level is inaccurate due to fields included in the State model which have been left out of the Federal statistical adjustment model.
Agencies’ Response, June 2016: With respect to guidance related to the negotiations process, OCTAE issued guidance in April 2016 applicable to the AEFLA program. OCTAE completed the negotiations process with States by June 30, 2016. DOL issued Training and Employment Guidance Letter (TEGL) 26-15 on June 29, 2016 applicable to the DOL-administered programs and expects to complete the negotiation process with States by August 15, 2016. RSA will issue guidance applicable to the VR program prior to the submission of the Unified or Combined State Plan modifications in 2018, which will be the first year that RSA will negotiate levels of performance with States. As the commenter stated, the Departments recognize that States will not have full performance data reporting in PY 2016 and PY 2017. Levels of performance and sanctions will only be based on the data that can be collected and reported at the time. The Departments will address this issue in future guidance.
The variables included in the Federal statistical adjustment model will be identical to those variables included in State models. In fact, there is only one model framework that will be applied to all core programs federally, at the State level, and locally.
No change has been made in response to these comments.
Eligible Training Provider (ETP) Reporting -- Burden:
Several commenters expressed concerns about the ETP reporting requirements and the burden they could impose. One of these commenters was concerned that the Departments were considering requiring a narrative portion of the Annual ETP Report. This commenter recommended that the Departments optimize their reporting requirements in such a way that they gather maximum information with minimal effort on the part of the ETPs. This commenter expressed concern that ETPs may drop out of the system if faced with burdensome reporting requirements.
The commenter also expressed concern about the burden that could be imposed on ETPs if they were required to do both an annual report under section 116 of WIOA and a recertification report under section 122 of WIOA. The commenter recommended that the two reporting requirements be aligned in order to maximize value with minimized effort.
One commenter encouraged the Departments to use the transition authority under section 503 to delay or phase-in the ETP reporting requirements. In so doing, the Departments would be giving States the time to work with partners in higher education to access individual or aggregate data that ETPs already submit to those partners to minimize duplicative reporting on ETPs.
Furthermore, a commenter said it would be unreasonable to expect an ETP to conduct manual follow-up activities in order to comply with the requirements imposed by the primary performance indicators for those participants who do not provide a valid SSN. This commenter asserted that ETPs receive a small amount of funding under WIOA – sufficient to serve only 5 percent of their trainees and students – and that such manual follow-up requirement would be too burdensome.
Agencies’ Response, June 2016: To be clear, the Departments are not requiring a written report for ETPs, but are requiring such reports for Pay-for-Performance contracts.
With respect to the concerns raised regarding the reporting required by section 116 and the recertification reporting required by section 122 of WIOA, the Departments have addressed these concerns in the final Joint Performance ICR.
Since the ETP report is an annual requirement, the Departments anticipate that States will begin collecting the requisite ETP data as soon as possible, with the first report being due by October 1, 2017.
Because the primary performance indicators – including the employment-related performance indicators – are required for ETP reporting, the SSN is a vital part of conducting the wage match to produce performance outcomes. As with the States themselves, if an ETP does not produce an SSN for wage matching, the State must obtain that SSN through follow-up, or obtain the actual performance outcome information through follow-up means.
No further changes have been made in response to these comments.
ETP Reporting – Miscellaneous:
One commenter recommended that State laws or regulations supersede any requirement that the Departments would impose, if such requirements are more stringent, with respect to the suppression of data to protect the confidentiality of personal information in the case of small sample size for any given data cell.
Another commenter noted that one of the data elements required to be reported in relation to ETPs is the “percentage of participants in a program, authorized under WIOA title I who received training services and obtained unsubsidized employment in a field related to the training received.” The commenter added that Unemployment Insurance (UI) wage data do not accurately identify each individual’s occupational title; apart from data input by title I staff regarding employment outcomes, this measure cannot be determined using wage data alone. The commenter was unclear how the ETP report for all students will be able to capture this information without significant burden to the ETP.
This same commenter noted that one of the measures to be included in the ETA-9171 is the percentage of individuals who obtain a secondary school diploma or its recognized equivalent, which would appear to imply that the ETP list must include all high schools and all AEFLA programs and any others that offer high school completion or high school equivalency training. If this is the case, the commenter stated that this would require significant additions to current ETP lists and would require significantly more time to enter and approve these programs.
One commenter expressed concerns about the definition of “type of entity” for purposes of ETPs. A “type of entity” includes:
Institution of Higher Education that only awards, or the majority of credentials awarded are, associates degrees; and
Institution of Higher Education where the majority of credentials awarded are community college certificates of completion.
The commenter explained that the proposed distinction is a fluid one in a world of increasing use of technical associate’s degrees and stackable certificate programs, in which students earn a series of certificates that accumulate toward an associate’s degree. Given that some institutions might shift back and forth across this majority rule from year to year, data collected under it would be of uncertain meaning, particularly for use in identifying trends. Therefore, the commenter cautioned against adopting such a narrow definition of the term when it is not defined in WIOA. Another commenter raised questions about various entities and asked whether they would meet the definition of “type of entity” for purposes of qualifying as an ETP. The same commenter raised a question about an apparent overlap between other definitions related to “program of study” and “by potential outcome.”
A commenter also noted that the CIP code set underlying the “Program of Study CIP Code” data element may be inadequate. Given the extension of ETP-like processes created by sections 122(h) and 123 of WIOA and further elaborated in the NPRM, the commenter said the Departments should evaluate whether types of training services not covered in the CIP may be acquired through individual training accounts (ITAs) or based on an ETP-like performance process, such as entrepreneurial training or pre-apprenticeship. Another commenter indicated that the six-digit (not four-digit) CIP code plus the program of study outcome is not sufficient information to identify a single program of study. The commenter suggested that the name of the program of study should be collected along with the name of the ETP. Another commenter stated that the “program of study” codes are ambiguous because the categories are not mutually exclusive. For example, code 1, “program leading to an industry-recognized certificate or certification,” can also be one of the modules in code 4, “program leading to an Associate’s degree,” and possibly could be confused with codes 6, 8, and 9.
Another commenter expressed concern that the ETP reporting requirements under section 116 of WIOA are inconsistent with the Governor’s authority with respect to ETP reporting and the information that must accompany ETP lists under section 122. The commenter believed it would be more appropriate for the Governor to develop a reporting system and that the section 116 ETP reporting should be based on the system created by the Governor.
Agencies’ Response, June 2016: DOL intends to issue guidance regarding ETP requirements, including requirements for the suppression of data to protect the confidentiality of small sample size of trainees or students in any given data cell.
The DOL-only final regulations at 20 CFR 680.200(d)(1) outline the types of entities qualified to be an ETP: institutions of higher education that provide a program which leads to a recognized postsecondary credential; entities that carry out programs registered under the National Apprenticeship Act (29 U.S.C. 50 et seq.); other public or private providers of training services, which may include community-based organizations and joint labor-management organizations; and eligible providers of adult education and literacy activities under title II of WIOA if such activities are provided in combination with training services described at § 680.350. Community colleges are considered institutions of higher education. Regarding Registered Apprenticeship, § 677.230(b) specifies that the registered apprenticeship programs referred to are those registered under the National Apprenticeship Act. Regarding the overlap of program of study categories, the overlap was intentional. However, a revision was made to the ETP definitions that instructs States to “list all programs that apply,” so States will not have to select only one type. Specific examples that do not fit the majority of cases will be discussed further in ETP-related guidance.
The Departments agree with the commenter that CIP codes are a maximum of six digits, but we are requesting only the first four to help reduce coding burden and to broaden the categories of programs of study. Therefore, only the first four CIP digits must be reported. Use of the established list of CIP codes by ED provides a uniform list from which all participants and case managers may select. With respect to the codes for program of study categories, the coding structure was designed to capture the goal related to a program of study, namely the specific degree or certificate attached to completion of the training program.
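The truncation rule described above — report only the first four digits of a CIP code — can be illustrated with the following sketch. The function name and sample code are hypothetical; they are not part of the ICR specifications.

```python
def truncate_cip(cip_code: str) -> str:
    """Reduce a full CIP code (up to six digits, e.g. '51.3801')
    to the four-digit series used for ETP reporting."""
    # Drop the separator dot, keep the first four digits,
    # then restore the 'NN.NN' display format.
    digits = cip_code.replace(".", "")
    first_four = digits[:4]
    return f"{first_four[:2]}.{first_four[2:]}"

# A six-digit code collapses to its four-digit series;
# a four-digit code is unchanged.
print(truncate_cip("51.3801"))  # -> 51.38
print(truncate_cip("51.38"))    # -> 51.38
```

Broadening to the four-digit series, as the response notes, groups closely related programs of study under one reporting category.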
The commenter is correct that there is no automated way to determine “training-related employment.” As was the case under WIA, case manager intervention is required to determine whether the employment gained was related to the training received.
The Departments want to make clear that there is no requirement to add all high schools and/or AEFLA programs. Rather, States and ETPs have a location in the report template to display the number of secondary school diplomas that were earned.
The Departments believe that while WIOA sections 116 and 122 are related, the requirements within each are different. For example, there are distinct requirements for ETPs to report on under section 116, which is what is included in this ICR. Additionally, there will be more information regarding the ETP reporting requirements in an upcoming amended ICR which will include the form/layout of the final data report.
Reporting on Reportable Individuals:
One commenter supported the Departments’ decision not to capture the Joint PIRL data elements for reportable individuals who have not yet met the definition of a “participant.” The commenter stated that this approach will reduce the burden on programs, particularly the title III program.
Agencies’ Response, June 2016: The data collection requirements for Reportable Individuals are unique to each core program and, therefore, will be included in each core program’s ICR. No change has been made in response to this comment.
Joint PIRL -- Pre-Participation Employment/Earnings:
One commenter had strongly encouraged the Departments, in response to the 60-day FRN on the Joint Performance ICR, to collect pre-participation earnings information. The commenter believed such information is one of the best predictors of post-participation earnings. In response to that comment, the Departments declined to make the change. The commenter strongly encouraged the Departments to reconsider and collect pre-participation earnings information in the Joint PIRL.
Agencies’ Response, June 2016: The Joint PIRL only collects the WIOA-required data elements. However, the Departments agree with the commenter that pre-participation earnings information can provide valuable information on a participant. Therefore, the Departments have revised the DOL program-specific and the VR program-specific ICRs to collect pre-participation wages. No change has been made to the Joint Performance ICR in response to this comment.
Joint PIRL -- Changing Customer Characteristics:
One commenter expressed concern regarding whether data related to changes in customer characteristics should be updated or remain the same as at the time of participation in the program. Specifically, the commenter believed that some characteristics, by their very nature, should not be updated (e.g., long-term unemployment at application), whereas others should be updated (e.g., disability and employment status). The commenter stated that updated data for characteristics such as these are relevant for application of the statistical adjustment model, which is to adjust expectations based on the characteristics of the participants being served. Another commenter supported the collection of data related to all of the barriers to employment.
Agencies’ Response, June 2016: While there are advantages to updating certain variables, the Departments have decided not to update the data after initial collection for two reasons: (1) to be consistent with all data elements (instead of being able to update some variables but not all); and (2) to ease collection burden on States. Due to the length of time participants may be served in certain programs (e.g., the VR program), there are some data elements that may be updated over the period of program participation. These data elements are identified in the VR program-specific ICR.
The Departments concur with the commenter that the data pertaining to all relevant barriers to employment are important in developing expected, negotiated, and adjusted levels of performance. However, when developing the statistical adjustment model, the Departments considered the barriers outlined in WIOA sec. 116(b)(v)(II)(bb). Initially, no additional barriers were included in the development of the model. As the model evolves over time, the Departments may consider adjustments to the barriers to employment.
Annual Report Specs -- Participants Served/Expenditures/Average Cost:
One commenter identified a potential inconsistency between the plain text specifications and the technical specifications with respect to the “number of participants who received training services” during the reporting period. The commenter stated that the plain text specifications accurately describe the data to be collected, but that the technical specifications are not written in a way that clearly limits the data to those participants who received training services during the report period.
The commenter also identified another potential inconsistency with respect to expenditure and average cost specifications. For example, in the 60-day FRN for the proposed Joint Performance ICR, the Departments made clear that the total cost per participant for career services or training services is the total cost per career services per participant or total training service costs per participant. However, according to the commenter, the report specs do not match that guidance.
Agencies’ Response, June 2016: The cost per participant definitions will remain the same in the Joint Performance ICR. However, the Departments will issue additional guidance, including program-specific guidance, regarding what types of services are to be included in the broader training and career service categories. For the VR program, this information is included in the program specific ICR. No change has been made in response to this comment.
Performance Indicator – Credential Attainment -- Inconsistencies and Questions:
Several commenters sought clarification with respect to credential attainment specifications, variable names, data sources, and data element formats. One commenter noted several discrepancies between report and technical specifications for Item 28, Credential Rate Denominator, of Appendix C Performance Report Specifications, and that these discrepancies are relevant to the inclusion and exclusion of types of participants. Another commenter pointed out that an obsolete variable is referenced in the Credential Rate Denominator in Appendix C. Another commenter recommended that Federal agencies establish free access to National Student Clearinghouse (NSC) data and ensure that specifications for the date enrolled field be compatible with enrollment data obtainable from NSC.
Agencies’ Response, June 2016: The references to Adult Education, Job Corps, and YouthBuild have been removed from technical specification Item 28; the phrase “other than On-The-Job training or customized training” will be added to the plain text report specifications for Item 28, Credential Rate Denominator, of Appendix C Performance Report Specifications. The obsolete variable, Enrolled in Post-Secondary Education Program, was renamed as PIRL 1811, “Date Enrolled in Education or Training Program Leading to a Recognized Postsecondary Credential or Employment During the Program (WIOA)”; specifications for the Credential Attainment Rate will be edited to reflect this change in data element name. The Departments will provide future guidance on how States can provide the date enrolled data element. No other changes have been made in response to these comments.
Performance Indicator – Credential Attainment -- Inclusions/Exclusions:
Several commenters pointed to a lack of clarity about certain types of individuals who are not counted in the credential attainment rate indicator. Varied opinions were expressed about whether these groups should be included in credential attainment measures. A commenter expressed concern about including those Enrolled in Postsecondary Education at Date of Program Entry in the measure if the program or system did not fund the postsecondary education; the commenter noted that counting a participant who had received services for only a brief time might not be appropriate. Commenters noted the following exclusions: individuals enrolled in On-The-Job and customized training programs that do not lead to a credential; criminal offenders who exit a program upon obtaining work; and participants who score at low levels of literacy (e.g., those with low levels of English proficiency), unless they are enrolled in programs that provide instruction at or above the ninth grade level. Two commenters stated strong support for measuring credential attainment among all exiters and indicated that the Departments’ decision to exclude certain participants creates an indicator that measures enrollee success rates rather than overall credential attainment achieved under WIOA. Additional comments described the policy as creating disincentives for programs to enroll individuals with barriers to employment, because placing such individuals in training increases the risk of a lower credential attainment rate.
Agencies’ Response, June 2016: The denominator of the credential attainment rate indicator is defined as the number of participants who exited and were in a postsecondary education or training program during program participation PLUS the number of participants who exited and were in a secondary education program at or above the ninth grade level without a secondary school diploma. The denominator is defined to include whether there is a reasonable expectation of attaining a credential or certificate from that education or training, regardless of who is funding the education or training. Criminal offenders served under section 225 are not included in employment and earnings and credential attainment indicator calculations because they generally do not have reasonable access to programs through which they might attain a credential. Because most apprenticeship programs offer an industry-recognized credential upon program completion, apprenticeships would not be excluded from this indicator. In response to requests to include all participants who have exited in the credential attainment rate indicator calculations, the Departments believe this indicator should only measure those who reasonably have an opportunity to succeed, for instance, as a result of having ready access to programs offering credentials. The Departments believe that including all participants who have exited would, in effect, create an “indicator of funding” and one that measures who can receive training or education. No changes have been made in response to these comments.
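The two-part denominator definition above can be sketched as a simple filter over exiter records. The field names used here are illustrative placeholders, not the actual PIRL element names:

```python
def credential_rate_denominator(exiters: list) -> int:
    """Count exiters who were (a) in a postsecondary education or
    training program during program participation, OR (b) in a
    secondary education program at or above the ninth grade level
    without a secondary school diploma. Field names are hypothetical."""
    count = 0
    for p in exiters:
        in_postsecondary = p.get("in_postsecondary_ed_or_training", False)
        in_secondary_9plus = (
            p.get("in_secondary_ed_at_or_above_9th", False)
            and not p.get("has_secondary_diploma", False)
        )
        if in_postsecondary or in_secondary_9plus:
            count += 1
    return count

sample = [
    {"in_postsecondary_ed_or_training": True},                               # counted
    {"in_secondary_ed_at_or_above_9th": True, "has_secondary_diploma": False},  # counted
    {"in_secondary_ed_at_or_above_9th": True, "has_secondary_diploma": True},   # excluded
    {},                                                                      # excluded
]
print(credential_rate_denominator(sample))  # -> 2
```

Note that, per the response above, the filter does not ask who funded the education or training, only whether the participant was enrolled in it.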
One commenter was disappointed that the Departments chose to define the credential attainment rate in a way that will create disincentives for programs to place low-income or lower-skilled participants into education and training. Because the denominator of the credential attainment rate will include only those in education and training, the commenter argued, programs will have no incentive to take a chance on individuals with barriers to employment; by placing such individuals in training, a program risks a lower attainment rate. In the face of this disincentive, programs will likely limit the number of participants they place in education and training to reduce this risk.
The same commenter stated that the Departments’ response uses circular logic to support the decision not to use the statutory description of the credential attainment rate, referring to the Departments’ statement that “the indicator focuses on participants who are enrolled in an education or training program, because the purpose of the indicator is to measure success related to attainment of these credentials; therefore, it would not be reasonable to measure credential attainment against a universe that includes other individuals…” The commenter then asserted that the clear intent of the statute is for the term “program participants” to refer to participants in a WIOA program (e.g., title I-B adult, title II, title IV, etc.), not merely participants in the training component of a program.
Agencies’ Response, June 2016: The Departments disagree with the comment and believe the explanation provided clarifies the position. It is discussed in the preamble to the rule. No change has been made in response to this comment.
Performance Indicator – Credential Attainment -- Requests for Guidance and Technical Assistance:
A commenter requested the code to use for a participant who completed an apprenticeship in a trade without formal licensure.
Agencies’ Response, June 2016: This issue will be addressed in future guidance.
Performance Indicator – Measurable Skill Gains (MSG) – Inconsistencies/Questions about Specifications/Codes:
A few commenters sought clarification with respect to specifications, codes, and where to report completion of specific achievements. One commenter noted that references to the “Date of Most Recent Measurable Skill Gain: Performance” still appear in Appendixes B and C, although this variable is no longer allowable as a means of demonstrating a gain. Another commenter asked whether the response value for “Date Enrolled in Education or Training Program Leading to a Recognized Postsecondary Credential or Employment” is a date. Two commenters pointed to a lack of clarity around where to report completion of academic Associate’s or Bachelor’s degrees.
Agencies’ Response, June 2016: The report specifications have been revised. “Date of Most Recent Measurable Skill Gain: Performance” (Item #6) has been removed from specification #30 in the specifications tab of Appendix C; Appendix B has been removed from the final ICR. The name of data element 1811 is “Date Enrolled During Program Participation in an Education or Training Program Leading to a Recognized Postsecondary Credential or Employment.” It is a date field with Data Type/Field Length “DT8” and Code Value “YYYYMMDD”. Associate’s and Bachelor’s degrees would be recorded in data element 1800, “Type of Recognized Credential”.
Performance Indicator – Measurable Skill Gains -- Comparability of Training Gain Measures:
Another commenter pointed to a lack of comparability among training gain measures, noting that a high level of variance among additional reporting elements under the MSG indicator reduces the usefulness of the combined attainment rate.
Agencies’ Response, June 2016: The design of this report is to provide a break-out of each type of gain within the indicator. Summing each type of gain shows the total gains in a given area, but the performance accountability for this indicator looks at the number of participants who achieved a gain of any type. No change has been made in response to this comment.
Performance Indicator – Measurable Skill Gains -- Inclusion/Exclusion:
Several commenters expressed concerns about the impact of including or excluding participants from the MSG indicator. One commenter remarked that persons who had completed training but were still receiving program services are excluded from the MSG numerator. Another commenter recommended excluding persons whose time in a training program had been too short to show a measurable gain within a program year. A commenter pointed out that the MSG indicator appears to exclude Vocational Rehabilitation services that contribute to participant success as well as summer youth, entrepreneurial, and pre-apprenticeship training, and another noted that some individualized enhancement paths (e.g., a combination of academic and technical courses and training) would not be recognized as a measurable skill gain.
Agencies’ Response, June 2016: With multiple ways to succeed in this indicator, the participant does not need to be enrolled in training to be included in the numerator of this indicator. The Departments have determined that the specific mention of “during a program year” in the statute means this indicator is measured within a given program year, regardless of the time spent in the program. Regarding apparent exclusions from the MSG indicator, the Departments will release subsequent guidance on all six core indicators, providing a more detailed explanation for each. No changes have been made in response to these comments.
Performance Indicator – Measurable Skill Gains -- Requests for Guidance and Technical Assistance:
Several commenters requested clarification, guidance, and technical assistance. Several commenters stated that title II programs should be allowed to use all five types of MSG available to WIOA core programs instead of limiting these programs to achievement of one or more educational functioning level (EFL) gains. The commenters stated that expanding the use of the MSG indicator would enable the title II program to benefit from innovative practices and would support cross-program alignment on integrated education and training (IET) programs that blend title I funds for occupational training and title II funds for contextualized basic skills education bridge programs. Another commenter disagreed, stating that EFLs are the measures most relevant to title II objectives; other types of measurable skill gains are inappropriate or redundant. One commenter noted that a lack of comparability between training gain measures creates an incentive for selective enrollment in easier-to-complete programs (i.e., tracking). Several commenters strongly urged the Departments to (1) issue guidance on eliminating the practices of creaming (focusing service on those most likely to succeed) and tracking; and (2) provide technical assistance to States and local areas to ensure they understand and adhere to these principles.
Agencies’ Response, June 2016: The Departments agree with comments indicating that the AEFLA program should not limit the types of gains that may be used under the measurable skill gains indicator to only the demonstration of an EFL gain. The Departments will issue program-specific guidance regarding the most appropriate measurable skill gains options for the core programs. In response to concerns about tracking, the Departments note that in order to achieve a measurable skill gain, one must earn at least 12 academic credits if the career pathway calls for classroom training. No changes have been made in response to these comments.
Performance Indicator -- Employer Effectiveness -- General:
One commenter noted that it uses a business services measurement system in its database that allows staff to easily enter information and leadership to extract reports showing data by date and area. The commenter proposed that some of these service measures should be used in the sixth performance indicator. These services include Business Education, Hiring Event, Labor Market Information, and Training and Retraining, along with a handful of others. For each service, a survey is sent to employers when they access the service. A specific benefit of this model is that it was developed with a focus on learning how to serve employers better and ensuring that the right data is captured to drive improvements.
The commenter also questioned how employers were engaged in the development of these three metrics and which of these measures employers believe to have the most value. The commenter suggested that the final ICR should detail how employers were engaged in the development of these measures or clarify that employer input is part of the baseline period between now and PY 2018.
Finally, the commenter suggested that to clarify and support that these measures are reported on separately by each program, option 1 should read: "An employee retention rate of program participants..." rather than WIOA participants.
Another commenter said it was unclear whether the denominator for the Employer Retention Rate is the number of clients employed during the 2nd quarter following exit or the number of clients who exited the program two quarters prior.
Agencies’ Response, June 2016: In the preamble of the Joint Final Rule concerning § 677.155(a)(1)(vi), the Departments acknowledge the development of indicators by States and reiterate that States may test and use any additional measures related to effectiveness in serving employers, and report the results to the Departments. With respect to employer engagement in the development of the measures, the Departments held one town hall meeting specifically dedicated to employers in an effort to gain suggestions and ideas for the development of this indicator of performance. Additionally, we conducted a general town hall meeting in an effort to gather suggestions regarding all indicators of performance. Employers and other stakeholders were also invited to attend webinar sessions designed to gather feedback from participants. Finally, the Departments sought comments during the 60-day comment period and during the 30-day comment period on the ICRs.
The Departments agree with the comment regarding how option 1 should read, and have revised the language accordingly.
The technical specification for this indicator is found in Appendix C and reads as follows: (Retention with Same Employer in the 2nd and 4th Quarters After Exit Numerator ÷ Retention with Same Employer in the 2nd and 4th Quarters After Exit Denominator) × 100. This definition will not change as it is correct as written.
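The arithmetic of this specification can be sketched as follows. This is an illustrative computation only; the function and variable names are not official PIRL or ETA-9173 identifiers, and the zero-denominator convention is an assumption, not part of the ICR.

```python
def employer_retention_rate(numerator_count: int, denominator_count: int) -> float:
    """Retention with Same Employer in the 2nd and 4th Quarters After Exit,
    expressed as a percentage per the Appendix C specification:
    (numerator / denominator) * 100."""
    if denominator_count == 0:
        # No participants in the cohort; returning 0.0 here is an
        # illustrative convention, not an ICR requirement.
        return 0.0
    return numerator_count / denominator_count * 100

# Example: 340 participants retained with the same employer in both the
# 2nd and 4th quarters after exit, out of a denominator cohort of 500.
rate = employer_retention_rate(340, 500)  # 68.0
```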
No other changes have been made in response to these comments.
Performance Indicator -- Effectiveness in Serving Employers -- Implementation:
Several commenters supported the flexibility that the States have to pick two of the three Effectiveness in Serving Employers measures. One commenter requested clarification regarding whether a State will be choosing the same two of three measures for all of the WIOA core programs or whether each core program could choose a different set of measures during the trial period. One commenter expressed concerns that implementing a statewide employer effectiveness measure would be difficult because each program would be utilizing its own reporting system.
A few commenters noted that the numerator and denominator of the Repeat Business Customer measure appeared to be reversed. They stated that the denominator should be the total number served in the current year and the numerator should be a subset of the denominator. One commenter recommended that the Repeat Customer indicator include those served more than once in the current year. Another commenter recommended that the Departments modify the Effectiveness in Serving Employers report to break out Repeat Customer performance by size of employer to better account for State-specific differences when comparing data or target setting. One commenter recommended that States only be required to report data element 1618 (Retention with the same employer in the 2nd Quarter and the 4th Quarter after exit) if the State elects this as one of its measures for Effectiveness in Serving Employers. Another commenter requested clarification regarding whether employer services are required to be reported regardless of the two employer effectiveness measures chosen by the State.
Agencies’ Response, June 2016: For the trial period, each State will select two of the same three Effectiveness in Serving Employers measures to apply to all of the core programs within the State. The Departments encourage State core program agencies to work collaboratively to determine the one shared outcome for each selected measure. The Departments plan to issue guidance on this topic.
With respect to the Repeat Business Customer measure, while the Departments disagree with the recommendation to reverse the numerator and denominator, we agree that the proposed language is unclear and have modified the language to add clarity. In response to this comment, the Departments have modified the language to read:
Repeat Business Numerator:
Record the total number of unique business customers (establishments, as defined by the Bureau of Labor Statistics Quarterly Census of Earnings and Wages program) that received an employer service or, if it is an ongoing activity, are continuing to receive an employer service or other assistance during the reporting period (E1), AND that received an employer service anytime within the previous three program years.
Repeat Business Denominator:
Record the number of unique business customers (establishments - as defined by the Bureau of Labor Statistics Quarterly Census of Earnings and Wages program) that received an employer service anytime within the previous three program years.
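Read literally, the revised numerator and denominator definitions imply a set-based calculation like the sketch below. This is a hypothetical illustration: the establishment identifiers and the representation of "received an employer service" as set membership are assumptions, not part of the ICR specification.

```python
def repeat_business_rate(served_this_period: set, served_prior_three_years: set) -> float:
    """Share of unique business customers (establishments) served during the
    reporting period (E1) that were also served at any time within the
    previous three program years."""
    # Numerator: establishments served now AND in the prior three program years.
    numerator = len(served_this_period & served_prior_three_years)
    # Denominator: all unique establishments served in the prior three program years.
    denominator = len(served_prior_three_years)
    if denominator == 0:
        return 0.0  # illustrative convention for an empty prior-period cohort
    return numerator / denominator * 100

current = {"est-001", "est-002", "est-003"}
prior = {"est-002", "est-003", "est-004", "est-005"}
rate = repeat_business_rate(current, prior)  # 50.0
```

Because the sketch works on sets of unique establishments, serving the same business customer multiple times within a year does not change the result, which mirrors the Departments' point below about multiple services in a single program year.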
Regarding the suggestion to include those served more than once in the current year for purposes of the Repeat Customer measure, the Departments note that the fact that a business customer is served multiple times in a year would have no bearing on the calculation of the Repeat Customer measure. Even if a customer was served multiple times during a program year the resulting Repeat Customer measure calculation would remain the same. No change has been made in response to this comment.
The Departments concluded that additional burden would be imposed on the States if the Departments were to implement the recommendation that the Departments modify the Effectiveness in Serving Employers Report to break out Repeat Customer performance by size of employer. No change has been made in response to this comment.
Finally, retention with the same employer is one of the three options for this indicator. If a State selects this option, it must also provide data on either the Employer Penetration Rate or Repeat Customers. For the two measures that a State selects for employer effectiveness, the State must report the data in the Program Performance Report.
Annual Report Template -- Effectiveness in Serving Employers:
One commenter questioned why the annual report template requires the submitter to identify the program, when the employer effectiveness data are supposed to be reported in the aggregate across all six core programs. The commenter further questioned why the submitter must only identify the DOL-administered core programs.
Agencies’ Response, June 2016: The Departments agree with this comment and will delete the referenced programs from the annual report template. The Departments have deleted the contents of box #3 on the “Effectiveness in Serving Emp.” tab in Appendix C.
Performance Report Template and Specifications:
One commenter asked what the intent of the Program Performance Report (ETA-9173) is and what reports it would replace. Some commenters noted that the Departments posted two sets of differing data specifications for the Program Performance Report Template and Specifications (Appendix B and Appendix C) in the Joint Performance ICR. Another commenter noted that there were mistakes in the Program Performance Report Template and Specifications. For example, the data element titled Measurable Skill Gains: Enrolled in Secondary Education Program did not contain the code value “1”. Another commenter noted that in Appendix C, it is unclear whether the denominator for the Employee Retention Rate is the number of clients employed during the 2nd quarter following exit or the number of clients who exited the program two quarters prior.
Agencies’ Response, June 2016: The ETA-9173 is designed to display the aggregated values from the State data submitted using the PIRL format. For performance reporting under WIOA, the ETA-9173 will replace prior forms such as the 9002 or 9090. DOL is currently building a new reporting system to gather the WIOA-required data elements and will be providing guidance on this system and its implementation.
Appendix B was mistakenly included in the Joint Performance ICR and was duplicative of Appendix C. The Departments removed Appendix B from the Joint Performance ICR. Additionally, the Departments made several minor typographical edits to ensure consistency and uniformity. For example, the missing code value “1” was added to the data element titled Measurable Skill Gains: Enrolled in Secondary Education Program. The technical specification for the Employee Retention Rate indicator reads as follows: (Retention with Same Employer in the 2nd and 4th Quarters After Exit Numerator ÷ Retention with Same Employer in the 2nd and 4th Quarters After Exit Denominator) × 100. This definition is correct as written, and the denominator indicates the number of participants who were employed during the second quarter following exit. No change has been made in response to this comment.
Joint PIRL -- General Comments:
One commenter stated that it was unclear whether the published document DOL PIRL (ETA-9172) represented a data dictionary or the report layout. The commenter inquired as to when the initial data collection format would be released for public comment. One commenter recommended that the Departments provide detailed data specific SQL statements as part of the definition of each data element. According to the commenter, such technical specifications would remove any ambiguity, greatly reduce development time for programmers and database administrators, and reduce errors. One commenter noted that the column headers in the PIRL represent only DOL-funded programs and, therefore, it is not clear what programs each column represents.
One commenter recommended that in addition to collecting disaggregated data for Asian American (data element 212) and Native Hawaiian/Other Pacific Islander (data element 214) participants, the Departments should require States to collect disaggregated data on Asian American and Pacific Islander (AAPI) participants. One commenter requested that the code value “not in the labor force” identify those individuals who were not employed and not actively looking for work when they entered the programs. This commenter also supported inclusion of educational attainment in foreign countries in the definition of “highest educational level completed at program entry” (data element 408). Another commenter recommended that a code for youth placed in long term employment subsidized by some other non-governmental funding stream be included in the code values for title I-specific data elements 1900 and 1901.
One commenter requested clarification regarding whether agencies are required to collect all eligibility information at the time of providing staff-assisted services and how to serve individuals that refuse to provide information at program entry. The commenter also requested further clarification regarding Supportive Services.
One commenter requested clarification regarding whether VR agencies are required to report the post-exit data elements for the first and third quarters after the exit quarter.
One commenter stated that information regarding TANF, SSI and SSDI participation would be burdensome for agencies and programs that do not have access to a State system for verification of status. Several commenters noted that the reporting requirements for post-exit data are burdensome for programs that rely on data gathered by case managers.
Agencies’ Response, June 2016: The PIRL (ETA-9170) is not intended to be used as a data reporting form and, instead, represents a data collection instrument including the data element names, definitions, and code values. The actual data will be submitted in a comma delimited format for DOL programs. The VR data collection instrument is the RSA-911, and it will also be collected in a comma delimited format. The AEFLA data collection instrument is the AEFLA Reporting Tables in the National Reporting System for Adult Education.
With respect to the recommendation that the Departments include data specific SQL statements as part of each data element, writing such technical specifications would require additional resources and would be complicated by the variation in data element definitions among the core programs (e.g., participation, exit). The Departments provided a general template for SQL statements by creating a “plain text” specification as well as a pseudo-code version of the specification that is intended for use by programmers and database administrators. No change has been made in response to this comment.
The race categories included in the Joint Performance ICR reflect those categories defined in OMB Statistical Policy Directive 15, 62 Fed. Reg. 58782 (October 30, 1997). Furthermore, the level of disaggregation required is aligned with what is statutorily required by WIOA. The Departments considered the requests to add the code value “not in the labor force” to identify those individuals who were not employed and not actively looking for work when they entered the programs and to include educational attainment in foreign countries in the definition of “highest educational level completed at program entry” (data element 408); however, no change was made in response to these requests. The Departments have determined that the six code values included in title I data elements 1900 and 1901 are sufficient to accurately measure placement for youth in the 2nd and 4th quarters after exit.
With regard to the request for clarification regarding staff-assisted services, supportive services, and guidance on how to serve individuals that refuse to provide required information at program entry, the Departments will issue guidance to address these issues.
For the VR program, the employment data for the first and third completed quarters after exit are only required when necessary to document credential attainment for students who attained a secondary education credential. The collection of first and third quarter after exit employment and wage information is integral to the calculation of the credential attainment rate. Specifically, a participant who exited must meet BOTH of the following criteria:
Be enrolled in a secondary education program and obtain a secondary school diploma or its equivalent during the program or within one year after exit; AND
Be employed or enrolled in an education or training program leading to a recognized postsecondary credential within one year after exit.
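The two-part test above can be expressed as a simple predicate. This is a sketch only: the boolean parameters are stand-ins for the corresponding PIRL data elements, not official element names.

```python
def counts_in_credential_numerator(
    attained_secondary_credential_within_one_year: bool,
    employed_or_in_postsecondary_within_one_year: bool,
) -> bool:
    """A secondary-education exiter counts toward the credential attainment
    numerator only if BOTH conditions hold:
    (1) obtained a secondary school diploma or its equivalent during the
        program or within one year after exit, AND
    (2) was employed or enrolled in an education or training program leading
        to a recognized postsecondary credential within one year after exit."""
    return (
        attained_secondary_credential_within_one_year
        and employed_or_in_postsecondary_within_one_year
    )

# A diploma alone is not enough; the post-exit employment/enrollment
# condition must also be met.
result = counts_in_credential_numerator(True, False)  # False
```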
The collection of TANF, SSI and SSDI information is not dependent on an agency’s access to a State system to independently verify enrollment in these programs. This data can be collected from the participant via self-attestation or source documentation provided by the participant. Section 167(c)(2)(c) of WIOA requires the use of the primary indicators of performance described in sec 116(b)(2)(A), including the “percentage of program participants who are in unsubsidized employment during the fourth quarter after exit from the program,” and “the percentage of program participants who are in education or training activities, or in unsubsidized employment, during the fourth quarter after exit from the program.” These are statutory requirements. No changes have been made in response to these comments.
Joint PIRL -- Data Elements:
Joint PIRL -- Data Element 805 (Cultural Barriers):
One commenter stated three concerns related to this data element: 1) the collection of this data element is overly burdensome because program participants may be unaware of cultural barriers that could hinder their chance of employment; 2) it will be very difficult to pose this question to program participants in a way that does not appear judgmental or culturally biased; 3) the usefulness and scientific viability of the data element is suspect. Another commenter stated that the definition of “cultural barrier” is too narrow and relies on an inaccurate assumption that participants can understand and self-identify if they have cultural barriers to employment. The commenter recommended that the definition of “cultural barrier” be broadened to include limited English abilities. One commenter asked whether cultural barriers to employment at program entry refer to widespread beliefs by SSI/SSDI recipients that employment could result in the loss of health benefits.
Agencies’ Response, June 2016: Section 116(d)(2)(B) of WIOA requires the collection of data with respect to individuals with barriers to employment, which, based on the definition in section 3(24)(I) of WIOA, includes individuals facing substantial cultural barriers. This measure is based on the participant’s perception of himself or herself. The Departments will provide guidance on collecting this data and may use this data to help design service delivery approaches and inform future guidance. The definition of “individuals with barriers to employment” in section 3(24)(I) also includes individuals who are English language learners, and individuals with low English proficiency are also included as a separate category of individuals with barriers to employment in the Joint Performance ICR because it is a characteristic of participants that is a required factor for purposes of the Statistical Adjustment Model. The Departments have determined that “cultural barriers” is a sufficiently broad term such that it permits self-identification. With regard to the comment on whether it is a cultural barrier that SSI/SSDI recipients have a perception that employment could affect their health benefits, the Departments do not believe that an individual’s perception about receipt of SSI or SSDI benefits is based on a cultural belief, but rather on a lack of information about the effect of employment on health benefits. No changes have been made in response to these comments.
Joint PIRL -- Data Element 808 (Migrant and Seasonal Farmworker Status):
One DOL commenter noted that previously, under WIA, NFJP grantees were able to extend the eligibility period for individuals who were unable to work due to circumstances such as incarceration beyond the most recent 24 months. The proposed wording of this PIRL data element would disqualify those individuals.
Agencies’ Response, June 2016: This data element reflects the WIOA statutory definition of seasonal farmworkers and migrant farmworkers (sec 167(i)(2) and 167(i)(3)). No change has been made in response to this comment.
Joint PIRL -- Data element 1618 (Retention with the same employer in the 2nd Quarter and the 4th Quarter):
One commenter stated that the specification for this data element needs to take into account the fact that participants, not infrequently, have multiple employers.
Agencies’ Response, June 2016: The Departments agree with this comment and will address this issue in subsequent guidance.
Joint PIRL -- Funding Stream Coding Issues:
Two commenters raised several concerns with respect to the funding codes:
VR should be “VOCATIONAL REHABILITATION > 0 and < 9”, not “> 0 and > 9”.
The proposed Joint PIRL includes separate Local Workforce Investment Area (LWIA) WIOA Funding Stream Coding for the AEFLA, VR, and Wagner-Peyser programs, which is not consistent with the Departments’ response to a comment following the 60-day FRN on the Joint Performance ICR, in which the Departments stated that they were going to require only the title I programs to be included at this particular data element because such decision was consistent with the requirements of WIOA §116.
The LWIA funding codes for WIOA Adult, Dislocated Worker, and Youth only count Participants funded with Local Formula funding. This ignores the fact that a title I participant can be served with both Local Formula and Statewide funding.
Appendix C refers to “LWIA WIOA Funding”. To remain consistent with WIOA language, shouldn’t this read LWDA?
Agencies’ Response, June 2016: We agree with the comment regarding the VR coding error. This will be revised in the final version of the Joint PIRL by replacing the second “>” with “<” before 9. We also agree with the commenter regarding the inadvertent inclusion of non-title I programs. We will remove the AEFLA, VR, and Wagner-Peyser programs from PIRL 108. With respect to the comment regarding a title I participant being served with both Statewide and local funding, in such circumstances, the program must input the local area code in field 108. The commenter is correct that WIOA refers to Local Workforce Development Areas (LWDA), not Local Workforce Investment Areas (LWIA), as was used under WIA. The Funding Streams section of Appendix C WIOA Statewide and Local Performance Report Template and Specs will be updated to match the WIOA language. The Departments have revised the title heading of the “LWIA WIOA Funding Stream Definitions” table to “LWDA WIOA Funding Stream”.
Joint PIRL -- When a Participant is served by more than one Local Board:
One commenter identified a problem with having multiple Local Boards serving a participant during the same period of participation, because different Boards use funds from different programs to cover the costs of providing the participants’ services. The commenter recommended that States use a single PIRL row and identify multiple Boards.
Agencies’ Response, June 2016: In an effort to minimize reporting burden, the Departments decline to do as the commenter recommends. Instead, the annual report template will capture only one record input per participation period. However, to resolve similar comments received pursuant to the 60-day FRN, the Departments divided PIRL 108 into three sections. With that change, three separate serving Boards can be input into the PIRL. No further change has been made in response to this comment.
Annual Report Template -- Miscellaneous:
Many commenters expressed concern about the current Program Performance Report template. Specifically, one commenter noted that it does not include participants under 16 years of age. The commenter questioned how this omission is consistent with the title I Youth program that serves in-school youth aged 14 and 15, and asked whether the title I Youth program would be held accountable for the services provided to these youth. Another commenter suggested that the earnings measure for youth should exclude youth who are enrolled in postsecondary education or training; otherwise, postsecondary enrollment (a good thing) would suppress the earnings outcome by reducing hours of work and, as the Departments acknowledge (Appendix A, p. 5), the youth program would have a disincentive to encourage and enable youth to continue on into postsecondary education or training.
Another commenter asserted that subpopulations of participants should be included in the report template, such as participants: (1) who have limited English proficiency and less than a high school diploma or GED equivalency; (2) who have limited English proficiency only; and (3) who are limited only in foundation skills/literacy.
One commenter suggested specific changes to various elements of the template, as well as some additional elements. For example, the commenter proposed using the number of years of schooling received for purposes of the credential attainment performance indicator, rather than the degree attained, for better statistical analysis. Furthermore, the commenter recommended that the reporting of race should conform to the format developed in the Census Bureau Alternative Questionnaire Experiment (AQE). The commenter also recommended that a new data element be added to the template to capture specific types of occupational skills licensure or certification.
A commenter sought guidance with respect to how non-significant career services are tracked for individuals who are not yet participants, since data elements are collected only for participants. The commenter questioned how it could be permissible for a program to expend Federal funds on follow-up activities when the participant has already exited the program. This commenter also sought clarification on how States should track participants who receive only career or training services versus those participants who receive both career and training services. This commenter also noted that the specifications for measures in each appendix do not match, particularly with respect to exclusions within measures.
A commenter suggested that the Participant and Program Reporting formats be modified to specify level of oral English-language ability, using the English-language proficiency scale in the American Community Survey question 14c (i.e., speaks English: very well, well, not well, not at all) and to use the literacy data element to reference three key domains of English-language proficiency (reading, writing, and numeracy).
Agencies’ Response, June 2016: The WIOA Reporting Template found in Appendix C - WIOA Statewide and Local Performance Report Template and Specs contains a section on participants under the age of 16. This is the “<16” row found in the “Age” section of the report template.
In order to minimize the reporting burden, the breakout categories included on the WIOA Statewide annual report are those specifically required by statute. However, the DOL-administered programs will have individual records available for ad hoc analysis. The Departments do not believe that there is statutory authority to exclude, by regulation, youth from the earnings indicator because to do so would be inconsistent with specific requirements in section 116 of WIOA. We want to make clear that other non-youth participants will also be included regardless of school status, or hours worked per week.
The Departments note that the data element number for this element (Low levels of literacy/basic skills deficient at program entry) has been renumbered from 704 to 804, because 704 had previously been mistakenly used for two data elements (“Single Parent” and “Low Levels of Literacy”). This has been corrected; however, the discussion of this data element in Appendix C referred to this element by the previous numbering as 704. In response to the commenter’s suggestion that this element should be listed within the Equal Employment Opportunity series of data elements, the Departments believe element 804 makes logical sense in the “cluster” of elements as proposed. To minimize the reporting burden, the Departments recommend keeping the ELL “yes/no” field. If the participant has any level of difficulty, they should be coded “yes”.
States will not be required to report data on receipt of career services and training services by reportable individuals.
The tracking and reporting of cost information will be addressed in future guidance.
Appendix B is no longer necessary because all of the report specifications are incorporated in Appendix C. Appendix B will be removed from the final approved Joint Performance ICR.
DOL-Only PIRL – General:
One commenter noted that there are no definitions for the DOL-only PIRL column header titles; therefore, it is not clear which program each column represents. The commenter recommended providing a detailed definition or explanation for each.
Two commenters noted that Section E.08 of ETA-9172 added a significant data collection burden that will apply to the core programs.
Another commenter expressed concern about a data element (#904) relevant to the Dislocated Worker program because the timeframe is undefined. The commenter provided different options for how the timeframe could be interpreted: “ever” or “current POP to end of reporting period” or “Year ending with current reporting quarter” or “current PY to current reporting quarter.” The commenter raised similar concerns about the undefined timeframe for the “training provided” series (#1300-#1312). The commenter also noted that these data elements permitted three different training engagements, but only collected data on one of them. Furthermore, the “training services provided” data elements (#1302, #1307, and #1312) have missing codes.
Agencies’ Response, June 2016: The column headers in the PIRL represent the DOL-funded programs. Each program is responsible for collecting all of the data elements marked with an “X” under that program’s column.
The data elements collected in section E.08 are part of the DOL-only ICR and are not applicable to the AEFLA and VR programs. The Departments carefully considered the burden on agencies and the statutory and program management requirements when determining which data elements to include in the Joint Performance ICR.
We agree that there are three different types of engagement, but we are only collecting data related to those participants who are receiving training or education services from an ETP. With respect to other specific DOL-only PIRL issues, the Departments will provide further guidance in the near future. No change has been made in response to these comments.
Wage Record Matching – WRIS and WRIS II:
One commenter appreciated that the Departments intend to renegotiate the WRIS and WRIS II agreements at the Federal level, but stated that States will need significant lead time to set up their data sharing agreements and data exchange processes. For example, VR agencies currently do not report on retention measures and, therefore, will need time to set up the agreements and information exchange processes with their respective UI agencies to incorporate State, WRIS/WRIS II, and Federal wage information into their case management systems.
Agencies’ Response, June 2016: WRIS is not covered or discussed in this ICR. However, the Departments plan to continue to involve the States in the data sharing agreement process. No change has been made in response to this comment.
Wage Record Matching – Supplemental Wage Information:
A few commenters expressed concerns regarding the use of supplemental wage information for purposes of gathering the data necessary for the employment-related primary indicators of performance, and the burden that could be imposed by having to engage in follow-up activities. At least one commenter strongly supported the Departments’ proposal to allow States, local areas, ETPs, and other program operators to use supplemental data sources to track performance outcomes for all participants who are not found in wage records, stating that this provides program operators with the flexibility needed to provide services and obtain performance outcome data for these participants. However, another commenter asserted that allowing the use of supplemental wage information would be a “major step backward from the progress that was made during the Workforce Investment Act in using wage and other administrative records as the standard data source for employment and earnings measures.” The commenter added that allowing the use of supplemental wage information would be a “step away from the substantial investment” the Federal government and States have made in linking administrative records.
Agencies’ Response, June 2016: The Departments have retained their earlier decision to allow the use of supplemental data sources to determine performance outcomes, and this PIRL element remains as proposed.
The Departments agree that the use of administrative wage records is critical to implementation of the performance provisions of WIOA. However, the number of participants who are missing the data elements needed for a wage record match, such as an SSN, or whose wage data are missing could be significant in some cases. Completely excluding these participants from program accountability measures would result in a significant gap in performance data and would not be aligned with WIOA’s requirement to obtain outcomes on all participants. The policy allowing States to voluntarily use supplemental data to report employment and earnings data is intended to provide an alternative for States that may have significant numbers of participants who would not be covered by an administrative wage record match. The Departments will issue guidance to ensure standardized and rigorous procedures are used when reporting supplemental data.
No change has been made in response to these comments.
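The fallback policy described in this response can be sketched as follows. This is a minimal illustration, not an official specification: it assumes a State first attempts an administrative wage record match and turns to supplemental wage information only when no match is possible, rather than excluding the participant. All names and data structures here are hypothetical.

```python
def employment_outcome(participant: dict, wage_records: dict, supplemental: dict) -> dict:
    """Determine a participant's employment outcome data source.

    Prefer an administrative wage record match; fall back to
    supplemental wage information only when no match is possible.
    """
    ssn = participant.get("ssn")
    if ssn and ssn in wage_records:
        # Administrative wage records remain the primary data source.
        return {"source": "wage_record", "wages": wage_records[ssn]}
    # No SSN or no match: use supplemental wage information if the
    # State collected it, rather than excluding the participant.
    pid = participant["id"]
    if pid in supplemental:
        return {"source": "supplemental", "wages": supplemental[pid]}
    return {"source": "none", "wages": None}
```

The design point is that supplemental data widen coverage without displacing wage records: a participant found in the administrative match never draws on supplemental data.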
Performance Workgroups – Ensure Transparency and Input:
One commenter noted that the Departments have established two workgroups for stakeholders to provide input about the challenges faced during implementation of WIOA’s requirements. The commenter requested that the Departments ensure the activities and recommendations of the workgroups are transparent to the States and that all key stakeholders have the opportunity to provide input into the process. The commenter also recommended that membership on these workgroups not be limited to State representatives, but rather should include one-stop system partners, including community-based organizations with direct experience serving individuals with barriers to employment, such as those who are limited English proficient, have low literacy, or are basic skills deficient.
Agencies’ Response, June 2016: The Departments established State-level workgroups to obtain feedback from States on technical assistance matters. The workgroups do not provide recommendations to the Departments. Furthermore, the groups are composed of State representatives only; because the grantee relationship is with the Department, the workgroups do not constitute a Federal Advisory Committee.
The Departments may consider establishing a more broad-based performance accountability team to work through implementation issues in the future.
Exhausting TANF Within 2 Years:
One commenter stated that TANF’s potential two-year exhaustion may not be an adequate proxy data element for the “welfare dependency” characteristic required for the statistical adjustment model under section 116 of WIOA.
Agencies’ Response, June 2016: PIRL 601, Exhausting TANF Within 2 Years, records whether a person is an “individual with a barrier to employment” because the individual is “within 2 years of exhausting lifetime eligibility under part A of title IV of the Social Security Act (42 U.S.C. 601 et seq.),” as defined in WIOA sec. 3(24)(K). No change has been made in response to this comment.
Reporting Consistency – Test Publisher’s Guidelines:
Pursuant to page 3 of Appendix A, one commenter noted that WIOA does not require programs to follow test publisher’s guidelines. The commenter expressed concern that this could lead to inconsistent reporting across programs. The commenter recommended that each program be required to follow the test publisher’s guidelines when administering a test.
Agencies’ Response, June 2016: All programs reporting under section 116 will use common definitions and procedures for reporting educational functioning level, including how tests are administered in accordance with test publisher guidelines. Joint performance guidelines will clarify this requirement. No change has been made in response to this comment.
One commenter sought clarification regarding several different issues related to the title I Youth program.
Definition of “high poverty area” – The commenter requested the Departments provide a definition for the term, and asked whether a Federal “Promise Zone” automatically constitutes a “high poverty area.”
Cultural barriers at Program Entry – The commenter questioned whether “cultural barriers at program entry” is a subset of the “Individual who requires additional assistance to complete an educational program or to secure/hold employment” eligibility criterion. In addition, the commenter questioned whether the applicant would self-certify to cultural barriers at program entry.
Definition of “homeless youth” – The commenter supported the expansion of the definition of “homeless youth.” However, the commenter questioned whether service providers must go back and reclassify youth as being homeless under the expanded definition in those situations when the youth did not meet the prior definition of “homeless.”
Incarcerated youth – The commenter questioned whether incarcerated youth would be excluded from all performance measures under WIOA, as they were under WIA. However, the commenter noted that it appears as though incarcerated youth would fall within the scope of the measurable skill gain indicator. The commenter sought clarification on which, if any, of the performance indicators are applicable to incarcerated youth.
Agencies’ Response, June 2016: The definition of “high poverty area” is beyond the scope of the Joint Performance ICR; however, 20 CFR 681.260 of the DOL-only Final Rule implementing titles I and III of WIOA will include a definition of a high poverty area.
“Cultural barrier” is a separate data element and is not tied to eligibility criteria for the title I Youth program. Yes, participants will be able to self-identify for purposes of this data element.
Local service providers are not required to go back and reclassify youth who now satisfy the expanded definition of a “homeless youth.”
WIOA Youth program participants in programs that serve incarcerated individuals under section 225 of WIOA would be included in calculations for the measurable skill gains indicator, but would not be included in calculations for any of the other performance indicators.
No change has been made in response to these comments.
DOL-Only -- INA -- Miscellaneous:
A commenter expressed concerns and sought clarification regarding various issues related to the INA program.
The commenter asked which reports will replace the ETA 9085.
The Department’s analysis for the tribal youth program (Appendix A, pg. 73) indicates that a majority of the data elements proposed in this ICR are currently being collected by the INA program. However, in reviewing the new ICR and comparing it to the current INA tribal youth reporting system, the commenter found that fewer than 25 percent (20 of 85) of the data elements proposed in the ICR PIRL are currently being collected for youth. Additionally, the proposed changes will require individual-level reporting, which is new to the youth program, will require significant changes in data reporting, and will be costly to implement. Therefore, the commenter requested additional funds to ensure compliance by the Youth programs.
The commenter requested clarification of the purpose for collecting 10 additional miscellaneous data elements in Section E.08 for INA, which appears to the commenter to be an error.
The commenter stated that levels of performance targets are a concern because limited, and in some cases no, statistical data are available that capture economic circumstances for American Indians and various tribal areas. Furthermore, the Bureau of Labor Statistics lacks data specific to this unique and special population. The commenter hopes there will be meaningful dialogue and strategic planning in setting INA levels of performance in accordance with the goals and intent of WIOA Section 166.
The commenter looks forward to working with the Department, in consultation with NAETC and its expert workgroup members, in addressing mutual commitments.
Agencies’ Response, June 2016: The Department of Labor will be using the ETA-9173 report as a general “roll-up” report for all ETA programs. However, the Department will have the ability to generate reports specific to the INA program and also generate reports using various data elements in the PIRL that are being collected by the INA program.
The Department of Labor recognizes that small grantees will not have the expertise and resources to develop a system that collects and transmits individual participant records to the Department. Therefore, the Department will provide, and is currently providing, resources and expertise to develop a management information system for the Native American program.
When the Department of Labor assessed the increase in data elements for the INA program, it viewed the data collection as a combination of adult and youth data elements. When the programs are viewed separately, there is a significant increase in new data elements for the youth program; however, some of the current data elements relating to the youth performance measures will no longer be needed, which offsets some of the reporting burden of collecting new data elements.
The Department recognizes that labor market information is limited on American Indian reservations and may not be as accurate as information for non-reservation areas. However, the statistical adjustment model will also factor in the characteristics of participants served by the grantee and, therefore, is not totally dependent on labor market information. Nevertheless, the limitation of labor market information on American Indian reservations increases the importance of consulting with the Native American Employment and Training Council (NAETC) when establishing performance targets for grantees, and it is the Department’s intent to establish fair and reasonably attainable levels of performance for all INA grantees.
The Department is committed to working with the INA grantee community and the NAETC in all of these areas.
DOL-Only Reporting -- Section B – One-Stop Participation Information:
One commenter stated that, because NFJP grantees operate their own case management and data management systems, they can only reasonably be expected to report participation in other WIOA programs for individuals for whom they arrange co-enrollment. The commenter added that there is no consistency among one-stop operators from service area to service area or State to State in the amount of cooperation and data sharing that States are willing or legally able to undertake with non-State agencies.
Agencies’ Response, June 2016: In cases where a grantee does not arrange co-enrollment of a participant with another WIOA program, data can be collected from a participant via self-attestation or source documentation provided by the participant.
No change has been made in response to this comment.
Author: Shelia F. Lewis