Green Jobs Site Visits Emergency Supporting Statement - Part A - 12 27 2011


Site Visit Data Collection Request for ARRA-funded Grants; Job Training Evaluations

OMB: 1205-0486







Emergency submission; site visit data collection request for ARRA-funded grants; job training evaluations





Part A: Supporting Statement for Paperwork Reduction Act Submission: Site Visit Data Collection

The U.S. Department of Labor’s Employment and Training Administration (ETA) is seeking Office of Management and Budget (OMB) emergency approval to collect site visit data from organizations that received grants under four Solicitations for Grant Applications (SGAs) issued under the American Recovery and Reinvestment Act (ARRA): Pathways Out of Poverty (POP), Energy Training Partnership (ETP), State Energy Sector Partnership (SESP), and the Health Care and Other High Growth and Emerging Industries Training grant initiative. POP, ETP, and SESP are all Green Jobs training programs. The overall aim of these evaluations is to determine the extent to which enrollees achieve increases in employment, earnings, and career advancement as a result of their participation in the training provided and to identify promising practices and strategies for replication. While the full evaluations involve several data collection efforts, including surveys, ETA seeks emergency clearance at this time only for the site visit data collection. The request for clearance for the surveys will be submitted at a later date using the regular clearance process.

Process Study Site Visits

Implementation Evaluation:

For the implementation evaluation, one round of site visits will be conducted with 36 grantees. The information collected through in-depth interviews during the site visits will provide important contextual information on the effectiveness of the grants in different environments. It will also help ETA assess whether the program is particularly effective in certain types of communities, with specific populations, and in certain environments.


Impact Evaluation:

This research activity involves conducting two rounds of site visits to the four grantees in the impact evaluation to document the program environment; participant flow through random assignment and program services; the nature and content of the training provided; the control group environment; and grantee perspectives on implementation challenges and intervention effects.

During the visits, site teams will interview key administrators and staff (including program partners and employers) using a semi-structured interview guide, and will hold focus groups with participants (first round only). The interview guides are presented in Appendix A.



Request for Emergency Clearance for Site Visit Data Collection: Emergency clearance is necessary for this site visit data collection for several reasons. The POP and ETP grants expire at the end of January 2012; consequently, data must be collected from the POP and ETP grant sites included in both studies while the grants are still in operation. Under the normal Paperwork Reduction Act clearance procedures, ETA would have too little time to study POP and ETP grant operations before the grants expire. Losing the POP and ETP grants would also mean the studies would lose their ability to look at green jobs: the POP grants are the only green jobs training grants available for the impact evaluation, and only one POP site met the selection criteria for the impact study. There is also keen interest in learning more about the POP grants in general; site visits to eleven POP grantees are included in the implementation study.


Failure to collect site visit data in a timely manner will undermine the ability to conduct rigorous evaluations of these grants. Site visits are the only way the research team can observe the training programs in operation and collect real-time data that complements the findings from other documentation. Without a rigorous evaluation, no information will be available on the potential of training for green jobs as a strategy for reducing poverty or increasing employment. Finally, approximately $9 million of ARRA funds are dedicated to evaluating these grants. Failure to conduct a rigorous evaluation with appropriate operational information gathered through site visits will make these evaluations less informative and therefore less useful in guiding future policy initiatives. Conducting these evaluations without appropriate operational data collection would waste the taxpayer stimulus dollars currently dedicated to them, and forgoing the opportunity to learn about these innovative grants, in which over $600 million has been invested, would be a misuse of dedicated ARRA funding.


In summary, delaying the site visits will make it impossible to record the operational information on the green jobs training programs needed to interpret the impacts of training in green and other emerging, high-growth occupations on participants’ earnings.


A. Justification

1. Circumstances Necessitating the Site Visit Data Collection

As part of the comprehensive economic stimulus package funded under the 2009 ARRA, DOL funded a series of grant initiatives to promote training and employment in selected high-growth sectors of the economy. Individuals facing significant barriers to employment, as well as those recently displaced by the economic downturn, are the high-priority labor pools targeted by these ARRA initiatives. High-growth and emerging industries are emphasized as part of ARRA’s focus on labor demand, with particular attention to emerging “green” sectors of the economy and pressing skill shortages in health care fields. These grant programs are consistent with ETA’s emphasis on more “customized” or “sector-based” labor market solutions, in which job seekers (including incumbent workers) facing significant barriers to economic self-sufficiency become a resource to targeted growth sectors facing skill shortages or anticipating hiring needs.

ARRA’s focus on the needs of high-growth and emerging industries to hire additional workers comes at a critical time. During periods of both recession and expansion, it is important that employers remain attentive to the challenge of building and maintaining a productive workforce to ensure their long-term competitiveness. This applies particularly in industries such as health care, education, and energy, in which the Bureau of Labor Statistics projects significant job growth over an extended period (Bureau of Labor Statistics 2010). However, several factors, including declines in educational attainment among American workers, an aging skilled workforce whose retiring members must be replaced, and continued immigration, are affecting workforce skill levels and employers’ ability to remain competitive and increase productivity (Dohm and Shniper 2007). Training programs like those funded by ARRA are designed either to provide these skills or to begin an entry-level career path toward acquiring them.

ETA’s grant programs represent an important step toward increasing postsecondary education and training in high-growth areas, particularly health and green jobs. They provide needed resources for training, encourage partnerships between different service delivery systems, feature strong employer involvement, and focus on the provision of innovative and promising training strategies. To learn about the impacts of this significant investment in training programs, ETA has funded a rigorous impact evaluation using a random assignment research design as well as a comprehensive implementation evaluation.

Previous research in the training field has provided insight into the educational and economic effects of training on participants. However, many of the studies did not use random assignment, which leaves them open to concerns about selection bias and makes it difficult to determine what outcomes would have been in the absence of the training services. To assess the impacts of these training programs effectively, a rigorous design and implementation of random assignment are required. The comprehensive implementation evaluation is aimed at clarifying the net impact findings with contextual knowledge.

Overview of the Implementation Evaluation

The primary objectives of the Green Jobs and Healthcare Implementation Study are to:

  • Understand in-depth the implementation of the 152 grants that were awarded under the Recovery Act

  • Explore the extent to which grantees employed promising practices that could possibly be replicated and scaled in future programs

  • Evaluate whether deployment of these promising practices by grantees is associated with positive participant outcomes in employment and earnings.


To address these objectives, a set of research areas was defined, covering: 1) the economic and community context in which each program operated; 2) the service delivery strategy and components of the program; 3) partnerships with employers and other organizations; 4) program management, funding, and sustainability; and 5) program outcomes. Sources of data for the study include site visits to 36 grantees out of the 152 total.


Overview of the Impact Evaluation

The overriding goals of this evaluation are to determine the extent to which enrollees achieve increases in employment, earnings, and career advancement as a result of their participation in training provided by the Green Jobs and Health Care grantees and to identify promising practices and strategies for replication. The impact study will use an experimental design involving random assignment to measure the impact of the program, as well as a process study to examine implementation and operations. The random assignment study will be conducted in four grantee programs. ETA will select grantees based primarily on the perceived strength and scale of their intervention. Therefore, we will not supply estimates of the impact of the grant programs as a whole, but rather will provide results on interventions operated by selected grantees.

The evaluation will address the following research questions:

  • What is the impact of the programs on the receipt of education and training services, in terms of both the number who receive these services and the total hours of training received?

  • What is the impact of the programs on the completion of educational programs and the receipt of certificates and credentials from the training?

  • What is the impact of the program on the employment levels, earnings, and career advancement of participants?

  • To what extent do the programs result in any employment (regardless of sector)? To what extent do the programs result in employment in the specified sector in which the training was focused?

  • What features of the programs are associated with positive impacts, particularly in terms of target group, curricula and course design, and additional supports?

  • What are the lessons for future programs and practices?

For this evaluation, the treatment condition is defined as having the opportunity to enroll in training funded by either the Green Jobs or the Health Care grants. The treatment condition will vary from site to site depending on the grantees selected for the evaluation and the nature and context of the training programs those organizations choose to implement with their grant funds. The control condition, or counterfactual, is defined as not having the opportunity to enroll in training funded by Green Jobs or Health Care grants. However, control group members will not be prevented from enrolling in other locally available training programs or services in the community. We recognize that some people assigned to the control group will find opportunities to receive some form of training. This configuration—a comparison of access to the focal program’s services to other services in the community—is a common design for random assignment studies of training programs. It is also one that answers the relevant policy question: Does adding the program services funded by the Pathways and Health Care grants to the configuration of training services already available in the community improve participant outcomes?

At each selected impact study site, individuals will be randomly assigned to a treatment or control group. A total of 4,000 sample members will be selected overall, with target sample size totals varying by site, as shown in Table A.1.

Table A.1. Sample Sizes for the GJ-HC Impact Evaluation



Site     Treatment Group Members   Control Group Members   Total Sample
Site 1   600                       600                     1,200
Site 2   575                       475                     1,050
Site 3   600                       300                     900
Site 4   425                       425                     850
Total    2,200                     1,800                   4,000
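To make the site-level targets concrete, the following is a minimal Python sketch of per-site random assignment under a fixed allocation matching Table A.1. The function name, the seeding, and the fixed-allocation approach are illustrative assumptions; this document does not specify the mechanics of the evaluation’s actual assignment procedure.

    import random

    # Target sample sizes from Table A.1: (treatment, control) per site.
    TARGETS = {
        "Site 1": (600, 600),
        "Site 2": (575, 475),
        "Site 3": (600, 300),  # a 2:1 treatment-to-control ratio
        "Site 4": (425, 425),
    }

    def assign_site(site, applicant_ids, seed=0):
        """Randomly assign a site's applicants to treatment or control,
        honoring the site's fixed target counts."""
        t_target, c_target = TARGETS[site]
        if len(applicant_ids) != t_target + c_target:
            raise ValueError("applicant pool must equal the site's total target")
        pool = list(applicant_ids)
        random.Random(seed).shuffle(pool)
        treatment = set(pool[:t_target])
        return {a: ("treatment" if a in treatment else "control") for a in pool}

    # Example: Site 3 enrolls 900 applicants; 600 treatment, 300 control.
    statuses = assign_site("Site 3", list(range(900)))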





Overview of Data Collection

Addressing the research questions adequately requires collecting detailed data from multiple sources across multiple points in time.

Site Visits for the Implementation Study.

The ARRA funds also provided for conducting research in the energy efficiency and renewable energy industries. This Green Jobs and Healthcare Implementation Study will address research in six key areas: (1) Program Context, (2) Program Components and Service Delivery Strategy, (3) Partnerships, (4) Program Management, Funding, and Sustainability, (5) Program Outcomes, and (6) Program Replicability and Lessons Learned. Table A.2 summarizes the domains to be studied within the six key areas. Protocols may be found in Appendix A.


Table A.2. Research Areas and Domains


Research Area 1: Program Context
  • Program Context
  • Grant Timeframe
  • Program Structure
  • Policy Context

Research Area 2: Program Components and Service Delivery Strategy
  • Program Components
  • Assessment and Case Management Services
  • Participant Recruitment and Targeting
  • Support Services
  • Training Design/Delivery

Research Area 3: Partnerships
  • Partner Relationships and Selection
  • Partner Roles and Responsibilities

Research Area 4: Program Management, Funding, and Sustainability
  • Leveraged Resources
  • Use of Program Data
  • Sustainability
  • Funding and Program Administration

Research Area 5: Program Outcomes
  • Outcomes Achieved
  • Career Pathways and Certifications
  • Linkage Between Grant Program Practices and Outcomes

Research Area 6: Program Replicability and Lessons Learned
  • Program Replicability
  • Lessons Learned




To address these six research areas, data will be collected from in-depth interviews during site visits to the grantees. In-depth interviews will be conducted with grantee project directors and the following grantee partner organizations: education/training providers, workforce partners (Workforce Investment Board or One-Stop Career Center), employer and business partners (including labor unions), and support service partners (case management and outreach). These interviews are needed to gather first-hand information about grant design, implementation, operation, and outcomes.


The research team will conduct one round of site visits to the selected sites from each of the four SGAs during the study period. These site visits will allow for the collection of in-depth program design and implementation information through 1) semi-structured interviews with grant administrators, program staff, and partner organizations; 2) observation of grant activities; and 3) collection of program documentation. Focus groups will also be conducted during site visits.


Site Selection for the Implementation Study


Selecting a meaningful and representative sample of sites from each SGA is a critical design component of this study. Across the four SGAs, there are a total of 152 grantees. The research team plans to conduct site visits at 36 of these grantees. One approach to site selection would have been to randomly select these 36 sites. However, doing so could result in a skewed sample that over-represents or under-represents the 152 grantees along some dimension. For example, the 36 randomly selected sites might, by chance, contain very few grantees (possibly even zero) from one of the SGAs. To address this concern, we adopted a methodology that included several steps.


To choose the 36 sites to visit, a matrix was developed that ranked the 152 grantees on five categories based on guidance received from DOL/ETA. These five categories are listed below in descending order of importance:

  • Industry – Nursing/Allied Health, Other Healthcare/Emerging Industry, Renewable Electric Power, Energy Efficiency and Green Construction, and Other Green Industry

  • Grantee Organization Type – Community-Based Organization, Union, Education, Workforce Investment Board, and State Workforce Agency

  • SGA Type – SESP, HHG, POP, or ETP

  • Grantee Level – Local, Regional, State, or National

  • Geographic Diversity – the six DOL regions.


Once information was culled for each of the 152 grantees, the programs were sorted into three groups. The first group consisted of 36 grantees that represent the four SGAs and the six regions of the country. These grantees are diverse in the types of industries they train individuals for and in their organization types; they tended to train individuals in multiple occupations and had a good diversity of partners on their grant teams. The second group of grantees generally had good representation in the number of occupations individuals were trained for, as well as in the partners in their organizations, but they represent less diverse areas or had fewer partners or industries than the 36 grantees in the first group. The remaining grantees generally had the smallest number of industries and grantee organizations in their programs and represented less diverse areas.
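As an illustration of the kind of stratified selection described above, the following minimal Python sketch round-robins across SGA-by-region cells, taking each cell’s most diverse remaining grantee first so that every SGA and region is represented. The field names and the diversity score are hypothetical; the actual ranking matrix and weighting are not reproduced in this document.

    from dataclasses import dataclass

    @dataclass
    class Grantee:
        name: str
        sga: str            # "SESP", "HHG", "POP", or "ETP"
        region: str         # one of the six DOL regions
        n_occupations: int  # occupations trained for
        n_partners: int     # partners on the grant team

    def diversity_score(g):
        # Hypothetical scoring: richer occupation/partner mixes rank first.
        return (g.n_occupations, g.n_partners)

    def pick_visit_sample(grantees, k=36):
        """Round-robin across SGA-by-region cells so every SGA and
        region is represented among the k sites chosen for visits."""
        cells = {}
        for g in grantees:
            cells.setdefault((g.sga, g.region), []).append(g)
        for members in cells.values():
            members.sort(key=diversity_score, reverse=True)
        chosen = []
        while len(chosen) < k and any(cells.values()):
            for key in list(cells):
                if cells[key] and len(chosen) < k:
                    chosen.append(cells[key].pop(0))
        return chosen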



Site Visits for the Impact Study.

A rigorous random assignment evaluation requires clear and specific documentation of the services provided to treatment group members in each of the grantee sites and the services available to control group members. This qualitative information will enable the evaluation team to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication. The process study site visits will include semi-structured interviews and focus group discussions with various program stakeholders. Potential respondents and discussion topics are listed below.

  • Interviews with administrators and staff (including instructors and counselors) at each site will document the program services provided to the treatment group. These interviews will collect detailed information on a full range of process study topics including: economic and programmatic context; program and organizational structure; programmatic priorities; recruitment; service delivery systems; key partners; linkages with employers; nature and content of training and support services; funding resources; and the sustainability of the grant program after the grant period. Our overriding aim is to gain a complete understanding of the range of factors (programmatic, institutional, and economic) that serve to facilitate or inhibit the successful implementation and operation of the program. These interviews will also allow us to identify and obtain information on other programs and services that may be available in the communities in which grant services are offered.

  • Interviews with key program partners (e.g., One-Stop Career Centers, TANF agencies, community colleges) will help us understand the historical aspects of the partnership, the current relationships among different collaborating organizations, and the range of services provided. We also hope to interview partners that may provide services to control group members.

  • Interviews with employers (two to three key employers from relevant sectors) will help us understand the extent to which critical “demand side” considerations have been integrated into the program model. The interviews will include discussions of employers’ roles in the planning process, their roles in program and curricula design, and their experiences with placement, hiring, and post-program employment of participants.

  • Focus group interviews with treatment group students will be important to understanding service utilization, reasons why services are or are not successful in achieving their goals, and insights on job advancement or job loss. To supplement the two follow-up surveys of participants, we will conduct informal group discussions with a small number of students in the treatment group as part of the first round of site visits. We will explore what treatment group members hear and know about programs and services; reasons individuals use or do not use program services; particular challenges they may face in attending or completing school; participant knowledge of available resources; and perceptions about the likelihood of career advancement. A protocol for these discussions is presented in Appendix A. For the focus groups, site staff will assist us in identifying approximately eight students who would be available to participate; this will be a convenience sample and is not intended to represent the broader group of participants.


To develop such documentation, a team of two experienced evaluators will visit each site at two points over the course of the evaluation, following a specific, detailed field visit protocol and interview guide. The first round of visits will be conducted immediately after OMB approval is received and will last three days. These visits will focus on documenting the initial implementation of the programs and will include interviews with administrators, staff, partners, and employers as well as focus groups. The second round of site visits will occur 9 to 12 months after the start of random assignment, when programs have reached maturity. For the POP program, there may not be a second opportunity, which makes conducting the first site visit while the grants are still in operation all the more important.

The site visit data collection will follow a standard protocol for conducting semi-structured interviews with selected staff and administrators. Site teams will conduct interviews with individual staff and administrators in a private office or room on-site following established procedures for maintaining strict individual privacy. Notes from the interview will be handwritten or entered onto a laptop computer. After each visit, the field notes will be stripped of any identifying information to guard against any violations of privacy provisions. Notes will be stored in a secure computer or file cabinet at Abt Associates or its partner that can only be accessed by the evaluation team. (See Appendix A for the interview instrument.)

Table A.3. Program Dimensions Examined in Process Study Interviews

Local context: Broad community context in which the program operates and services are delivered
  • Socioeconomic and ethnic profile of the population
  • Unemployment rates, availability of jobs, characteristics of available jobs
  • Range of education and training opportunities in the community
  • Availability of public and financial supports
Key respondents: program administrators; program partners; employers

Organizational structure and institutional partners: Characteristics of the grantees, including organizational characteristics, staffing, and program partners
  • Organizational structure: size, operational structure, funding, history, leadership, linkages with other systems (local workforce boards, community colleges)
  • Staffing: number and roles of staff (planned and actual)
  • Partners: program services offered and delivered, how services are coordinated
Key respondents: program managers; program service delivery staff (teachers, counselors, other professionals); program partners

Program design/operations: Strategies used by the program to deliver curricula/services or organize activities
  • Outreach and recruitment strategies (planned and actual)
  • Assessment and case management
  • Course requirements and schedules
  • Instructional methods and curricula
  • Counseling and other support services
  • Location of services, activities
  • Role of employers
  • Changes over time
Key respondents: program managers and staff; program partners; employers

Service receipt/utilization:
  • Services received by treatment group members; number and type of services received; length of services
  • Other education, job training, and support service programs available to control group members
Key respondents: program managers and staff; participant focus groups

Participant perspective: Factors that affect use/non-use of services
  • How participants heard about services/messaging
  • Challenges/facilitators to using services
Key respondents: participant focus groups

Implementation accomplishments/challenges: Factors that facilitated or impeded the effective delivery of services to participants
Key respondents: program managers and staff; program partners; employers



2. How, by Whom, and for What Purposes Will the Information Be Used?



ETA requests clearance to conduct site visits. The data from the site visits will enable the team to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication.

The site visits for each study are described in detail below, along with how, by whom, and for what purposes the information collection will be used.

Site Visits for the Implementation Study

As site visits are a key component of data collection for this study, the site visit protocols are carefully designed to address the study’s research questions.

Site visit protocols will be used to guide data collection during the two-day site visits. The protocols will serve as general guides to ensure consistency in data collection and complete coverage of topics. Below we provide an overview of the data collection activities to be conducted during site visits, including: 1) semi-structured interviews with program staff and partners, 2) observation of grant activities, and 3) collection of program documentation.

Semi-structured Interviews with Program Staff

A set of interview protocols will be developed for each of the SGAs. Each set of protocols will contain individual discussion guides for each of the major stakeholder groups associated with implementation of the grant program, including:

  • Grantee Project Director/Manager

  • Education/training partners

  • Workforce partner (WIB, One-Stop Partner)

  • Employer and business partners

  • Support service partners.


A sample interview protocol can be found in Appendix A. This protocol represents an interview with the Grantee Project Director/Manager, the most comprehensive of all stakeholder protocols. To capture both the differences and similarities in the structures of the SGAs and the industry sectors/occupations, the discussion guides for each of the stakeholder groups will contain questions addressing similar research questions, as well as sections customized for each SGA. In addition, for multi-site grantees, we will develop guides appropriate for partners at both the national/state level and the local/regional level (e.g., SESPs and the local/regional project teams).



Observation of Program Activities


Research staff will observe grantee program activities when appropriate. With the permission of both the service provider and the grantee participant, researchers may observe the training and/or other directly observable activities associated with the program. Researchers will be trained to record program observations consistently using the observation protocol found in Appendix A.


Documentation Review and Collection of Quantitative Data


The site visit team will gather and review a variety of documents from key stakeholders that will provide relevant data about the sample of grant programs visited in each of the SGAs. Examples of relevant documents include: organizational charts, memoranda of understanding, budgets, data reports, brochures and other outreach materials, and other program materials. To improve the efficiency of the collection and review of this material, the team will contact each site prior to the site visit and request that these materials be gathered and either sent in advance of the site visit or prepared for the site visitors upon arrival. The documentation review checklist found in Appendix A will be used to remind researchers of the materials to collect during their visit.



Site Visits for the Impact Study

The site visits will involve semi-structured interviews with administrators and staff, key program partners, employers, informal group discussions with students in the treatment group, and observations of program activities. The site visits are needed for the following purposes:

  • To describe the program design and operations in each site. Because the program as it is described “on paper” (in the grant application or other materials) may differ from the actual program being tested, researchers will collect data during the site visits that will enable them to describe the programs as implemented.

  • To examine the treatment-control differential to help interpret impact results. Impacts on employment patterns, job characteristics, earnings and other outcomes will presumably be driven by differences in the amount and/or types of training and other services received by members of the treatment and control groups. Because the control group can access training and other services outside the grant-funded program, during the site visits researchers will collect data that will enable them to describe and establish levels of service receipt for both treatment and control group members. For example, researchers will collect information on other sources of similar training (including those within the same institution) and sources of funding for training (e.g., other programs).

  • To identify lessons learned for use in replication efforts. Data collected during the site visits—considered within the context of the impact results—will be the key source for formulating recommendations about how to replicate successful programs or improve upon their results. These data will also be used to identify lessons learned about the relative effectiveness of particular training strategies. While it may not be possible to completely disentangle which factors are driving differences in impacts across sites, to the extent possible, the researchers will identify factors that appear to be linked to success, as well as those that are not.

Description of site visits

Two-person site teams will conduct two rounds of site visits. The teams will schedule their first visit shortly after clearance is received and will spend three days at each site. These visits will focus on documenting the initial implementation of the programs and will include semi-structured interviews with administrators, staff, partners, and employers; group interviews with students in the treatment group; and observations of grantees’ activities. Site teams will conduct the second round of site visits 9 to 12 months after the start of random assignment, when programs have reached maturity; these visits will focus on changes and developments in the provision of services as well as issues regarding the sustainability of the grant program. Given that we will already have a basic understanding of the program and its operation, these visits will be two days in length (versus three days in the first round).

The site visit team will work closely with the sites to arrange the most convenient yet expeditious time to visit their program. The evaluation team will also hold site visitor training for all staff involved in the visits. After each site visit, the data and information collected will be summarized and maintained in site-specific databases.

Site visit team members will use prepared discussion guides to conduct the semi-structured interviews and will be guided by a set of protocols that cover the types of information required to advance our understanding of the training programs. The guide (see Appendix A) provides an outline of key topics of interest with sample questions and probes. The semi-structured interview guide is purposely designed to allow site visitors maximum flexibility in tailoring their discussions to the different perspectives of respondents and the unique circumstances at each site, while still ensuring that all key topic areas of interest are addressed in each site visit. While we will try to capture as much detail as possible, the questions in the discussion guide will remain open-ended in style to give respondents the greatest freedom to answer in their own words.



3. Use of Improved Technology to Reduce Burden

The data collected through site visits will be recorded electronically by the research team and will impose no additional burden on the grantees or participants.

4. Efforts to Identify Duplication

The data to be collected during the site visits are not available from any other source. No other data source provides detailed information on the program context, program services, control group environment, and implementation challenges and successes. The first and second rounds of the site visits will provide different sets of information about program operations, with the first round focusing on grant activities to that point and the second round focusing on changes since the first visit.

5. Methods to Minimize Burden on Small Businesses or Entities

This data collection does not involve small businesses or other small entities.

6. Consequences of Not Collecting the Data

The information collected through the process study site visits will enable the team to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication. The consequence of not collecting data from the field-based implementation and process analysis would be a lack of in-depth information about the nature of the strategies developed and employed at grantee sites to improve the educational, employment, and training outcomes of the students they serve. If the in-depth interviews and focus groups are not conducted, there will be no information regarding the context, design, implementation, operation, outcomes, or replicability and sustainability of the grant programs. Site visits provide an opportunity to fully document the services delivered to treatment group members and, for the impact study, the services potentially available to control group members. This is an essential part of an experimental design for understanding, for example, whether employment outcomes at various points after random assignment are associated with differences in the services received by treatment and control group members. If there are positive net impacts for the treatment group, it will be vital to understand the specific intervention(s) received by treatment group members so that they can potentially be replicated by other employment and training programs.

7. Special Data Collection Circumstances

This data collection effort does not involve any special circumstances.

8. Federal Register Notice and Consultations Outside the Agency

a. Federal Register Notice

The emergency notice soliciting comments on the proposed collection was published in the Federal Register on ________

b. Consultations Outside the Agency

Consultations with experts in the field on the research design, sample design, and data needs are part of the study design phase of the evaluation. The purposes of these consultations are to ensure the technical soundness of the study and the relevance of its findings and to verify the importance, relevance, and accessibility of the information sought in the study.

Peer Review Panel Members Consulted for the Impact Evaluation

1. Maureen Conway, maureen.conway@aspeninstitute.org

2. Harry J. Holzer, hjh4@georgetown.edu

3. Robert J. LaLonde, r-lalonde@uchicago.edu

4. Larry Orr, Larry.Orr.Consulting@gmail.com

5. Burt S. Barnow, barnow@gwu.edu

6. Mindy Feldbaum, mfeldbaum@aed.org

9. Respondent Payments

There are no payments to respondents for the site visit interviews. We plan to pay focus group participants $25 (using a gift card) for attending the focus groups. No payments will be made to any other respondents interviewed for the process study.

10. Confidentiality



Implementation Evaluation:

The administrators, staff, and participants interviewed or participating in data collection activities (e.g., focus groups) by evaluators when on-site will be assured that their responses will be held in privacy to the maximum extent allowed by the law. Individuals will be interviewed separately and in private offices. All findings reported to ETA will be aggregate-level data. Similarly, interview notes and recordings will not be shared with ETA or anyone outside the study team. To preserve privacy, paper copies of interview notes and audio recordings will be secured in a locked file cabinet. If any notes are recorded on laptop computers, such notes will be stored in a SQL Server database located in an access-controlled server room at the evaluation contractor.


All findings in any written materials or briefings delivered to DOL will be presented at the aggregate level, and it will not be possible to link specific responses to individual respondents in any way.

Impact Evaluation:

Abt Associates and Mathematica Policy Research have a strong set of methods in place to ensure the privacy and protection of all data collected from study participants. This includes policies and procedures related to privacy, physical and technical safeguards, and approaches to the treatment of personally identifiable information (PII).

  1. Privacy Policy

Abt and Mathematica are very cognizant of federal, state, and DOL data security requirements. All Abt and Mathematica study staff will comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis. All staff working with PII will sign data security agreements. Abt’s and Mathematica’s security policies meet the legal requirements of the Privacy Act of 1974; the Family Educational Rights and Privacy Act of 1974 (the “Buckley Amendment”); the Freedom of Information Act; and related regulations to assure and maintain the privacy of program participants to the maximum extent allowed by the law.

  2. Privacy Safeguards

Process Study Site Visits. The administrators and staff interviewed by evaluators when on-site will be assured that their responses will be combined with those from other sites for analysis, that they will not be identified individually in any reports, and that interview notes will not be shared with ETA. Individuals will be interviewed separately and in private offices. (See the protocol in Appendix A for the statement that will be used during site visits to assure respondents of privacy.) To preserve privacy, paper copies of interview notes will be secured in a locked file cabinet. If any notes are recorded on laptop computers, such notes will be stored in a SQL Server database located in an access-controlled server room at Abt Associates.


11. Questions of a Sensitive Nature

No sensitive questions will be asked during the site visits.

12. Estimates of Annualized Burden Hours and Costs

The hour burden estimate for the collection of information that is part of this clearance request consists of the site visits for each of the studies.


The Burden Estimates for the Implementation Study:

Estimates of Annualized Burden Hours and Costs


Table A.4 shows the estimated annualized burden hours for the respondents to participate in this study. In-depth interviews with grantee and partner staff will last about one hour each.


Approximately 360 focus group participants will be asked to complete a participant information sheet, complete a consent form, and participate in a focus group discussion. The information sheet and consent form will take about 20 minutes to complete, and the focus group discussions will last about 1.5 hours, for a total of roughly 1.8 hours per participant.


The total burden hours are estimated to be 920 hours.


Table A.4.  Estimated Annualized Burden Hours

Activity                                   Number of     Frequency of   Average Time     Burden
                                           Respondents   Response       per Respondent   Hours

In-Depth Interviews:
  a. Grant Administrator                   36            Once           60 minutes       36
  b. Education/Training Provider Partner   72            Once           60 minutes       72
  c. Workforce Partner                     36            Once           60 minutes       36
  d. Employer/Union Partner                72            Once           60 minutes       72
  e. Support Services/Other Partner        44            Once           60 minutes       44
  Subtotal - Interviews                    260                                           260

Focus Groups:
  a. Participant Information Sheet         360           Once           15 minutes       90
  b. Informed Consent                      (same)        Once           5 minutes        30
  c. Discussion                            (same)        Once           90 minutes       540
  Subtotal - Focus Groups                  360                                           660

TOTAL                                      620           n/a                             920

Table A.5 shows the estimated annualized cost burden based on the respondents' time to participate in the study. The total cost burden is estimated to be $17,259.20. This burden cost is partially offset by the $25 respondent payment for focus group participants.


The average hourly wage in that table for the site visit data collection is $18.76, based on the BLS average hourly earnings of production and nonsupervisory employees on private, service providing, nonfarm payrolls (September 2010 National Industry-Specific Occupational Employment and Wage Estimates, from the U.S. Department of Labor, Bureau of Labor Statistics and available on the department’s website).1
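As a check on these figures, the short Python sketch below reproduces the burden-hour arithmetic of Table A.4 and the annualized cost shown in Table A.5 that follows (variable names are illustrative):

    # Interview burden: respondent counts x one 60-minute interview each.
    interview_respondents = [36, 72, 36, 72, 44]
    interview_hours = sum(interview_respondents)           # 260 hours

    # Focus group burden: 360 participants x (15 + 5 + 90) minutes each.
    participants = 360
    focus_group_hours = participants * (15 + 5 + 90) / 60  # 660 hours

    total_hours = interview_hours + focus_group_hours      # 920 hours
    annualized_cost = total_hours * 18.76                  # $17,259.20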


Table A.5. Estimated Annualized Cost Burden


Data Collection Activity           Total Burden Hours   Average Hourly Wage   Total Annualized Cost
Implementation Study Site Visits   920                  $18.76                $17,259.20


The Burden Estimates for the Impact Evaluation:

We will interview an average of eight administrators and staff in each of the four sites included in the evaluation at two points over the course of the evaluation. In each site, we will also interview an average of three program partners and two employers. Finally, we will conduct one participant focus group with an average of eight students. The estimated response rate is 100 percent because, when arranging the site visits, evaluators will confirm scheduled times for interviewing key administrators and staff and set up the focus group in advance. The estimated response time is an average of 45 minutes for the individual interviews and 90 minutes for the focus group. Total estimated response burden for the site visits is 126 hours (Table A.6).

Table A.6. Burden Estimates for Impact Study Site Visits


Respondents                   Number of Respondents/    Frequency of   Average Time   Burden
                              Instances of Collection   Collection     per Response   (Hours)

Site Visit Data Collection
  Administrators & staff      32                        Twice          45 minutes     48
  Program partners            12                        Twice          45 minutes     18
  Employers                   8                         Twice          45 minutes     12
  Treatment group students    32                        Once           90 minutes     48
Total for site visits         84                        --             --             126



The total annualized cost to staff for the process study visits is presented below in Table A.7. The total estimated costs for these data collection activities are $2,364. The average hourly wage in that table for the site visit data collection is $18.76, based on the BLS average hourly earnings of production and nonsupervisory employees on private, service providing, nonfarm payrolls (September 2010 National Industry-Specific Occupational Employment and Wage Estimates, from the U.S. Department of Labor, Bureau of Labor Statistics and available on the department’s website).2
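The same arithmetic check applies to the impact study figures; the following minimal Python sketch (illustrative names) reproduces the burden hours of Table A.6 and the cost in Table A.7 below:

    # (respondents, rounds of collection, minutes per response)
    rows = [
        ("administrators & staff",    32, 2, 45),
        ("program partners",          12, 2, 45),
        ("employers",                  8, 2, 45),
        ("treatment group students",  32, 1, 90),
    ]
    total_hours = sum(n * freq * minutes / 60 for _, n, freq, minutes in rows)
    # 48 + 18 + 12 + 48 = 126 hours
    total_cost = round(total_hours * 18.76)  # 126 x $18.76 = $2,363.76, ~$2,364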

Table A.7. Total Annualized Cost Estimates for Site Visit Data Collection for the Impact Study


Data Collection Activity    Total Burden Hours   Average Hourly Wage   Total Annualized Cost
Process Study Site Visits   126                  $18.76                $2,364



13. Estimates of Annualized Respondent Capital and Maintenance Costs


The proposed data collection for the on-site visits will not require respondents to purchase equipment or services or to establish new data retrieval mechanisms. There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection. The field-based implementation/process data collection involves semi-structured interviews covering staff and administrators’ descriptions of services and service delivery, and their experiences, opinions, and factual information. Therefore, the only cost to respondents is the time spent being interviewed. These costs are captured in the burden estimates provided in Item 12.


(a) We do not expect any total capital and start-up costs.

(b) We do not expect extensive time spent on generating, maintaining, and disclosing or providing the information.


14. Estimated Annualized Cost to the Federal Government

The cost to the Federal government for carrying out the Green Jobs and Health Care Implementation Study data collection effort is $697,898, which represents the total contractor cost. These costs consist of $586,112 for staff labor, $76,165 for travel, and $35,621 in other direct/indirect costs. Note, however, that these figures are total costs for the entire evaluation, not just for the site visits that are the subject of this submission.

The annual cost to the federal government for the entire Green Jobs and Health Care Impact Evaluation is shown in Table A.8. There are two approaches to annualizing these costs over the next three years: divide the five-year total ($7,992,852) by five, for an average annual cost of $1,598,570; or sum the first three years and divide by three, for an average of $1,389,910. Again, these figures are total costs for the entire evaluation, not just for the site visits that are the subject of this submission.
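A brief Python sketch of the two averaging approaches, using the Table A.8 figures shown below:

    annual_costs = [1_466_492, 1_012_994, 1_690_244, 1_997_998, 1_825_124]
    five_year_avg = sum(annual_costs) / 5        # $7,992,852 / 5 = $1,598,570.40
    first_three_avg = sum(annual_costs[:3]) / 3  # $4,169,730 / 3 = $1,389,910.00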

Table A.8. Annual Costs for GJ-HC Impact Evaluation

Year    Dates       Cost
1       2010-2011   $1,466,492
2       2011-2012   $1,012,994
3       2012-2013   $1,690,244
4       2013-2014   $1,997,998
5       2014-2015   $1,825,124
Total               $7,992,852



15. Changes in Burden

This is a new information collection request.

16. Publication Plans and Project Schedule

Implementation Evaluation:

The project began in October 2010 and will end in October 2012. The project design and interview/survey instruments were prepared in early 2011. The site visits and focus group discussions will begin in January 2012 and end in April 2012.


The tabulations and analyses of this data collection effort will be published in two rounds, in the form of an Interim Report delivered to ETA around the middle of the project, and a Final Report delivered to ETA at the end of the project. Each report will include a draft version that ETA will comment on, followed by a final version addressing the comments from ETA.


The Interim Report was delivered in December 2011 (final version). The Final Report will be delivered July 2012 (draft version) and September 2012 (final version).


The Final Report will include analysis of the site visit and focus group data collected from all 36 grantee sites visited during the course of the study.

Impact Evaluation:

Baseline data collection will begin in late spring 2011 after receipt of OMB approval. A 60-day public comment period, by means of a Federal Register Notice, will occur once this clearance package is submitted to OMB. In addition, two major project reports will be prepared: (1) the interim report, which will draw on 18-month follow-up data to present the key short-run findings of the impact analysis; and (2) the final report, which will use 18- and 36-month follow-up data to present findings on long-run program impacts. At the conclusion of the study, the project will also create a public use data file stripped of personally identifiable information. Table A.9 gives the timeline for the deliverables.

Table A.9. Study Timeline



Time          Activity
Summer 2011   Baseline data collection begins
Winter 2011   First round of process study site visits conducted
Fall 2012     Second round of process study site visits conducted
Winter 2013   Baseline data collection ends; 18-month participant survey begins
Fall 2014     Interim report summary published
Summer 2015   Survey data collection ends; 36-month participant survey begins
Fall 2016     Final report summary published; public use data file available




17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date for OMB approval will be displayed on all forms associated with this data collection.

18. Exception to the Certification Statement

Exception to the certification statement is not requested for the data collection.


1. U.S. Department of Labor, Bureau of Labor Statistics, Table B-8. Average hourly and weekly earnings of production and nonsupervisory employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed September 2010 at http://www.bls.gov/webapps/legacy/cesbtab8.htm).

2. U.S. Department of Labor, Bureau of Labor Statistics, Table B-8. Average hourly and weekly earnings of production and nonsupervisory employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed September 2010 at http://www.bls.gov/webapps/legacy/cesbtab8.htm).

