Impact Evaluation of Upward Bound's Increased Focus on Higher-Risk Students - Baseline Data Collection Protocols

OMB: 1850-0822

U.S. Department of Education

Institute of Education Sciences

National Center for Education Evaluation and Regional Assistance

To: Rachel Potter and Brian Harris-Kojetin

From: Jonathan Jacobson and Marsha Silverberg

Subject: Response to public comments on proposed baseline data collection for “Impact Evaluation of Upward Bound’s Increased Focus on Higher-Risk Students”

Date: 2/6/2007, revised 2/27/2007

Cc: Kathy Axt

We have received and reviewed public comments on the proposed data collection for the “Impact Evaluation of Upward Bound’s Increased Focus on Higher-Risk Students--Baseline Data Collection Protocols” (Federal Register, Vol. 71, No. 243, December 19, 2006, pp. 75952-75953). Because many of the responses were quite similar, we have grouped the questions and comments by category rather than addressing each individually. Our responses follow our restatement of each question or comment.

  1. Several commenters expressed concern that the timing of the Federal Register notice, with a comment period running from late December to late January, was intended to avoid meaningful public comment.

It was never ED’s intention to circumvent or preclude public comment on the design of and data collection for the Upward Bound evaluation. The evaluation activities described in the package for which public comment is now being sought were previously described in ED’s Notice of Proposed Priorities for the Upward Bound grant program (published July 3, 2006 in the Federal Register, Volume 71, Number 127, pp. 37926-37928) and a 60-day comment period was held at that time. Written responses to questions and comments were provided as part of the Notice of Final Priorities (published September 22, 2006 in the Federal Register, Volume 71, Number 184, pp. 55447-55450), and some Upward Bound grantees submitted questions and received responses preceding the grant application stage. Most importantly, before the Baseline Data Collection Protocols were submitted for public comment, staff from ED’s Institute of Education Sciences (IES) overseeing the Upward Bound evaluation and from the evaluation contractor gave presentations at all four regional TRIO conferences held during the fall. These presentations provided opportunities for dialogue with members of the TRIO community regarding the purpose and plans for the evaluation. As a result of these exchanges of views, several aspects of the planned evaluation design and data collection were modified.

  2. Many commenters argued that an extended comment period would be in the public interest.

The timing of the public comment period for the package, which began in late December and ends in late January, reflects a balance between two competing goals. ED’s interest in gathering input on the planned evaluation activities from the TRIO community led us to defer submitting the package until IES and relevant contractor staff could attend the TRIO conferences in the fall, thus delaying the start of the comment period. ED’s desire to alert grantees selected for the evaluation quickly, however, requires a short comment period and speedy approval from the Office of Management and Budget so that grantee selection can begin immediately. The Office of Postsecondary Education has specifically requested that the evaluation proceed according to a timetable that will generate an impact report in 2010, which requires that grantees and students be included in the evaluation starting in 2007.


We are concerned that extending the public comment period for this data collection would have negative consequences for Upward Bound projects by delaying notification of grantees selected to be in the evaluation. As stated in the Notice of Final Priority, each of these grantees will be expected to recruit twice as many eligible new students in project year 2007-2008 as the grantee plans to serve in its project. Until the comment period ends, we cannot give grantees notice of whether and how they will need to prepare in this manner to be included in the evaluation.


  3. A commenter asked whether grantees will be selected for the evaluation before funding is announced.

Our intention is to contact grantees in March 2007 to inform them that they have been selected to be in the evaluation, conditional on their receipt of funding, which we expect to be announced soon thereafter. We will then give priority to working on baseline data collection and random assignment for grantees needing to start their programs during the summer of 2007.

  4. Many commenters expressed a desire for grantees’ Institutional Review Boards (IRBs) to review, consider, and comment on the proposed evaluation and its protection of human subjects.

While ED is happy to provide information on the evaluation to individual grantee IRBs, obtaining approval from each grantee’s IRB is not necessary for the study to proceed or to be in compliance with federal human subjects regulations. The evaluation’s baseline data collection protocols have already been approved by the IRB of the evaluation contractor, Abt Associates, Inc., of Cambridge, Massachusetts. Moreover, the ED Office of General Counsel has ruled that, for multi-site evaluations and data collections conducted by IES, site-by-site IRB approval is not required because of the strength of the protections contained in the IES authorizing legislation (P.L. 107-279).

  5. Some commenters expressed concern that they had already offered admission to students recruited for the 2007-2008 academic year, and therefore could not cooperate with random assignment for these students.

The aforementioned Notice of Final Priority for the 2007 Upward Bound grant competition (9/22/2006) requires that grantees selected for the evaluation “refrain from admitting new students into the Upward Bound project for project year 2007-2008 until the evaluator has completed its data collection and random assignment for those students.” While grantees are permitted to screen applicants for Upward Bound eligibility, a grantee selected for the evaluation that admits new students for 2007-2008 prior to data collection and random assignment would violate the terms of its 2007 grant and could be at risk of grant termination for failure to cooperate with the evaluation.


  6. Some commenters expressed concern that participating in a random assignment evaluation would harm their relationships with their communities and with families that have had other children participate in Upward Bound.

A total of 67 Upward Bound grantees participated in a previous random assignment evaluation of the program. We are not aware of any documented evidence that grantees’ relationships with their target schools or families in the community were harmed by the use of random assignment lotteries to assign eligible applicants to openings over a two-year period (1992-1994). The new evaluation will give grantees the opportunity to identify students they would most like to admit, and these students will have 2:1 odds of being admitted to the program through the lottery, while other students will have 1:2 odds of admission (that is, a two-in-three chance of admission for identified students, compared with a one-in-three chance for others). In this manner, grantees will be able to increase the likelihood of admitting certain students (such as siblings of current participants), who will be identified by the grantee prior to random assignment.

  7. Some commenters expressed objection to any kind of random assignment evaluation of Upward Bound or similar education programs, preferring a matched comparison group design such as that used to evaluate the Talent Search program.

Randomized controlled trials are generally recognized as the strongest research design for establishing the effectiveness of a program in producing intended outcomes (see the IES document, “Random Assignment in Program Evaluation and Intervention Research: Questions and Answers” [http://www.ed.gov/rschstat/eval/resources/randomqa.pdf]; the OMB document, “What Constitutes Strong Evidence of a Program’s Effectiveness?” [http://www.whitehouse.gov/omb/part/2004_program_eval.pdf]; and the recent article by Thomas D. Cook of Northwestern University, “Describing What is Special About the Role of Experiments in Contemporary Educational Research” [http://www.evaluation.wmich.edu/jmde/content/JMDE006content/PDFs_JMDE_006/Putting_the_Gold%20Standard_%20Rhetoric_into_Perspective.pdf]). Commenters’ references to the limitations of experimental designs do not justify adoption of a less rigorous, matched comparison group design. Rather, the possible limitations of experimental evaluations need to be addressed during the design, analysis, and interpretive phases of such an evaluation. The new Upward Bound study design is intended to ensure both external and internal validity by sampling grantees to participate in the evaluation and by gathering information on the receipt of services that may be responsible for particular experimental impact estimates. If the previous random assignment evaluation of Upward Bound were followed by an evaluation relying on a matched comparison group design instead of random assignment, it is difficult to see how the evidence from the new evaluation would be credible enough to overturn the conclusions of the earlier evaluation.

  8. Some commenters were critical of the requirements of the 2007 grant competition, such as the requirement that grantees “Recruit at least twice as many eligible new students in project year 2007-2008 as the grantee plans to serve in its project.”

These criticisms, which were also made in response to the 7/3/2006 Notice of Proposed Priorities, were addressed in the 9/22/2006 Federal Register notice publishing the Notice of Final Priority for the 2007 Upward Bound grants. Grantees have had considerable advance notice of the recruiting expectations for projects that will be selected to be in the evaluation.

  9. Some commenters criticized the consent and assent forms for their references to more students wanting to be in the program than there are spaces available.

The Council for Opportunity in Education has estimated that, “Although 11 million Americans critically need to access the TRIO Programs, federal funding permits fewer than 7 percent of eligible youth and adults to be served.” [http://www.trioprograms.org/whatisTRIO_talkingpoints.html] In addition, in the previous Upward Bound evaluation, the 67 participating grantees recruited, on average, a number of students equal to 187 percent of their program openings, indicating that more students wanted to be in the programs at these sites than there were spaces available.

  10. Commenters criticized the consent and assent forms for not providing sufficient information on the consequences of going through the lottery and of not consenting to be in the evaluation. Particular concern was expressed about control group students not being eligible to re-apply for Upward Bound in the following year.

The consent and assent forms make clear that, at evaluation sites for 2007-2008, the lottery will determine who will and will not be admitted to Upward Bound (even for students not participating in the evaluation). The forms also make clear that students not admitted to Upward Bound through the lottery may still seek out other programs or services. We will add language to the consent and assent forms explaining that control group students will not be permitted to re-apply for Upward Bound at a later date.

  11. A commenter expressed concern about the student selection form requesting information on student eligibility and wondered whether ineligible students will be included in the evaluation.

Ineligible students will NOT be included in the evaluation and therefore will NOT be included in data collection or random assignment.

  12. Some commenters expressed concern about the redundancy between information gathered through the baseline survey or student selection form and information gathered through their own application process.

To obtain consistent information for comparing students across all Upward Bound grantees included in the evaluation, we cannot rely on grantees’ own Upward Bound applications, which differ from project to project; instead, we need to ask questions in the same way for all students participating in the evaluation. The extra time that students and grantees will need to respond to these questions is included in our burden estimate.

  13. A commenter expressed concern about asking for a parent’s work phone number.

This question was included in the OMB-approved base year (10th grade) questionnaire of the Educational Longitudinal Study of 2002 as a means of increasing response rates for follow-up surveys. Gathering this information is common in studies that require data collection over multiple years.

  14. Commenters expressed concern about the schedule of data collection and reports, and the lack of focus on postsecondary outcomes.

Data collection on postsecondary outcomes for students in the evaluation is planned, but will not occur under the five-year period of performance for the initial evaluation contract. Rather, ED is likely to award a subsequent contract or contracts to investigate the impacts of Upward Bound on outcomes such as high school completion, college and financial aid applications, and college enrollment and credits and degrees earned.

  15. Commenters expressed concern about the uncertain burden of data collection.

The proposal for data collection includes detailed burden estimates and a discussion of the steps the evaluator will take to minimize that burden (for example, hiring site liaisons to assist with data collection).
