Comments and responses to public comments


Study of Implementation and Outcomes in Upward Bound and Other TRIO Programs


OMB: 1850-0899


Study of Implementation and Outcomes in Upward Bound and Other TRIO Programs – (1958.01) #1850-NEW

Comment Response Summary

A total of 12 comments were received from the public during the 30-day notice period for this collection. Please note that Thomas Nishi (comment 3) submitted four separate comments, which have been consolidated and addressed in a single response. The final version of the survey, which incorporates changes made in response to comments, remains consistent with the burden estimates provided in Supporting Statement A.



  1. Name: jean public
Address: not available, NJ

Email: jeanpublic1@yahoo.com

Submitter's Representative: member

Organization: american citizen

Government Agency Type: Federal

Comment:

i do not support this survey i oppose taking this survey. our govt has grown much too large and too expensive. we need to cut down all these multitudes of programs. its time to let parents spend time with their kids in upward bounding them. there are far far too many multitudes of these same programs and the cost is much too great

Response:

The comment received does not address the survey of Upward Bound project directors; therefore, no response can be provided.



  2. Name: Jerry Moore

Address: Bentonville, AR

Email: jhmoore@nwacc.edu

Submitter's Representative: NWACC Upward Bound

Organization: Upward Bound

Government Agency Type: State

Comment:

I hope that if we do not offer any of these services, it is not seen in a negative manner. Each of our programs is different, and outside of serving Low Income and First Generation students, we must operate in a way that we can meet our objectives. Tutoring may work for a large group of programs, while at some, tutoring is available within the target school, so the program does not have to offer it. How will a survey of this nature show how much we have changed the life of each of our students? Seeing a facial reaction is more important than some check mark. 820 different views should not be collected into a single view of all 820 programs.

Response:



We recognize that projects aim to be responsive to the differing needs of the students they serve and, therefore, that implementation may vary substantially from project to project. Capturing this variation is the main goal of the survey, although certainly no questionnaire could document all of the differences. The intent of the survey is not to assess projects relative to one another, but rather to obtain detailed qualitative information about what services projects are providing and how they are providing them. Projects will also have the opportunity to learn about how other projects are delivering services and potentially share promising practices to better serve participants.





  3. Name: Thomas Nishi

Address: Oakland, CA

Email: nishisensei@gmail.com

Submitter's Representative: Ms. Barbara Lee

Organization: Upward Bound, Mills College

Comments:

The survey of Upward Bound programs nationwide is of limited usefulness, as the regions in which the programs operate are diverse and may not be comparable. A program operating in a rural community is far different from a program operating in an urban environment in a large metropolitan city. Attempting to use a GEAR UP strategy in an Upward Bound program setting may also be of very limited usefulness, as the programs operate differently and do not have the same goals.

Upward Bound programs are always tailored to the communities in which they operate, and are designed to meet the needs of the students in those communities. This is how they are effective and efficient. There is no reason to think that a strategy that works in one area of the US will work in another area of the US. Such strategies may be useful if modified and adapted to specific communities, but would not be easily applied in any situation.

Instead of conducting such a survey and attempting to implement common strategies throughout the country, successful ideas can be described, and programs may be able to address the same goals with strategies appropriate to the local environment.

In my experience working in 4 different Upward Bound programs, I know it is impossible to simply take a particular strategy that was successful in one locale and implement it in another. I would like to know in more detail how the Department will use the information gathered, and how programs will be able to respond to survey questions.

P. 1 Prior Experience criteria included new outcomes. However, the new objectives were not known when students were recruited, and student costs have declined. It is unfeasible to collect this data when student costs have declined, and current services are expected to be maintained. Adding additional data collection is unreasonable.

P. 6 Educational expectations and diagnosed learning disabilities were not collected at the time students were recruited. The information will be skewed if data is to be retrieved for those students now, as they have received services from projects, and so educational expectations will be different from when first recruited. Learning disabilities in some schools are not diagnosed, and requests for diagnosis from programs are typically rejected by schools.

Field 16: If this low income eligibility data was not collected in the past, attempting to find this data at the present time puts an unreasonable workload on present staff. Since there is a decrease in budget, this additional workload is not reasonable.

Field 17: If first generation eligibility data was not collected in the past, attempting to gather this data now places an unreasonable workload on present staff. Given the budget reductions, this is not reasonable.

Field 23: Educational expectations were not collected from prior and current participants. Data collected now will be inaccurate since students have received services and educational expectations have changed since admission to Upward Bound. There is literally no way to accurately gather this data.

Field 25: Disconnected youth instructions and definitions are unclear, so attempting to answer this question is not possible. Further explanation and instruction, as well as definitions, are required to respond.

Field 33: The “served by another federally funded college program” data field does not have a way to indicate that a student was not served by another program. This will lead to skewed data that is not accurate.

Field 40: Rigorous secondary school program of study does not have instructions for prior participants. Instructions for appropriately answering the question are needed.

Field 57: The instructions for the school code for the postsecondary institution first attended are inadequate. If information is to be updated annually, what does this mean? The institution first attended will not change, so updates are not necessary.

Field 67: Postsecondary remediation information is not readily available to Upward Bound, as such information is protected by FERPA. Individual contact with students subsequent to HS graduation is problematic, as students do not maintain current contact information with programs. Not all colleges offer remedial courses.

Field 67: Postsecondary remediation data is fraught with possible bias and should be removed. The purpose of the data is not significant to the objectives or for PE points.

Field 68: Postsecondary completion objective, numerator for 2013-2014. Programs funded for 2012-2017 do not have a 2009 class cohort, and so cannot have PE points?

Recommendation: The APR can be a useful tool, but only if instructions and definitions are well thought out and written understandably. Data useful for UB projects should be included, but data that is not useful should not be collected. The workload for grantees should be recognized as budgets are diminished, and data collection should be adjusted accordingly.

Response:

We appreciate your interest in the Study of Implementation and Outcomes in Upward Bound and the comments you have provided.

We recognize that projects aim to be responsive to the differing needs of the students they serve and, therefore, that implementation may vary substantially from project to project. Capturing this variation is the main goal of the survey, although certainly no questionnaire could document all of the differences. The intent of the survey is not to assess projects relative to one another, but rather to obtain detailed qualitative information about what services projects are providing and how they are providing them. Projects will also have the opportunity to learn about how other projects are delivering services and potentially share promising practices to better serve participants. More directly to the commenter's point, we will use the survey to examine how implementation varies across projects of different types, including their geographic location (region and urbanicity), host institution, longevity, size, and students served.

The remaining comments do not relate to the survey instrument for which this notice was posted; they instead pertain to the Annual Performance Report (APR). Grantee workload and potential burden were taken into consideration during the development of the survey instrument and are addressed in Supporting Statement Part B of this clearance package. The survey has been pretested with Upward Bound grantees, and their feedback has further informed improvements to the survey to reduce burden.


  4. Name: Joy Brittain

Address: Monterey, CA

Email: jbrittain@csumb.edu

Organization: Early Outreach and Support Programs


Comments:

  • Question 6-This question may be confusing because UB staff are normally employees of the institution or agency and are not employees of a high school or higher institution.

  • A5b-You may need to add in an additional line for evenings or add the words “and/or evenings” under day of week after school.

  • C1-Instructions mention delineating between grades, but there are no grade choices.

  • G6a1, G6b1- It would be advisable to allow multiple choices rather than just one, as well as to add the amount of hours to these questions.

  • I9- Non-bridge students also take college classes through UB. This should be added as a question before question I9.

  • J2- How about lack of funds!!!!

Response:

We appreciate your interest in the Study of Implementation and Outcomes in Upward Bound and the comments you have provided. The comments were very useful and pointed to areas where additional information and/or clarification should be provided. As a result of the editing process, item numbers may have changed, but the original numbers are cited here for reference. The following changes were made in response to item-specific comments:

  • Q6: The intent of this item is to learn about the previous occupational experiences of staff delivering Upward Bound services—additional clarification was added to address the comment and reduce confusion.

  • A5b: This item was restructured to allow for all potential program schedules.

  • C1: The language for this item was edited to align with the response options.

  • G6a1, G6b1: There is another question that allows projects to identify all methods of delivery that apply. The intent of this item is to learn how time is most often spent by students.

  • I9: A response option was added to the referenced item to capture non-bridge students taking college classes through UB.

  • J2: A response option was added to address this comment.





  5. Name: Jeffrey Kahlden

Address: Weatherford, TX

Email: jkahlden@wc.edu

Submitter's Representative: Jeffrey Kahlden

Organization: Weatherford College Upward Bound

Government Agency Type: Local

Government Agency: Weatherford College


Comments:

I am writing on behalf of Weatherford College Upward Bound. Being cognizant of the time frame, portions of this survey can be combined to resolve its redundancy on things such as technology.
Under Q6 in the Staff portion, the occupation of the UB staff seems to be a very vague question. I don't feel that I understand what you are asking regarding their occupation. Are you asking what their occupation is outside of Upward Bound?

Regarding the 8th grade participation, this is a very rare practice that I know of as far as having recent 8th grade applicants in the program. A large majority of the programs do not choose eighth grade students and I feel that you can eliminate any questions related to eighth grade participants for summer programs.

Additionally, I think you need to follow the same format throughout the survey in order to have some consistency. It seems that the portion on the Summer Program is different from all of the other portions of the survey. I feel that you should also ask a best-strategies question under each main section. I suggest that each of the main sections have an open-ended question that gives the UB programs the opportunity to discuss their practices and also highlight innovative practices.

Finally, as it states in the Federal Register, this survey is 40 minutes in length. Please consider framing the questions that would otherwise require additional research to include the language “on average.” This will make it easier on the participant who must complete this survey.

I support the survey and hope that my comments are considered when creating the final version.



Response:

We appreciate your interest in the Study of Implementation and Outcomes in Upward Bound and the comments you have provided. The survey underwent extensive review to eliminate redundancies and reduce burden on respondents. The comments were very useful and pointed to areas where additional information and/or clarification should be provided. As a result of the editing process, item numbers may have changed, but the original numbers are cited here for reference. The following changes were made in response to item-specific comments:

  • The intent of item Q6 is to learn about the previous occupational experiences of staff delivering Upward Bound services—additional clarification was added to address the comment and reduce confusion.

  • References to participants in the 8th grade have been removed.

  • The survey instrument was reviewed extensively to promote consistency in language and the types of questions asked across service areas. However, there is likely to be greater variation in some service areas than in others; therefore, additional questions are necessary to capture the differences in service delivery across projects.

  • We recognize that adding open-ended questions addressing best practices for each service area would give project directors the opportunity to discuss a greater number of promising or innovative practices. While feedback from project directors on these strategies is invaluable, the number of open-ended questions must be limited to reduce the burden on respondents—adding more open-ended questions would substantially increase the time necessary to complete the survey. Item K2 allows project directors to reflect on their project as a whole and highlight the one best strategy they believe is especially important in encouraging UB participants to enroll in college. No change was made in response to this comment.

  • The paper survey posted with this notice included the full set of questions that could potentially be asked—not every respondent will have to answer all of the questions. The web version of the survey will be programmed such that each respondent is asked only the questions that are relevant to their Upward Bound project. Also, we understand that providing exact responses to quantitative questions may be difficult and time-consuming—we have added language allowing for best estimates to be provided.





  6. Name: Dan Benge

Address: Billings, MT

Email: dbenge@msubillings.edu


Comments:

Q4. This question could be interpreted a number of ways. This could be confusing to folks. I had to take time to think about the response because my UB program staff and Talent Search staff are blended.


Q5. Before you ask this question you should provide definitions for each category (academic tutoring, academic coursework, academic advising, etc.). These could be interpreted in a variety of ways. Academic coursework, for example: I do not offer academic coursework in the academic year; however, I do provide workshops that have academic elements. Is this just an academic workshop, or coursework? In the summer it is more easily defined.


Q6. This question makes no sense. It implies that all staff is something other than people hired to be TRiO employees. My professional staff members are hired to work in TRiO and provide service to students in all the service areas. During the summer is when I hire high school or college teachers to teach in my summer program or students to serve as Team Leaders in the residence hall.


A5. There needs to be a category: 4-5 days per week INCLUDING some weekends and evenings.


C2. Are you referring to courses students are advised to take in high school or referring to the courses in the summer program?



Some of the questions don’t make sense because some of the services differ by grade level. Question G6a1, for example: how is time spent working with the largest number of students? The obvious answer for all students is #4, but with freshmen we may focus on things more related to question #1, and with juniors on question #3. But we spend most of our time helping students track progress because the other questions are tasks to help students in question #4.


A number of the questions asked are based on the assumption that all programs are the same, which makes answering them difficult. Because of the variance in program operations, the responses are going to be extremely varied. Questions regarding the summer program are one example.


It took me more than an hour and a half to complete the survey because the questions didn’t make sense or weren’t in line with what we are doing in our program. I had to think about how to respond to the questions because I want to provide the best feedback. The survey doesn’t capture all we do in our program.


Response:


We appreciate your interest in the Study of Implementation and Outcomes in Upward Bound and the comments you have provided. The comments were very useful and pointed to areas where additional information and/or clarification should be provided. As a result of the editing process, item numbers may have changed, but the original numbers are cited here for reference. The following changes were made in response to item-specific comments:

  • The intent of item Q4 is to learn about the staff providing Upward Bound services—additional clarification was added to address the comment and reduce confusion.

  • The intent of item Q5 is to learn about what type of UB staff are responsible for delivering the majority of services in a given service area. A definition of each service area, consistent with the definition provided in the UB grant application, was added to the beginning of each section. Categorizing a project’s specific services under a service area is left to the discretion of the project director. You will have the opportunity to describe the format of non-traditional academic courses by choosing the “other” option in item C1 in the section on academic coursework, which directly addresses what is offered.

  • The intent of item Q6 is to learn about the previous occupational experiences of staff delivering Upward Bound services—additional clarification was added to address the comment and reduce confusion.

  • Item A5 was restructured to allow for all potential program schedules.

  • The intent of item C2 is to learn about how students are advised when selecting what UB-offered courses should be taken as part of the UB program—additional clarification was added to address the comment and reduce confusion.

  • The intent of item G6a1 is to learn more about how college application assistance is delivered to participants and how that time is most commonly spent. While it’s likely these activities will vary depending on the participants’ grade level, asking for the information at that level of detail would only increase respondent burden. This item was simplified to address the comment and reduce confusion.


  7. Name: Cecilia Severin

Address: Jacksonville, FL

Email: cseveri@ju.edu

Organization: Jacksonville University Upward Bound

Government Agency Type: Local

In Reference to: Agency Information Collection Activities; Submission to the Office of Management and Budget for Review and Approval; Comment Request; A Study of Implementation and Outcomes in Upward Bound and Other TRIO Programs (Document ID ED-2012-ICCD-0071-0004)

Comments:

Q5. Summer Program is not a service area. During our summer program we have academic tutoring, academic coursework, academic advising, college entrance exam prep, etc. We also have: interactive exposure to the arts, sports and educational games; opportunities to practice leadership, sportsmanship, teamwork, and social skills; and motivational seminars, videos and speakers. My full-time staff, part-time staff, volunteers and tutors work with all of the students during the Summer Program. They work with different numbers of students in different “service areas”, but the staff, tutors and volunteers all work with the “largest number of students” at some points during the summer program.

Q6. Our tutors (college students), high school teachers, college professors, and other staff all work with the “largest number of students” at some points during the summer program.

A4. Our summer program services are offered in a number of service areas, but I cannot “Choose an item” under “Summer Program” because all of our students receive some services in the various service areas.

C1. I cannot adequately reflect the high school or college credits that some of our students receive in this section. All of our students are eligible for a high school credit in reading or writing during the summer.

Our students with a 3.5 or higher GPA in school are eligible to take a free college class at JU as part of our summer component. These college classes vary each summer depending on what is offered, but there is usually at least one of each of the following: math, science, history, computer and others.

C2. The factors we consider to determine what courses our participants take are different for the school year and for the summer. Plans for future course enrollments and students’ needs and interests are commonly used for the summer, when we are planning for the next school year and when we have more class periods to accommodate individual interests. During the academic year, graduation requirements, assessments, and grades take precedence. Also, parents’ input is very important to us and is always considered, but I would not be able to classify it as a “common factor” because many of our parents rely on us to educate them about their students’ needs.

D3. I find it very difficult to answer the question at the bottom: “In which ways is time spent most often by UB staff?” While each member of the staff does a little of everything, we all specialize in certain areas. The academic advisor spends more of her time on advising the participants on college entrance requirements and study skills to support college readiness. The program coordinator spends more time on helping participants plan time spent on UB activities, tracking their goals, and interest inventories. I spend more time advising participants and parents on high school graduation requirements, helping with non-academic issues, and sharing all relevant information with staff, teachers, tutors, and parents so we can work as a team to assure that everything is covered.

F5. I do not know what you mean by “Which method of delivery is used by the largest number of students for the following services?” Are you referring to – large group, small group, individual or – field trip, on campus, seminar, etc. or – on-line research, speakers, mentors, etc.?

J2. Summer Program has too many components to rate as a whole. There is no mention of available resources except for facilities.

Overall comment – There is no mention of the benefits and the challenges of a residential component.

Response:

We appreciate your interest in the Study of Implementation and Outcomes in Upward Bound and the comments you have provided. The comments were very useful and pointed to areas where additional information and/or clarification should be provided. As a result of the editing process, item numbers may have changed, but the original numbers are cited here for reference. The following changes were made in response to item-specific comments:

  • The summer program has been removed from items Q5, Q6, and J2. While the summer program is a required service, it is not mutually exclusive of the other required service areas. The summer program will be addressed in a separate question in each respective section.

  • The intent of item A4 is to learn about the locations where the different services are provided—at the host institution, at the target school, at a local community center, etc.—not what services are provided. Questions that specifically address whether services were offered during the summer program appear under the sections on the required service areas. No change was made in response to this comment.

  • The intent of item C1 is to learn about what academic coursework is offered to participants as part of your Upward Bound project’s core curriculum. Coursework offered by the high school or host institution that students would be eligible for regardless of their UB participation should not be part of the response to this item. Project directors should only include courses that are provided by their Upward Bound projects, meaning they are funded using grant funds. Additional clarification was added to this item in response to this comment. Details about student participation are addressed in a separate section and will be used in conjunction with information on the service areas to learn about how each project functions.

  • We recognize that factors taken into consideration to determine what courses UB participants should take are likely to differ from the school year to the summer program. Item C2 has been modified to allow respondents to select different factors for the school year and summer program.

  • The intent of item D3 is to learn about how participants, not staff, spend their time when receiving academic advising services—additional clarification was added to address the comment and reduce confusion.

  • The intent of item F5 is to learn more about what college exposure services are offered. The follow-up question about how these services are most often delivered has been removed to reduce confusion and burden.

  • The purpose of the survey is to learn about how grantees are delivering core services to participants. However, in order to minimize the burden on respondents, the survey focuses on required services. The implementation of a residential component (which is not required) is captured in the section on the summer component and can also be addressed in the concluding open-ended question on promising strategies. No change was made in response to this comment.





  8. Name: Frances Bennett

Address: Morgantown, WV

Email: fran.bennett@mail.wvu.edu

Organization: WVU Upward Bound

Government Agency Type: State

Government Agency: West Virginia University

Comments:

C1: Academic Coursework: While our UB program does offer academic courses during the summer, during the academic year the required academic instruction in math, foreign language, science, and composition and literature is not structured as a “class” in a particular subject. Students participate in interactive and interdisciplinary workshops and activities designed to increase their knowledge and skill levels. There is no way to indicate that we provide academic services during the year in a different format.

Response:

We appreciate your interest in the Study of Implementation and Outcomes in Upward Bound and the comments you have provided. The comments were very useful and pointed to areas where additional information and/or clarification should be provided. As a result of the editing process, item numbers may have changed, but the original numbers are cited here for reference. The following changes were made in response to item-specific comments:

  • Item C1 allows respondents to make the distinction between academic courses offered during the school year and those offered during the summer. We understand that some projects implement academic coursework using different formats; the item now offers two “other” options that allow respondents to provide details about how their projects deliver these services.



  9. Name: Halan Stanfield

Address: San Diego, CA

Email: hstanfield@projects.sdsu.edu

Submitter's Representative: Susan Davis

Organization: San Diego/Imperial Valley TRIO Alliance


Comments:

The San Diego/Imperial Valley TRIO Alliance, a collaborative of nine institutions with 31 TRIO programs serving 10,067 students or adults per year, offers comments on the proposed Study of Implementation and Outcomes in Upward Bound and Other TRIO Programs, as published in the Federal Register on March 26, 2013.


According to Part A of the Supporting Statement for Paperwork Reduction Act Submission for the proposed study, the Department of Education offers the following as one of its objectives: “To help identify one or more strategies to test as part of a random assignment demonstration,” which are strategies that are “not already widely implemented but yet [are] sufficiently attractive to Upward Bound projects” (pg. 5). The proposed survey appears to serve as a precursor for a randomized study while respecting the statutory prohibition against denying student participants services the Upward Bound program administers (the statutory prohibition appears in 20 U.S.C. 1070a-13(h)). We offer the following comments and recommendations to increase the likelihood of the Department of Education being able to identify promising practices and strategies that are not widely implemented among Upward Bound projects.

- Redundancy and unpalatable decisions required of the respondent should be remedied to increase the likelihood of a 100% response rate.

- The strong risk of respondent error should be remedied to ensure collected data is valid.

- Questions concerning summer program offerings and project academic course offerings should have further detail to allow for disaggregation and cross-reference with related variables.

- Questions investigating the quality of project implementation, specifically the quality of key personnel, the implementation of the management plan, the quality of the instructors, and the implementation of an accountability system, should be included in the proposed survey.

- Questions investigating the rationale that informs implementation of service areas should be included in the proposed survey.

- Question K2, where the respondent may describe a particular strategy used within a specific service area, is too open-ended to be included in this proposed survey and should be replaced with focused questions.

- A link between the service areas being investigated and the project’s outcomes should be developed further.


Redundancy and unpalatable decisions required of the respondent should be remedied to increase the likelihood of a 100% response rate.


Questions F4 (“What kind of activities do students participate in during the visits?”) and F5 (“Please indicate if you offer any of the following services to your UB participants.”) are essentially the same question asked in different ways, which is not appropriate survey design. Given the comprehensive nature of the proposed survey, redundancy only serves to frustrate the respondent, decreasing the likelihood that the respondent completes the proposed survey.


Some questions in the proposed survey present the respondent with scenarios in which they must undesirably restrict responses. For example, C2 presents the respondent with a list of common factors that would inform course selection for the UB project, of which three may be chosen. Given the complexity surrounding course choice, not to mention that factors are not entirely translatable across different course subjects, attempting to select only three factors influencing course selection is a burdensome expectation for respondents. Ultimately, this will deter the respondent from further continuing the proposed survey if the responses project directors may offer are woefully inaccurate or reductive.


Even though this study is a “congressionally mandated study,” the Department cannot expect a 100% response rate as stated on page 14 of Part A of the Supporting Statement for Paperwork Reduction Act Submission. Project directors already have a host of program requirements that directly affect funding—the annual performance report being one of them—and the project directors would not make the proposed survey a high priority, especially if the survey design creates difficulty and extends completion time beyond the projected 40-minute completion time.


The strong risk of respondent error should be remedied to ensure collected data is valid.

A lack of clarity affects the quality of this proposed survey on more than one occasion. C1 asks the following: “First, please complete the table below to indicate what subjects are offered to which grades and when—by your Upward Bound project.” Although there is a statement indicating that the question refers to the Upward Bound project’s offerings rather than the target school(s)’ offerings, it is not made clear enough to avoid an easy interpretive error: that the courses being reported are those to which students have access, regardless of whether they are offered by Upward Bound or just by the school district.


Redundancy not only diminishes the response rate but also contributes to respondent error. Any errors introduced in C1 would be carried into C2, which concerns course selection: “Please indicate the three most common factors you consider to determine what courses UB participants take.” It is far too easy to think the question asks for courses UB participants will take while attending the target school, which further perpetuates the same inaccuracy. And if F4 and F5 ask essentially the same question, a mismatch between responses may occur, resulting in an increase of conflicting data.


Oversimplification and poor organization plague one particular question concerning technology: J1. The question has grouped entirely different types of technology with entirely different functions into one large group: Specialized Software. Respondents could easily argue that all of the types of specialized software would apply to all of the service areas offered in the question, resulting in a high number of respondents checking all service areas for that particular group. Also, the technology groups do not address function and will require reclassification. For example, e-books are not an example of specialized software.


The high risk for response error will make the data gathered from project directors who manage to complete the proposed survey unreliable and will not accurately represent real conditions within each program.


Questions concerning summer program offerings and project academic course offerings should have further detail to allow for disaggregation and cross-reference with related variables.

Questions Q5, Q6, F1, J1, and J2 request, in part, information about the project’s 2012 summer program. Summer programs often have two developmental components: cognitive (academics) and noncognitive (experiential, value-based, college culture exposure). Given these two aspects, the questions do not differentiate between what would apply to the cognitive aspects and what would apply to the noncognitive aspects. For example, questions Q5 and Q6, concerning Upward Bound project staff, would have two different responses for the summer program, where a project may have full-time staff for a residential experience and intermittent staff for course offerings. This split would affect the validity of the answer, as the question response would not yield specific enough data.


Questions J1 and J2—concerning use of technology, and the challenges the project faces, respectively—suffer from a lack of division in the summer program column. The role technology plays in a classroom setting (a cognitive component) may differ greatly from that in a residential, field trip, or overall college environment (noncognitive), therefore allowing for a substantial amount of overlap when the two components are combined. Given this lack of division between cognitive and noncognitive elements, a respondent could indicate a majority of technology types are employed in the summer program, therefore making such data opaque. J2 has the same issue, as programs have made substantial cuts in noncognitive developmental activities to ensure required services are implemented, resulting in ambiguous responses as well.


On top of increasing respondent error, the relationship between C1 and C2 also limits the use of data, assuming that responses are indeed valid. When analyzing the data, one would not be able to tell which common factors would apply to which subject, therefore limiting the analysis.


Questions investigating the quality of project implementation, specifically the quality of key personnel, the implementation of the management plan, the quality of the instructors, and the implementation of an accountability system, should be included in the proposed survey.

Numerous questions asking about the method of delivery for various services do not offer much insight into the quality of these deliveries, particularly C1. The question asks what is the primary method of instruction for each course subject offered in the Upward Bound project, yet it does not ask about teacher qualifications for each subject. The only questions asking for staff qualifications are Q5 and Q6, which ask about the type of staff (paid full-time/half-time, intermittent, or unpaid volunteers) and the occupation of project staff (student, high school teacher, college professor, etc.). Even within the context of these two questions, there is room for additional reporting about qualifications: Do teachers have an appropriate background for a particular course’s subject material? Do they possess postgraduate degrees? Are undergraduate instructional assistants studying a subject relevant to their assigned course?


Questions attesting to the quality of key personnel and instructors, the implementation of the management plan, and the implementation of an accountability system are relevant not only to determining the efficacy of practice in a given service area, or the project in general, but also to identifying practices and strategies that are not widely implemented. Reporting on surface-level implementation does not yield sufficient data to determine the rationale guiding the efficacy of a given service area.


Questions investigating the rationale that informs implementation of service areas should be included in the proposed survey.

Very few, if any, questions investigate the rationale informing the implementation of services, which is important data to collect in order to determine what is truly an effective practice as opposed to an unintended condition of the project.


Question F5, which asks respondents to indicate which college exposure services the project offers to participants, is confusing because the options do not constitute differences in “method of delivery.” Also, because there is no opportunity for the respondent to justify why the largest number of students appears in certain selections, the survey does not allow for understanding the causes of these choices. For example, although a project director may know that an overnight college visit has many more benefits than a day trip, budgetary considerations may cause the director to choose the less effective option. A similar issue occurs with question J1: the proposed survey does not possess a follow-up question investigating why the respondent has chosen specific types of technology rather than others.


Question C1 asks, as part of the question, that respondents report on “whether students take courses to receive high school credit, college credit, or as a UB requirement.” The important point here is not whether or not the student gets credit but whether or not the Upward Bound project has been able to secure the option for students to receive credit. This is a quality control issue, in that in order to secure credit, the project must negotiate with the district(s) to ensure that the Upward Bound course offerings are in alignment with the district(s)’ offerings, that the courses are taught by credentialed teachers, and that they meet state curriculum standards. In addition, what are the credits good for? There is a difference among programs that have secured high school or college credit as to whether those credits are simply electives, are remedial, or count toward a-g requirements and/or high school graduation requirements. The option of Upward Bound students receiving high school or college credit (or both simultaneously) is a feature that may not be widely implemented because it takes so much work to secure the memoranda of understanding with sometimes multiple school districts.


It is especially important to identify what is guiding directors’ decisions, as some decisions may not be based on anything other than financial limitations. A given practice or strategy involving robust college exposure may be considered “sufficiently attractive” to a project director, yet implementing such an intensive service would not be feasible given that projects have endured level funding, greater participant requirements, additional reporting procedures, and sequestration. A choice may be based not on quality but on a lack of resources. This is especially true of projects that do not have access to numerous outside resources because they are rural programs or are hosted by organizations or institutions without available resources.


The structure of the proposed survey will make it difficult to determine unique and identifiable practices if questions do not investigate rationale. It also assumes that there are not restrictions on Upward Bound projects that affect the decisions directors make in implementing projects.


Question K2, where the respondent may describe a particular strategy used within a specific service area, is too open-ended to be included in this survey and should be replaced with focused questions.

Question K2 asks the following of respondents: “Is there a particular strategy within a specific service area (optional or required) you believe is especially important in encouraging UB participants to enroll in college? If so, please identify the service area and provide a brief description and why you believe it is an especially promising strategy or approach.” This type of question does not provide for coded responses that yield sufficient results, or even enough information to build a profile of Upward Bound strategies. Questions like K2 are usually asked as preliminary questions to establish a series of more focused questions that can better investigate strategies and appropriately classify them. It is far too late in the development of this proposed survey to be asking this question of the estimated 820 respondents and expect an answer that yields sufficient results.


A link between the service areas being investigated and the project’s outcomes should be developed further.

It is difficult to discern which questions relate directly to project outcomes as stated in the Upward Bound program authorizing legislation. It is arguable that information derived from questions investigating direct services would inform what contributes to successful outcomes, such as A1c, which asks about GPA as a factor in making recommendations for service participation. However, since this proposed survey is used to collect information on direct services, it would behoove the Department to ensure that the required services concern the program’s outcomes and that questions are designed to inform what contributes to successful outcomes. Moreover, there is not a single question designed to address one of the outcomes: graduation and completion of a rigorous course of study.


One cannot expect to understand a link between strategies, practices and implementation if it is separated from the context of specific project outcomes. Questions probing the rationale project directors supply regarding each service may provide some insight as to what contributes to successful outcomes.


Recommendations:

The Department should convene an advisory board of experienced Upward Bound directors who are geographically representative, three to four national experts on survey construction, and members of the National Center for Education Statistics who have extensive experience constructing surveys, to advise the Department on what type of questioning might best yield the outcomes for which the Department is hoping. Representation from Mathematica is not required. Also, the proposed survey should be modified and re-conceptualized in the following ways:


a.) Reduce redundancy, confusing questions, and coerced responses;

b.) Include questions that investigate the rationale behind choices. For example: F5 should have a follow-up question asking, “Why did you choose the answer provided in F5?”;

c.) Ensure responses can be specific enough to yield data that can be used flexibly;

d.) Create a longer survey with the addressed redundancies and confusions rectified and requested items added, and then split the longer survey into two parts, to be implemented over time as opposed to once; and

e.) Use question K2 to develop more focused questions to implement in a future survey, and replace K2 with the focused questions.


The proposed survey requires substantial revisions informed by individuals with appropriate expertise and relevant field experience. As currently worded, the proposed Upward Bound Implementation and Outcomes survey does not meet minimum design standards to provide valid information regarding “promising” practices and strategies, let alone valid information regarding direct services themselves. If the above issues are not rectified according to the recommendations we have provided, there is a high likelihood that the planned randomized study on the effectiveness of Upward Bound will not be reliable or valid.


Response:


We appreciate your interest in the Study of Implementation and Outcomes in Upward Bound and the comments you have provided. The comments were very useful and pointed to areas where additional information and/or clarification should be provided. As a result of the editing process, item numbers may have changed, but the original numbers are cited here for reference. The following changes were made in response to item-specific comments:

  • Extensive revisions have been made to eliminate redundancies. The intent of item F4 is to learn about the activities participants engage in during college visits, while item F5 captures other college exposure services offered. Mention of college trips or visits was removed from item F5 in response to this comment.

  • We recognize that the factors taken into consideration to determine what courses UB participants should take are likely to differ from the school year to the summer program. Item C2 has been modified to allow respondents to select different factors for the school year and the summer program. We also understand that there is a range of factors taken into consideration to determine what courses UB participants should take, and that these factors are likely to vary. However, based on the needs of the participants, some factors are likely to take precedence and/or be more prevalent than others in determining academic coursework. These are the factors that should be included in responses to this item. We understand that service delivery is complex; however, we are limited in the level of detail that can be collected through a survey. Getting more detailed information would require adding more items, which would increase the length of the survey and the burden on project directors. No change will be made to the number of responses allowed for item C2.

  • The intent of item C1 is to learn about what academic coursework is offered to participants as part of your Upward Bound project’s core curriculum. Coursework offered by the high school or host institution that students would be eligible for regardless of their UB participation should not be part of the response to this item. Project directors should only include courses that are provided by their Upward Bound projects, meaning they are funded using grant funds. Additional clarification was added to this item in response to this comment. Item J1 was revised to separate specialized software from electronic content to reduce confusion. Specialized software will remain a category. We recognize that different types of specialized software serve different functions and will be able to make inferences based on service area.

  • The summer program has been removed from items Q5, Q6, F1, J1, and J2. While the summer program is a required service, it is not mutually exclusive of the other required service areas. The summer program will be addressed as a separate question.

  • We appreciate that the quality of key personnel is important to service delivery and outcomes, but we cannot capture that information adequately through a survey. We do ask about the previous employment experiences of staff in items Q5 and Q6, but including more items on staffing would add burden or require us to eliminate other questions relating to implementation approach, which is the focus of the survey.

  • The intent of item F5 is to learn more about what college exposure services are offered. The follow-up question about how these services are most often delivered has been removed to reduce confusion and burden.

  • Item K2 provides project directors the opportunity to highlight strategies and practices they believe are especially important in encouraging UB participants to enroll in college. While we considered a set of questions with categorical responses, feedback received from project directors through comments and pre-testing indicates that a majority of them believe an open-ended question is essential to capturing nuanced information. We value the input of project directors and any insights into what worked for specific projects.

Additional recommendations—not specific to any included items—were provided. Our responses are as follows:

  • The Department should convene an advisory board to develop the UB PD Survey.

During the development of the UB Project Directors’ Survey, we solicited and received input from individuals with expertise on college access. These individuals included Dr. William Tierney from the University of Southern California, Dr. Consuelo Arbona from the University of Houston, and Dr. Laura Perna from the University of Pennsylvania. In revising the survey, we also obtained advice from staff in ED’s Office of Postsecondary Education and addressed comments from members of the TRIO community, including the advocacy group representing Upward Bound and other TRIO programs. Five Upward Bound project directors participated in a pre-test of the survey, which also guided further revision of the instrument.

  • Include questions that investigate the rationale of choices.

The purpose of the survey is to help the Department obtain more current and detailed information on how different projects are providing core program services. We recognize that projects aim to be responsive to the differing needs of the students they serve and, therefore, vary substantially from project to project. While understanding the reasoning behind the implementation of services would, indeed, be valuable, it would require a significant increase in burden and is beyond the scope of the survey.

  • Develop a link between service areas being investigated and the project’s outcomes.

The survey collects information about core program service delivery for descriptive purposes. Causal links cannot be made between the implementation data collected from the survey and project outcomes.

  • Create a longer survey to be implemented over time as opposed to at once.

A longer survey would substantially increase the burden on project directors, even if it were administered over time. Implementing the survey over time would also not provide complete information on the delivery of services during a single project year. While the survey is expected to take an average of 40 minutes to complete, directors will have several weeks to work on the survey and can easily stop and pick up again where they left off.

