Request for Clearance of Data Collection for the Evaluation of Historically Black Colleges and Universities – Undergraduate Program
This evaluation relies on different data collection methodologies depending on the instrument. The graduate survey is based on a census of graduates of HBCU-UP institutions. The course revisions form is also based on a census, but of the institutions participating in HBCU-UP. Because these are censuses of the populations of interest, no statistical sampling methods are employed for those data collections. The faculty survey, on the other hand, does require the use of statistical methods, as it is based on a sample of faculty at HBCU-UP grantee institutions. The respondent universe and the sampling methods for each of the data collection instruments are discussed below.
Graduate Survey
The survey population for this component of the study comprises all HBCU-UP graduates with two or more years of “exposure” to program implementation, as shown in the table below. Therefore, the survey covers graduates of institutions in Cohorts 2 and 3. Graduates of Cohort 1 are not included because of the small size of the cohort and the difficulty of retrieving locator information on graduates of programs whose funding expired in 2001; this decision is based on our experience retrieving locator information on graduates at other institutions of higher education. Also excluded are graduates of Cohorts 4, 5, and 6 because, at the time of survey administration, they will not have graduated after being exposed to the program for at least two years.
As can be seen in the table below, 20 programs (14 from Cohort 2 and 6 from Cohort 3) have eligible graduates. The aggregate pool of graduates from the qualifying HBCU-UP programs is estimated to total about 5,000, based on an estimated average STEM graduating class size of 93. The entire population (5,000 graduates) will be surveyed. The expected response rate of 60-75% will yield at most 3,750 completed surveys.
HBCU-UP Cohorts and Proposed Graduate Survey Coverage
HBCU-UP Cohort | Academic Years | # Awards | Survey Inclusion (X) | Graduation Years With 2+ Years Exposure | # Years Since Graduation in 2006
1 | 1998-2001 | 3 | -- | -- | --
2 | 1999-2004 | 14 | X | 2001-2004 | 2-5
3 | 2001-2006 | 6 | X | 2003-2004 | 2-3
4 | 2002-2007 | 5 | -- | -- | --
5 | 2003-2008 | 6 | -- | -- | --
6 | 2004-2009 | 21 | -- | -- | --
Total | | 55 | | |
Faculty Survey
As can be seen in the table below, 20 programs (14 from Cohort 2 and 6 from Cohort 3) have eligible STEM faculty. The aggregate pool of STEM faculty from the qualifying HBCU-UP programs is estimated to include about 1,000 faculty members, based on information available from the institutions’ websites. Just under half of this population (n = 450 faculty) will be sampled.
HBCU-UP Cohorts and Proposed Faculty Survey Coverage
HBCU-UP Cohort | Academic Years | # Awards | Survey Inclusion (X)
1 | 1998-2001 | 3 | --
2 | 1999-2004 | 14 | X
3 | 2001-2006 | 6 | X
4 | 2002-2007 | 5 | --
5 | 2003-2008 | 6 | --
6 | 2004-2009 | 21 | --
Total | | 55 |
A stratified probability sample of faculty will be selected. The sampling frame will be compiled from lists of STEM faculty provided by each of the 20 eligible institutions. Stratification will be based on a number of factors that we believe are associated with institutional support and teaching, including institution, STEM department, and faculty rank.
Since the sampling fraction is relatively high (approaching f = ½), we recognize that stratification will provide (at best) marginal enhancements of statistical precision over simple random sampling. Our plan is to draw a proportionate stratified sample of faculty. The proposed design is consistent with the goal of gauging a broad-based “snapshot” of HBCU-UP progress.
Course Revisions Form
The short form (attached) inquiring about courses revised or developed as part of the HBCU-UP grant will be completed once by a representative of each institution that has received an HBCU-UP grant (N=55).
Graduate Survey
This component of the study is based on a survey of recipients of B.A. degrees from institutions with programs funded by HBCU-UP grants. Each respondent will be asked to complete the survey only once.
The main limitation of this component of the study is that it depends heavily on obtaining a good response rate. The evaluators have extensive experience conducting surveys and will employ techniques that have proven helpful in obtaining high response rates in the past (e.g., mailings to temporary and permanent addresses, and placing calls at different times of the day). In addition, bias analyses will be conducted to determine whether survey respondents differ in significant ways from nonrespondents, and the analyses will be adjusted accordingly.
Faculty Survey
This component of the study is based on a survey of STEM faculty at the institutions awarded an HBCU-UP grant from NSF. Each respondent will be asked to complete the survey only once. To maximize response rates, a multi-mode data collection approach will be taken. Field protocols include an advance notification mailing, email notifications containing the web URL and individual passwords, and mail and telephone follow-up of nonrespondents to the web/mail efforts. All field procedures will be integrated into a single sample management system to avoid duplication of effort, synchronize the mailing and calling schedules, and maintain the highest level of quality control.
One limitation of this study lies in the relatively small sample size. Only limited subgroup analyses can be conducted, so the ability to discern differential levels of impact by, say, STEM department will be limited by the size of the department. The overall goals and objectives will be achieved under the current design, but available resources limit the degree to which small subgroup analyses can be conducted.
The second limitation of the faculty survey is that it depends heavily on obtaining a good response rate (which in turn affects the number of cases available for analysis). The evaluators have extensive experience conducting surveys and will employ techniques that have proven helpful in obtaining high response rates in the past (e.g., mailings to university addresses, and placing calls at different times of the day). Moreover, our ongoing qualitative data collection suggests that the program has a high degree of salience for faculty, and perceived salience is a key factor in triggering the decision to respond to a survey. Finally, bias analyses will be conducted to determine whether survey respondents differ in significant ways from nonrespondents, and the analyses will be adjusted accordingly.
As can be seen in the table below, an overall 82% response rate is expected under our multi-mode web/mail/CATI design. This stems from a screening response rate of 89% and an interview response rate of 92%; the overall response rate is the product of these two rates (0.89 x 0.92 ≈ 0.82).
Note that screening is necessary because we anticipate that the lists may contain individuals who are not actually STEM faculty (e.g., adjunct appointees, teaching assistants, faculty who have left, or faculty cross-listed in another STEM department). We have allowed for a 5% ineligibility rate.
The expected disposition table below demonstrates how an 82% overall response rate will be achieved using web, mail and CATI data collections.
Expected Disposition for the HBCU-UP Faculty Survey
Disposition | % | n
Screening for Eligibility (sampled faculty, n = 450) | |
  Not screened | 5% |
  Screened | 95% |
    Not eligible | 4% | 20
    Eligible | 85% | 382
Interview Status (eligible faculty, n = 382) | |
  Not interviewed | 8% | 32
  Interviewed: | |
    WEB interview | | 109
    MAIL interview | | 67
    CATI interview | | 175
Overall response rate | 82% |
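The expected counts in the table follow (to rounding) from the rates stated above. The short calculation below (Python) is a minimal, illustrative check only; the mode split is simply taken from the table rather than derived.

    # Illustrative check of the expected disposition; not part of the field protocol.
    sample = 450
    eligible = round(sample * 0.85)        # about 382 faculty eligible after screening
    completes = round(eligible * 0.92)     # about 351 completed interviews
    by_mode = {"web": 109, "mail": 67, "cati": 175}   # mode split taken from the table
    assert sum(by_mode.values()) == completes
    overall_rate = 0.89 * 0.92             # screening rate x interview rate, about 0.82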
Course Revisions Form
We do not anticipate facing major limitations with this data collection, as the form will be completed during a grantees’ meeting. Respondents will be informed of this data collection in advance and provided a copy of the form ahead of the meeting, to ensure that they arrive prepared to furnish evaluators with the needed information.
Graduate Survey
This component of the study will be based on a census of HBCU-UP graduates with at least two years of program exposure. The sample is, therefore, the universe.
Faculty Survey
Strict probability sampling protocols will be employed for sample selection. We will select a proportionate stratified sample of faculty from the lists compiled from the 20 HBCUs. Sampling will be without replacement. A sample of n = 450 will be drawn via systematic sampling from a sorted list of faculty. Primary sorting will be on institution (20 cells), followed by STEM department (the number of cells will depend on disciplinary groupings), and rank (3 cells). Simulations of sample draws will be conducted to explore refinements to this stratification scheme before finalizing it.
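For illustration only, the sketch below (Python) shows one way such a sorted-list systematic draw could be implemented; the field names (institution, department, rank) and the helper itself are assumptions, not part of the approved protocol.

    import random

    def systematic_sample(frame, n=450, seed=1):
        """Illustrative sketch: proportionate stratified sample obtained by a
        systematic draw from a frame sorted on the stratifiers (implicit
        stratification). Records are assumed to be dicts with keys
        'institution', 'department', and 'rank'."""
        ordered = sorted(frame, key=lambda r: (r["institution"], r["department"], r["rank"]))
        interval = len(ordered) / n              # sampling interval N/n (about 1,000/450)
        random.seed(seed)
        start = random.uniform(0, interval)      # random start within the first interval
        # take every interval-th record from the sorted list
        return [ordered[int(start + k * interval)] for k in range(n)]

Because the allocation is proportionate, every frame record has the same selection probability (n/N, roughly 0.45), which is why no sampling weight is needed for this design.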
Course Revisions Form
Census of all HBCU-UP grantee institutions.
Graduate Survey
Not applicable. The entire population of graduates will be surveyed.
Faculty Survey
Estimation procedures involve the calculation of weighted proportions, frequencies and means. Because of our sample stratification and our proposed post-stratification adjustments, our estimation of overall population parameters will implicitly use a stratified mean.
Many of our inferences will rely on tabular analyses and percentage point estimation and contrasts. Contrasts will be limited to larger subgroups comprising about half the population (e.g., gender comparisons). Analyses (e.g., tabulations) and individual survey estimates will be weighted to reflect two factors: differential nonresponse and post-stratification.
Note that a sampling weight will not be needed under a proportionate stratified sampling design because it is an equal probability sample design. Estimated totals (if desired) can be readily produced after post-stratification.
As discussed earlier, an exploration of potential nonresponse bias will be conducted to examine the correlates of nonresponse using factors/variables available for the entire sampling frame. Nonresponse bias will be suspected if such factors are found to be associated with our principal dependent variables/outcome measures. Nonresponse weight adjustments, developed through these analyses, will reduce this source of bias.
Post-stratification weight adjustments will be developed using the sampling frame to generate the known universe totals.
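A minimal sketch (Python) of the two weight adjustments described above follows. The choice of weighting classes here (faculty rank for nonresponse, institution for post-stratification) is an assumption made purely for illustration; the actual classes will come from the nonresponse analyses.

    from collections import Counter

    def survey_weights(sample, respondents, frame,
                       nr_cls=lambda r: r["rank"],           # assumed nonresponse classes
                       ps_cls=lambda r: r["institution"]):   # assumed post-stratification cells
        """Illustrative sketch: equal-probability base weight (N/n), a
        weighting-class nonresponse adjustment, then post-stratification to
        known frame totals. Returns weights parallel to `respondents`."""
        base = len(frame) / len(sample)
        sampled = Counter(nr_cls(r) for r in sample)
        responded = Counter(nr_cls(r) for r in respondents)
        # Step 1: inflate respondent weights so each class retains its sampled total
        w = [base * sampled[nr_cls(r)] / responded[nr_cls(r)] for r in respondents]
        # Step 2: ratio-adjust weights so weighted cell totals match the frame
        frame_tot = Counter(ps_cls(r) for r in frame)
        cell_tot = Counter()
        for r, wi in zip(respondents, w):
            cell_tot[ps_cls(r)] += wi
        return [wi * frame_tot[ps_cls(r)] / cell_tot[ps_cls(r)] for r, wi in zip(respondents, w)]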
Course Revisions Form
Not applicable.
This study has a correlational design and, as such, will not be represented as yielding causal conclusions. The study involves three methods of data collection, and each respondent will provide answers to an instrument only once during the life of the study.
Graduate Survey
Not applicable.
Faculty Survey
The proposed design is consistent with the goal of gauging a broad-based “snapshot” of HBCU-UP progress. The resulting sampling errors for overall estimates will be more than satisfactory for our purposes. Incorporating the finite population correction, the maximum sampling error for estimated percentages will be 2.4%, meaning that the half-width of a 95% confidence interval for a percentage estimate would be no greater than 4.7% (the maximum occurs when p = 50%). Even subgroups comprising half the sample will have adequate statistical precision: maximum sampling errors of 3.4% for estimated percentages and 4.9% for contrasts. For assessing the overall impact of the HBCU-UP program, this will be adequate precision for our point estimates and tabular analyses.
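For reference, the standard formula behind these figures is shown below; the exact percentages quoted above depend on the population size and the effective number of completed interviews assumed, which the formula leaves as inputs.

\[
\mathrm{SE}_{\max}(\hat{p}) \;=\; \sqrt{\frac{N-n}{N-1}\cdot\frac{p(1-p)}{n}}\;\Bigg|_{p=0.5},
\qquad
\text{95\% CI half-width} \;\approx\; 1.96 \times \mathrm{SE}_{\max}(\hat{p}),
\]

where N is the number of eligible faculty in the population, n is the number of completed interviews, and the first factor is the finite population correction; the maximum occurs at p = 0.5 because p(1-p) is largest there.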
Course Revisions Form
Not applicable.
Graduate Survey
Not applicable.
Faculty Survey
Given the straightforward nature of the proposed list-based sample design, unusual problems are not anticipated. Non-coverage should not be a problem because of the contemporaneous nature of the lists being requested and the source of the lists (department heads). It is more likely that non-faculty or otherwise ineligible individuals will be included in the lists, and we have made allowances for that in our survey planning. A third potential frame problem would be multiple appearances of the same faculty member (e.g., cross-listed in two or more STEM departments). This, too, can be readily handled through an objective rule that assigns each person to a single list, so that the multiple occurrences are treated as ‘blanks’.
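For illustration only, the sketch below (Python) applies one such objective rule, retaining a cross-listed faculty member only under the alphabetically first department in which he or she appears; the rule and the field names are placeholders, not the evaluators’ actual specification.

    def deduplicate_frame(listings):
        """Illustrative sketch: collapse multiple listings of the same person
        into one frame record via a fixed, objective rule (keep the
        alphabetically first department). Assumed fields: 'institution',
        'name', 'department'."""
        kept = {}
        for rec in listings:
            key = (rec["institution"], rec["name"])       # assumed person identifier
            if key not in kept or rec["department"] < kept[key]["department"]:
                kept[key] = rec
        # listings not kept are, in effect, treated as 'blanks' on the frame
        return list(kept.values())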
Course Revisions Form
Not applicable.
In this study, data will be collected only once; that is, each survey will be administered once to each respondent.
In an effort to increase the response rate, the survey instruments will be multi-modal, offering options to respond by Internet, telephone, or mail. The survey package sent to respondents’ home (graduates) or university (faculty) addresses will include a paper copy of the survey, along with instructions on how to access both the electronic and telephone versions of the survey. A stamped, self-addressed reply envelope will be provided for those who wish to respond by mail, and those who wish to respond by telephone can do so via a toll-free line. To reduce non-participation, a postcard will be sent to all nonrespondents two weeks after the initial mailing of the survey package, requesting that they complete and return the survey. If no response is received four weeks after the initial mailing, a replacement package will be sent. Six weeks after the initial mailing, telephone calls will be placed to all nonrespondents, reminding them to complete and return the survey and offering them the opportunity to respond during that telephone call.
For the faculty survey, we will also engage the cooperation of STEM department heads and the HBCU-UP program director. We believe that through this protocol even initially reluctant respondents, if any, will be persuaded to participate without much additional effort. In addition, for this survey, data collection will be conducted through the web (on a secure website) before paper versions are mailed, as part of efforts to increase the response rate.
Field tests suggest that the graduate survey should take no more than fifteen minutes to complete, while pretests indicate that the faculty survey and the course revisions form should take about twenty minutes. Field tests were conducted for the mail, web, and CATI modes of administration, and subjects were debriefed afterwards to examine the flow, comprehension, and usability of all materials and systems. The resulting survey data were reviewed to establish the integrity of system performance as well as of the output data.
Agency Unit:
Camille McKayle, Program Director HBCU-UP, National Science Foundation, 703.292.4671.
Jessie DeAro, former Program Director HBCU-UP (recently replaced by Marilyn Suiter), National Science Foundation, 703.292.5350.
Marilyn Suiter, Program Director for the HBCU-UP Targeted Infusion Projects and Planning Grants, National Science Foundation, 703.292.5121.
Elmima Johnson, Division of Research, Evaluation, and Communication, National Science Foundation, 703.292.5137.
Bernice Anderson, Office of the Assistant Director, National Science Foundation, EHR, 703.292.5151.
Contractor or Grantee:
Beatriz Chu Clewell, PI, Director, Evaluation Studies and Equity Research Program (PEER), The Urban Institute, 202.261.5617.
Clemencia Cosentino de Cohen, Co-PI, Research Associate, The Urban Institute, Evaluation Studies and Equity Research Program (PEER), 202.261.5409.
Robert Santos, Institute Senior Methodologist, The Urban Institute, 202.261.5904.
Julie Paasche, Senior Research Associate, NuStats, 512.306.9065; subcontractor staff who will conduct the graduate survey.
The Urban Institute will be responsible for data collection and analyses under the direction of the PI and Co-PI of this evaluation, Beatriz Chu Clewell (202.261.5617) and Clemencia Cosentino (202.261.5409). NuStats, subcontractor to this project, will administer the surveys and create a database for conducting analyses (Julie Paasche; jpaasche@nustats.com; 512.306.9065).