This submission requests clearance for four data collection activities to be conducted as part of the FBO Grant Access Study: (1) a survey of 250 FBOs that applied for DHHS discretionary grants in FY2006, (2) in-depth interviews with a subsample of 20 FBOs that participated in the survey, (3) a focus group with DHHS grant managers, and (4) a focus group with DHHS grant reviewers. In this section we describe the respondent universe and proposed sampling methods for each of these data collection activities.
We will draw the survey sample of FBOs from the universe of FBOs that applied to one of 30 DHHS discretionary grant programs in FY2006, as documented in a database created by CFBCI. This database contains information on grant applicants, their applications, and outcomes, as reported to CFBCI by the DHHS Operating Divisions. Although the FY2006 database is not yet available, MPR has already reviewed the FY2005 database, which contains information on about 8,000 grant applications (from both FBOs and non-FBOs) for the 30 discretionary grant programs. Of these, about 1,400 are applications from FBOs. The database contains (1) the applicant's name, city, and state; (2) the type of applicant (FBO or non-FBO); (3) the applicant's DUNS number, if provided in the application; (4) the Operating Division offering the grant applied for; (5) the grant name; and (6) whether or not the applicant received a grant.
From the universe of FY2006 FBO grant applicants, MPR will select a stratified random sample of FBOs for the survey. We will draw a sample of 294 FBO grant applicants and seek a response rate of not less than 85 percent, for a total of 250 completed instruments. We will oversample FBOs that received grant awards. This allocation will allow comparisons between successful and unsuccessful applicants and will also provide a sample large enough to support statements about applicants as a whole. Therefore, before sample selection, we will stratify the sample frame on whether the application was successful, to ensure that each group is represented in the sample to the extent called for by the design.
We will also consider stratifying on other characteristics of FBO applicants available in the database, such as the DHHS Operating Division to which they applied, the grant program, or the location of the FBO (such as geographic region) so that the groups they define will be proportionately represented in the sample. Characteristics such as these may be related to success of the application, opinions about the process, or both. Before determining whether to do this additional stratification, we will examine the quality and completeness of the information that can be used to stratify applicants. We will also consider the relative sizes of such subgroups.
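To illustrate the mechanics of this stratified draw, the sketch below shows one way it could be implemented. The field name `successful`, the seed, and the per-stratum release counts are illustrative assumptions, not study specifications: an allocation of 174 unsuccessful and 120 successful applicants sums to the planned 294 releases and, at an 85 percent response rate, would yield approximately the expected 148 and 102 completes.

```python
import random

# Minimal sketch of the stratified draw described above, assuming each
# frame record carries a 'successful' flag like the FBO/award fields in
# the CFBCI database. The allocation {False: 174, True: 120} is an
# illustration consistent with the design, not the study's exact plan.

def draw_stratified_sample(frame, allocation, seed=12345):
    """Simple random sample within each stratum of the frame."""
    rng = random.Random(seed)
    sample = []
    for stratum_value, n_target in allocation.items():
        stratum = [rec for rec in frame if rec["successful"] == stratum_value]
        sample.extend(rng.sample(stratum, min(n_target, len(stratum))))
    return sample

# Example: oversample successful applicants relative to their roughly
# 20 percent share of the FY2005 frame.
# sample = draw_stratified_sample(frame, {False: 174, True: 120})
```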
We will select for in-depth interviews a purposive sample of 20 FBOs that participated in the FBO survey. A purposive sample that includes FBOs with a range of characteristics deemed important for the study will be more useful than a representative sample of survey respondents for collecting the specific data needed to address the study's main research questions. We will review survey frequencies and emerging issues, such as potential grant access barriers or application strategies, to develop interviewee selection criteria in consultation with the ASPE Task Order Monitor. Potential sample selection criteria include:
Whether or not the applicant received a grant award
Type of FBO
Type of social services provided or grant program applied for
Amount of funding sought
Experience providing social services prior to grant application
Organizational capacity (such as number of paid staff, volunteers, size of annual budget, number of members, training and experience of staff)
Types of barriers to obtaining grant funds identified in the survey instrument
Once criteria are identified, MPR will create cross-tabulations for all characteristics of interest and sort survey respondents into the desired categories. If more than one respondent meets all criteria for a desired set of characteristics, we will choose one or more respondents from that group. If a desired category is empty, we will select the respondent matching the largest proportion of the key characteristics for that category. We will select 35 potential respondents: an initial group of 20 representing the range of respondents we wish to interview, and a backup group of 15 who can be added to the sample if we cannot contact those in the initial group, or in case they decline to participate or are unable to complete the interview.
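The following sketch illustrates this cell-based selection procedure. The criterion names are hypothetical (the actual criteria will be set in consultation with the ASPE Task Order Monitor), and duplicate picks across cells would be reconciled by hand.

```python
from itertools import product

# Illustrative sketch of the cross-tabulation selection described above.
# Criterion names and values are hypothetical examples only.

def purposive_pick(respondents, criteria):
    """Pick one survey respondent per cell of the criteria cross-tabulation.

    respondents -- list of dicts keyed by criterion name
    criteria    -- dict mapping criterion name to its category values, e.g.
                   {"awarded": [True, False], "fbo_type": ["congregation", "network"]}
    """
    picks = []
    names = list(criteria)
    for cell in product(*(criteria[n] for n in names)):
        exact = [r for r in respondents
                 if all(r.get(n) == v for n, v in zip(names, cell))]
        if exact:
            picks.append(exact[0])  # any member of a populated cell will do
        else:
            # Empty cell: fall back to the respondent matching the largest
            # proportion of this cell's key characteristics.
            best = max(respondents,
                       key=lambda r: sum(r.get(n) == v
                                         for n, v in zip(names, cell)))
            picks.append(best)
    return picks
```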
We will draw our sample of grant managers and reviewers from the universe of grant managers and reviewers who participated in selecting grant recipients for the DHHS discretionary grant programs included in the FY2006 CFBCI database. We will use two overarching criteria to select focus group participants. First, we will seek participants representing a wide range of DHHS Operating Divisions, grant programs, experience, and personal backgrounds. Second, we will use evidence emerging from the study to prioritize such organizational and personal characteristics so that we include participants who can provide information on (1) the most influential factors determining grant outcomes, (2) program or grant areas deemed most important to the study, and (3) potential underlying or perceived barriers to grant access. Specific criteria could include:
DHHS Operating Division
Grant program area or specific program
Grant characteristics (amount, whether established or new program, other)
Grant history of division or program (such as proportion of awards made to FBOs and non-FBOs previously)
Length and/or variety of grant-making experience
Sector represented (such as federal staff, nonprofit staff, FBO staff)
Area of substantive expertise
Once specific selection criteria have been identified in consultation with ASPE, MPR will take several steps to identify possible focus group participants. We will request lists of (1) grant management staff and (2) grant review panels that were active during the FY2006 grant period. We will also request information on the characteristics of interest for each person on the list, to the extent such information is known or available from agency records. We will sort these people based on the selection criteria and work with the ASPE Task Order Monitor to select categories or specific people who balance representativeness with the particular knowledge or experience desired for the study.
As stated previously, we will draw a sample of 294 FBO grant applicants and seek a response rate of not less than 85 percent, for a total of 250 completed surveys. Assuming that the ratio of successful to unsuccessful applicants in FY2006 is similar to that of FY2005, we expect that 148 cases will be allocated to unsuccessful applicants and 102 to successful applicants. Because the population from which the sample of successful applicants is selected is much smaller than the population of unsuccessful ones (just 20 percent of FBO applicants received grant awards in 2005), this allocation will result in roughly equal effective sample sizes for the two groups. For a 0/1 variable with an expected value of 50 percent, this sample will yield 95 percent confidence intervals of about ±7.5 percentage points around estimates of characteristics for each group and about ±6.14 percentage points for the sample as a whole (Table 4). It will also provide a high probability of detecting differences of 15 percentage points between successful and unsuccessful applicants.1
TABLE 4
EXPECTED PRECISION
OF PROPOSED SAMPLE FOR THE FBO GRANT ACCESS STUDY
| Group | Population | Sample | Effective Sample | Half-Width 95 Percent CI^a for P = 50 Percent | Half-Width 95 Percent CI^a for P = 20 Percent | Minimum Detectable Difference^a |
| Total Sample | 1,246 | 250 | 255.41 | 6.14 | 4.92 | |
| Unsuccessful Applicants | 997 | 148 | 173.80 | 7.46 | 5.96 | |
| Successful Applicants | 249 | 102 | 172.78 | 7.48 | 5.98 | |
| Other Subgroup (50 percent) | 623 | 125 | 127.70 | 8.71 | 6.96 | |
| Other Subgroup (35 percent) | 415 | 83 | 84.80 | 10.71 | 8.56 | |
| Contrast A-B^b | | | | 10.56 | | 15.08 |
Note: Proposed sample: stratified, with oversampling of successful applicants.
^a In percentage points.
^b Contrast between successful and unsuccessful applicants.
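For reference, the entries in Table 4 are consistent with standard precision formulas; the reconstruction below is not taken from the study's own computations but reproduces the tabled values to within rounding.

```latex
% Precision formulas consistent with Table 4 (a reconstruction; the
% study's own calculations may differ slightly in rounding conventions).
\begin{align*}
  n_{\text{eff},h} &= \frac{n_h}{1 - n_h/N_h}
    && \text{(stratum effective sample, with finite population correction)}\\
  hw &= z_{0.975}\sqrt{\frac{P(1-P)}{n_{\text{eff}}}}
    && \text{(half-width of the 95 percent confidence interval)}\\
  \text{MDD} &= (z_{0.975}+z_{0.80})
    \sqrt{P(1-P)\left(\frac{1}{n_{\text{eff},A}}+\frac{1}{n_{\text{eff},B}}\right)}
    && \text{(80 percent power, as in footnote 1)}
\end{align*}
```

For example, for unsuccessful applicants the effective sample is 148/(1 − 148/997) ≈ 173.8, giving a half-width of 1.96 × √(0.25/173.8) ≈ 7.4 percentage points at P = 50 percent; the contrast MDD of (1.96 + 0.84) × √(0.25 × (1/173.80 + 1/172.78)) ≈ 15.0 percentage points is likewise consistent with the tabled 15.08. The effective samples for the cross-cutting subgroups appear to scale the nominal sizes by the overall ratio of 255.41/250.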
Survey administration will include several operations. First, MPR will draw the sample and obtain contact information for sample members. After survey interviewers are trained, MPR will release the sample to its Survey Operations Center in phases, with the timing dependent on the rate at which interviewers make contact with respondents and complete interviews. MPR will send an advance letter that describes the study and its importance and encourages sample members to participate (Appendix G). The letter will be personalized and will stress the importance of individual participation for obtaining the most useful information possible for the study. It will also include a toll-free number that sample members can call to ask questions or to complete the survey. Advance letters will be mailed before each portion of the sample is released, so that calls follow receipt of the survey notification. MPR will track survey contacts and completions and follow up as necessary to obtain adequate response rates. An additional set of 100 sample members will be held in reserve and released as needed if response rates fail to meet 85 percent. Alternatively, if response rates are higher than 85 percent, fewer sample points will be released. Should the response rate fall below 80 percent, we will conduct a nonresponse analysis. Finally, survey data will be entered into a data file from which analysis files can be created.
MPR will conduct 10 hours of project-specific training during which telephone interviewers will learn about the purposes of the study, planned uses of the data, and methods for gathering information. Training will include question-by-question instruction on the instrument, along with a discussion of commonly asked questions and approved responses. To ensure that all staff follow consistent procedures and do everything possible to achieve a high response rate, we will address possible challenges, such as potential difficulties identifying respondents. For quality control purposes, supervisors will carefully monitor interviewer performance during the course of the study, providing guidance and retraining as necessary. Special telecommunications equipment at the Survey Operations Center will allow supervisors to monitor live interviews.
As mentioned earlier, we will interview a subsample of 20 survey respondents in more depth to gain a better understanding of their experiences applying for grants in 2006. Before each in-depth, follow-up telephone interview, members of the study team will review the informant's survey responses to develop a brief profile and to select or tailor particular interview questions. MPR will then send an advance letter that describes the study and its importance and encourages sample members to participate in the in-depth, follow-up phone interview (Appendix H). The letter will be personalized and will stress the importance of individual participation for obtaining the most useful information possible for the study. It will include a toll-free number that sample members can call to ask questions or to complete the interview. A senior researcher will then contact the respondent to schedule an interview, conduct the interview, and write up notes on the information collected, organized by discussion topic and research question. These write-ups will be combined with the pre-interview profile for each informant.
Understanding the content and characteristics of successful grant applications, and how FBOs fare in the grant review process, is an important element of the study. To obtain this information, we will talk directly with those who manage and conduct grant reviews for DHHS. We will hold two focus groups: one with grant managers and one with people who have served on a grant review panel.
The focus group meetings will be conducted during working hours at a convenient location in Washington, DC. Each focus group recruit will be sent an advance letter that describes the study and its importance. The letter will include a toll-free number that sample members can call to ask questions (Appendix I). Recruits will also receive a follow-up letter thanking them for their willingness to participate and providing the information they will need to attend the group. Participants will be asked to arrive 10 minutes before the focus group begins, and to complete a short form asking for basic information (Appendix J). Use of this brief form will eliminate the need to ask focus group participants to provide information on their backgrounds during the focus group discussion.
One member of the research team will moderate the focus groups, and another will take notes. MPR will make a digital audio recording of all focus groups.
To maximize response rates for the telephone survey, we will use several strategies. As described below, these include (1) locating and contacting the most knowledgeable informant, (2) using tested survey items that respondents can clearly understand and efficiently address, and (3) implementing proven sample recruitment and refusal avoidance procedures. In addition, interviewers will be carefully trained in administering the survey items and in dealing with any potential obstacles that arise. Experienced senior staff members will supervise and monitor survey operations and step in when needed to help ensure completion.
Contacting the right person is essential to achieving the desired survey response rates. Contact information for sample members will be obtained from Operating Division grants management databases. These databases typically identify multiple representatives for each applicant, such as the grant writer, program director, and executive director. Therefore, we have a range of people from whom we can select the respondent most knowledgeable about the items in the survey. A screener at the beginning of the survey helps identify alternative respondents if the key informant is no longer with the organization or is unavailable to answer the survey. Furthermore, during the survey there are several opportunities for respondents to provide the names of additional contact persons if the original respondent is unable to answer survey questions. In our pretest, we successfully identified and contacted informants, including obtaining contact information for individuals who had changed their position or location, and obtained their cooperation. Specialized locating staff and resources will be used when necessary to boost our ability to contact sample members.
Having an effective and efficient survey instrument is a second key to maximizing the rate of survey completions. We have selected survey questions based not just on their relevance to the study, but also on their length, clarity, and directness. The questions included in the survey use plain, coherent, and unambiguous terminology. Many have been successfully administered as part of prior surveys. Sources of survey items include the Faith Communities Today (FACT) Survey conducted in 2000 and 2005, the 2002 Los Angeles Nonprofit Human Services Study, and the 2005 National Survey of Congregations. Items were also adapted from the 2005 DHHS Staff Survey on Barriers to American Indian, Alaska Native, and Native American Communities' Access to DHHS Programs, conducted by ASPE. We have pretested the survey instruments, inviting questions and feedback from pretest respondents, and subsequently revised questions and interviewer instructions to improve the ease of answering the survey and to eliminate overlap or duplication across survey items.
For a variety of reasons, completing the survey may be more challenging for some sample members. To contend with problems that arise, MPR's Survey Operations Center has long experience using specialized staff and techniques to recruit sample members and to convert incompletes and refusals into completed surveys. We mail advance letters and offer a toll-free number that participants can call to either schedule or conduct their interview. Callbacks are scheduled and made if respondents are called away from or interrupted during their first interview before completing the survey. Follow-up letters are sent to sample members who are not reached within a limited time period or who do not complete the survey within a designated period. Experienced staff members contact those who may be busy or reluctant to participate by telephone and email, to encourage their participation and allay any concerns. Pretest respondents were enthusiastic about participating in the survey, as they felt its topic was valuable to them, so we expect that many sample members will be highly motivated to participate.
Some sample members may not be eligible to participate in the survey. Grant applicants do not self-identify as FBOs; rather, DHHS staff identified them as FBOs when compiling the database that serves as our sample frame. Therefore, before administering the survey to any sample member, we must determine whether the applicant's organization fits within the study's operational definition of an FBO or considers itself to be an FBO (Section B of the survey). If the organization is not an FBO, the survey is immediately terminated and the organization is dropped from our sample. When the survey sample of 294 applicants is drawn, we will also draw a supplemental sample of 100 that can be used to replace such ineligibles.
If for any reason the telephone survey response rate falls below 80 percent, we will conduct a nonresponse analysis using information from the administrative data sources at our disposal. These data sources are quite rich; as described earlier, the sampling frame alone includes the applicant's geographic location, the DHHS Operating Division and specific grant program to which they applied, whether or not they received a grant award, and the amount of any award. Additional data from Operating Division grants management databases, to be used in our administrative data analysis, will provide even further detail. We will compare respondents and nonrespondents across the dimensions available in the data and, if there are statistically significant differences, either adjust our statistical analysis of the survey data to correct for bias or disclose and discuss in the report the potential limitations of the analysis due to respondent-nonrespondent differences.
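The sketch below shows how such a respondent-nonrespondent comparison and a simple weighting-class adjustment might be implemented; the field names are illustrative, and the study's actual adjustment method will be chosen as part of the analysis plan.

```python
from collections import Counter
from scipy.stats import chi2_contingency

# Hedged sketch of the nonresponse check described above: test whether
# respondents and nonrespondents differ on a frame characteristic and,
# if so, compute weighting-class adjustment factors. The 'responded'
# flag and characteristic names are illustrative assumptions.

def differs_by_nonresponse(frame, characteristic, alpha=0.05):
    """Chi-square test of respondent vs. nonrespondent composition."""
    resp = Counter(r[characteristic] for r in frame if r["responded"])
    nonresp = Counter(r[characteristic] for r in frame if not r["responded"])
    cats = sorted(set(resp) | set(nonresp))
    table = [[resp.get(c, 0) for c in cats],
             [nonresp.get(c, 0) for c in cats]]
    chi2, p_value, _, _ = chi2_contingency(table)
    return p_value < alpha  # True: evidence of differential nonresponse

def weighting_class_factors(frame, characteristic):
    """Inverse response-rate adjustment factor within each class."""
    totals = Counter(r[characteristic] for r in frame)
    resp = Counter(r[characteristic] for r in frame if r["responded"])
    return {c: totals[c] / resp[c] for c in resp}
```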
To maximize the response rate for both the telephone survey and the in-depth follow-up phone interview, MPR will contact sample members at various times during the normal work day and ask them to schedule a time to complete the survey. For sample members who do not complete surveys within two weeks of our initial contact attempt, we will send a second letter explaining why participation in the study is important and asking the recipient to call our toll-free telephone number and complete the survey promptly. Sample members who still do not respond after receiving the follow-up letter will be re-contacted. MPR has staffed the project with personnel who possess the range of technical skills necessary to provide expert guidance to interviewers and respond to their questions. Staff include survey researchers, senior researchers, and a senior sampling statistician.
To maximize the response rate for the focus groups, MPR will provide each recruit with an advance letter and a follow-up letter thanking them for their interest and outlining all the relevant information they will need to attend the group. MPR will obtain permission from supervisors of grant managers and reviewers to participate during working hours and will provide a convenient location for the focus groups to meet. On the day before each focus group meets, we will place telephone calls to each recruit, politely reminding them of the day and time of the group, and asking them to contact MPR if they have an emergency and cannot attend.
MPR pretested the phone survey with nine FBO respondents who varied along several dimensions. Pretest respondents were drawn from the database used to draw the survey sample and were selected from the pool of applicants not included in the survey or reserve sample. MPR purposively selected pretest respondents representing a range of FBO types and characteristics. Although many of the questions have been successfully administered as part of prior surveys, we used the pretest to assess respondent identification procedures, ease of administration, instruction clarity (such as skip patterns), adequacy of response categories, flow and order of questions, average interview length, and overall respondent burden. We revised the survey as necessary based on the pretest results.
This study is being conducted by Mathematica Policy Research, Inc. (MPR), under contract to the Office of the Assistant Secretary for Planning and Evaluation (ASPE), U.S. Department of Health and Human Services. The project director is Ms. Debra A. Strong, the principal investigator is Ms. Diane Paulsell, and the survey director is Dr. Martha Bleeker—all MPR employees. The project team consulted with Dr. John Hall, senior statistician at MPR, about the sampling approach for this study. Ms. Wilma Tilson, ASPE Task Order Monitor, will receive, review, and approve all contract deliverables. Contact information is provided below.
Debra A. Strong, Mathematica Policy Research, Inc., 609-750-2001
Diane Paulsell, Mathematica Policy Research, Inc., 609-275-2297
Martha Bleeker, Mathematica Policy Research, Inc., 609-275-2269
John Hall, Mathematica Policy Research, Inc., 609-275-2357
Wilma Tilson, Office of the Assistant Secretary for Planning and Evaluation, 202-205-8841
1 In the table, we computed minimum detectable differences (MDDs) allowing for 80 percent power. In other words, there is an 80 percent probability of detecting a true difference at least as large as the MDD.